Oct 12 05:41:06 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 12 05:41:06 crc restorecon[4670]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 05:41:06 crc restorecon[4670]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 05:41:06 crc restorecon[4670]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 05:41:06 crc restorecon[4670]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 12 05:41:06 crc restorecon[4670]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:06 crc restorecon[4670]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to
system_u:object_r:container_file_t:s0:c336,c787 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 05:41:06 crc restorecon[4670]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 
05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:06 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 12 05:41:07 crc 
restorecon[4670]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 
05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 
05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc 
restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 05:41:07 crc restorecon[4670]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 12 05:41:07 crc restorecon[4670]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 12 05:41:07 crc kubenswrapper[4930]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 12 05:41:07 crc kubenswrapper[4930]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 12 05:41:07 crc kubenswrapper[4930]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 12 05:41:07 crc kubenswrapper[4930]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
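The relabeling stream above is dense but regular, so it is easier to audit in aggregate than line by line. Below is a minimal sketch, assuming journalctl's usual one-entry-per-line output and nothing beyond the message format visible above; the script and every name in it are hypothetical helpers, not part of this log.

#!/usr/bin/env python3
"""Summarize restorecon "not reset as customized by admin" entries.

Hypothetical helper (not from this log). Assumes one journal entry per
line, in the format seen above, e.g.:
  Oct 12 05:41:07 crc restorecon[4670]: /var/lib/kubelet/... not reset
  as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
"""
import re
import sys
from collections import Counter

# Path and target context of each relabel that restorecon skipped.
PATTERN = re.compile(
    r"restorecon\[\d+\]: (?P<path>/\S+) not reset as customized "
    r"by admin to (?P<context>\S+)"
)

by_context = Counter()  # skipped files per target SELinux context
by_pod = Counter()      # skipped files per kubelet pod directory

for line in sys.stdin:
    m = PATTERN.search(line)
    if not m:
        continue
    by_context[m.group("context")] += 1
    pod = re.search(r"/var/lib/kubelet/pods/([^/]+)/", m.group("path"))
    if pod:
        by_pod[pod.group(1)] += 1

for ctx, n in by_context.most_common():
    print(f"{n:6d}  {ctx}")
print("---")
for pod, n in by_pod.most_common(10):
    print(f"{n:6d}  pod {pod}")

Run as, for example, journalctl -b -t restorecon | python3 relabel_summary.py; the per-context counts make it easy to see that nearly every skipped path in this boot targets container_file_t with per-pod MCS categories such as s0:c7,c13.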
Oct 12 05:41:07 crc kubenswrapper[4930]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 12 05:41:07 crc kubenswrapper[4930]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.873789 4930 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884127 4930 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884197 4930 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884211 4930 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884222 4930 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884233 4930 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884244 4930 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884254 4930 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884262 4930 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884270 4930 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884278 4930 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884288 4930 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884299 4930 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884308 4930 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884316 4930 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884323 4930 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884332 4930 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884340 4930 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884349 4930 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884358 4930 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884367 4930 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884376 4930 
feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884385 4930 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884394 4930 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884403 4930 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884411 4930 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884419 4930 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884426 4930 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884434 4930 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884442 4930 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884467 4930 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884476 4930 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884485 4930 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884523 4930 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884533 4930 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884542 4930 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884550 4930 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884558 4930 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884567 4930 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884575 4930 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884583 4930 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884591 4930 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884598 4930 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884607 4930 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884619 4930 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
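The feature_gate.go warning blocks repeat several times during startup, apparently because the same gate list is parsed once per consumer. The unrecognized names (GatewayAPI, PinnedImages, PlatformOperators, and so on) look like OpenShift cluster-level gates that the kubelet itself never registers, so it warns and skips them; gates it does know but which are deprecated (KMSv1) or already GA (CloudDualStackNodeIPs, DisableKubeletCloudCredentialProviders, ValidatingAdmissionPolicy) are still applied, with a removal warning. A minimal sketch of that bookkeeping follows; it is illustrative, not the real k8s.io/component-base implementation.

```go
// Minimal sketch of the feature-gate handling behind the warnings above:
// unknown names are warned about and skipped, while explicitly-set
// deprecated or GA gates apply but warn that they will be removed.
package main

import "fmt"

type stage int

const (
	alpha stage = iota
	deprecated
	ga
)

// A few gates from the log; the real kubelet registers many more.
var known = map[string]stage{
	"KMSv1":                                  deprecated,
	"CloudDualStackNodeIPs":                  ga,
	"DisableKubeletCloudCredentialProviders": ga,
	"ValidatingAdmissionPolicy":              ga,
	"NodeSwap":                               alpha,
}

func set(gates map[string]bool, name string, enabled bool) {
	st, ok := known[name]
	if !ok {
		fmt.Printf("W feature_gate: unrecognized feature gate: %s\n", name)
		return // unknown gates are skipped, one warning per parse
	}
	switch st {
	case deprecated:
		fmt.Printf("W feature_gate: Setting deprecated feature gate %s=%t. It will be removed in a future release.\n", name, enabled)
	case ga:
		fmt.Printf("W feature_gate: Setting GA feature gate %s=%t. It will be removed in a future release.\n", name, enabled)
	}
	gates[name] = enabled
}

func main() {
	gates := map[string]bool{}
	// The cluster hands the kubelet its whole gate list; only some of the
	// names exist in this sketch's registry.
	for name, on := range map[string]bool{
		"KMSv1": true, "CloudDualStackNodeIPs": true,
		"GatewayAPI": true, "PinnedImages": true,
	} {
		set(gates, name, on)
	}
	fmt.Printf("I feature_gate: feature gates: %v\n", gates)
}
```

This also explains why the identical warning block appears again further down: each parse of the list re-emits the full set of warnings.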
Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884631 4930 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884641 4930 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884650 4930 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884658 4930 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884666 4930 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884674 4930 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884682 4930 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884693 4930 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884704 4930 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884716 4930 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884725 4930 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884759 4930 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884769 4930 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884777 4930 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884785 4930 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884793 4930 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884801 4930 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884809 4930 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884816 4930 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884826 4930 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884834 4930 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884842 4930 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884849 4930 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884857 4930 feature_gate.go:330] unrecognized feature gate: Example Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884870 4930 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884879 4930 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.884888 4930 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.885808 4930 flags.go:64] FLAG: --address="0.0.0.0" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.885837 4930 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.885853 4930 flags.go:64] FLAG: --anonymous-auth="true" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.885866 4930 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.885878 4930 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.885888 4930 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.885902 4930 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.885914 4930 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.885924 4930 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.885933 4930 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.885943 4930 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.885956 4930 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.885966 4930 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.885976 4930 flags.go:64] FLAG: --cgroup-root="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.885987 4930 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.885998 4930 flags.go:64] FLAG: --client-ca-file="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886009 4930 flags.go:64] FLAG: --cloud-config="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886021 4930 flags.go:64] FLAG: --cloud-provider="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886032 4930 flags.go:64] FLAG: --cluster-dns="[]" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886048 4930 flags.go:64] FLAG: --cluster-domain="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886057 4930 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886067 4930 flags.go:64] FLAG: --config-dir="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886076 4930 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886087 4930 flags.go:64] FLAG: --container-log-max-files="5" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886100 4930 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886109 4930 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886119 4930 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 12 05:41:07 crc 
kubenswrapper[4930]: I1012 05:41:07.886129 4930 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886139 4930 flags.go:64] FLAG: --contention-profiling="false" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886149 4930 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886159 4930 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886169 4930 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886178 4930 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886208 4930 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886218 4930 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886227 4930 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886237 4930 flags.go:64] FLAG: --enable-load-reader="false" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886246 4930 flags.go:64] FLAG: --enable-server="true" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886255 4930 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886268 4930 flags.go:64] FLAG: --event-burst="100" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886278 4930 flags.go:64] FLAG: --event-qps="50" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886287 4930 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886296 4930 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886306 4930 flags.go:64] FLAG: --eviction-hard="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886318 4930 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886328 4930 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886338 4930 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886348 4930 flags.go:64] FLAG: --eviction-soft="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886358 4930 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886367 4930 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886377 4930 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886386 4930 flags.go:64] FLAG: --experimental-mounter-path="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886394 4930 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886403 4930 flags.go:64] FLAG: --fail-swap-on="true" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886413 4930 flags.go:64] FLAG: --feature-gates="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886424 4930 flags.go:64] FLAG: --file-check-frequency="20s" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886434 4930 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 12 05:41:07 crc kubenswrapper[4930]: 
I1012 05:41:07.886443 4930 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886452 4930 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886462 4930 flags.go:64] FLAG: --healthz-port="10248" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886471 4930 flags.go:64] FLAG: --help="false" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886481 4930 flags.go:64] FLAG: --hostname-override="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886491 4930 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886500 4930 flags.go:64] FLAG: --http-check-frequency="20s" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886509 4930 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886519 4930 flags.go:64] FLAG: --image-credential-provider-config="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886528 4930 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886537 4930 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886546 4930 flags.go:64] FLAG: --image-service-endpoint="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886555 4930 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886564 4930 flags.go:64] FLAG: --kube-api-burst="100" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886573 4930 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886582 4930 flags.go:64] FLAG: --kube-api-qps="50" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886591 4930 flags.go:64] FLAG: --kube-reserved="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886600 4930 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886609 4930 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886619 4930 flags.go:64] FLAG: --kubelet-cgroups="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886627 4930 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886637 4930 flags.go:64] FLAG: --lock-file="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886645 4930 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886654 4930 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886664 4930 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886678 4930 flags.go:64] FLAG: --log-json-split-stream="false" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886689 4930 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886699 4930 flags.go:64] FLAG: --log-text-split-stream="false" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886708 4930 flags.go:64] FLAG: --logging-format="text" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886718 4930 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 
05:41:07.886728 4930 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886771 4930 flags.go:64] FLAG: --manifest-url="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886781 4930 flags.go:64] FLAG: --manifest-url-header="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886795 4930 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886805 4930 flags.go:64] FLAG: --max-open-files="1000000" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886816 4930 flags.go:64] FLAG: --max-pods="110" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886832 4930 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886842 4930 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886852 4930 flags.go:64] FLAG: --memory-manager-policy="None" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886861 4930 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886870 4930 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886880 4930 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886889 4930 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886912 4930 flags.go:64] FLAG: --node-status-max-images="50" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886922 4930 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886932 4930 flags.go:64] FLAG: --oom-score-adj="-999" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886941 4930 flags.go:64] FLAG: --pod-cidr="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886949 4930 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886963 4930 flags.go:64] FLAG: --pod-manifest-path="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886972 4930 flags.go:64] FLAG: --pod-max-pids="-1" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886981 4930 flags.go:64] FLAG: --pods-per-core="0" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.886992 4930 flags.go:64] FLAG: --port="10250" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887003 4930 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887015 4930 flags.go:64] FLAG: --provider-id="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887029 4930 flags.go:64] FLAG: --qos-reserved="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887040 4930 flags.go:64] FLAG: --read-only-port="10255" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887051 4930 flags.go:64] FLAG: --register-node="true" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887060 4930 flags.go:64] FLAG: --register-schedulable="true" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887069 4930 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 
05:41:07.887085 4930 flags.go:64] FLAG: --registry-burst="10" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887095 4930 flags.go:64] FLAG: --registry-qps="5" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887104 4930 flags.go:64] FLAG: --reserved-cpus="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887115 4930 flags.go:64] FLAG: --reserved-memory="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887127 4930 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887136 4930 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887146 4930 flags.go:64] FLAG: --rotate-certificates="false" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887156 4930 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887165 4930 flags.go:64] FLAG: --runonce="false" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887177 4930 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887188 4930 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887198 4930 flags.go:64] FLAG: --seccomp-default="false" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887207 4930 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887216 4930 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887226 4930 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887236 4930 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887247 4930 flags.go:64] FLAG: --storage-driver-password="root" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887257 4930 flags.go:64] FLAG: --storage-driver-secure="false" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887266 4930 flags.go:64] FLAG: --storage-driver-table="stats" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887275 4930 flags.go:64] FLAG: --storage-driver-user="root" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887285 4930 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887295 4930 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887304 4930 flags.go:64] FLAG: --system-cgroups="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887313 4930 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887328 4930 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887337 4930 flags.go:64] FLAG: --tls-cert-file="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887346 4930 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887367 4930 flags.go:64] FLAG: --tls-min-version="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887376 4930 flags.go:64] FLAG: --tls-private-key-file="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887385 4930 flags.go:64] FLAG: --topology-manager-policy="none" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887394 4930 flags.go:64] 
FLAG: --topology-manager-policy-options="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887403 4930 flags.go:64] FLAG: --topology-manager-scope="container" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887413 4930 flags.go:64] FLAG: --v="2" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887425 4930 flags.go:64] FLAG: --version="false" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887438 4930 flags.go:64] FLAG: --vmodule="" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887449 4930 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.887459 4930 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887711 4930 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887723 4930 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887761 4930 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887771 4930 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887782 4930 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887791 4930 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887799 4930 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887807 4930 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887815 4930 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887822 4930 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887830 4930 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887838 4930 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887846 4930 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887854 4930 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887862 4930 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887869 4930 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887877 4930 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887885 4930 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887892 4930 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887903 4930 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
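The run of flags.go:64 "FLAG:" lines above is the kubelet dumping every registered flag, defaults included, before the config file overrides anything; it shows up here because this node runs at verbosity --v=2. Go's standard flag package reproduces the format directly, as in this illustrative stand-in:

```go
// Sketch of the "FLAG: --name=value" dump above: walk every registered
// flag, set or default, and log it in one pass.
package main

import (
	"flag"
	"fmt"
)

func main() {
	// A couple of stand-ins for real kubelet flags.
	flag.String("config", "/etc/kubernetes/kubelet.conf", "kubelet config file")
	flag.String("node-ip", "", "node IP address")
	flag.Int("v", 0, "log verbosity")
	flag.Parse()

	// VisitAll walks every defined flag, not just the ones set on the
	// command line, which is why the dump above includes pure defaults.
	flag.VisitAll(func(f *flag.Flag) {
		fmt.Printf("FLAG: --%s=%q\n", f.Name, f.Value.String())
	})
}
```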
Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887913 4930 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887921 4930 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887932 4930 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887940 4930 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887948 4930 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887956 4930 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887964 4930 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887971 4930 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887979 4930 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887988 4930 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.887997 4930 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888011 4930 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888025 4930 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888035 4930 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888046 4930 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888056 4930 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888069 4930 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888078 4930 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888088 4930 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888097 4930 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888105 4930 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888113 4930 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888121 4930 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888129 4930 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888136 4930 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 
05:41:07.888144 4930 feature_gate.go:330] unrecognized feature gate: Example Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888152 4930 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888160 4930 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888168 4930 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888175 4930 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888183 4930 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888191 4930 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888199 4930 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888210 4930 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888223 4930 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888232 4930 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888240 4930 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888248 4930 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888258 4930 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888266 4930 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888276 4930 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888286 4930 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888294 4930 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888303 4930 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888312 4930 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888321 4930 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888329 4930 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888337 4930 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888352 4930 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888362 4930 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.888372 4930 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.888390 4930 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.901970 4930 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.902023 4930 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902176 4930 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902192 4930 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
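The "Golang settings" line above prints empty values for GOGC, GOMAXPROCS and GOTRACEBACK, which just means none of those environment variables were set and the Go runtime defaults apply. A small sketch of the same report, assuming the log line simply echoes the environment:

```go
// Sketch of the "Golang settings" line above: report the Go runtime
// tuning environment variables as given; empty means defaults.
package main

import (
	"fmt"
	"os"
	"runtime"
)

func main() {
	fmt.Printf("Golang settings GOGC=%q GOMAXPROCS=%q GOTRACEBACK=%q\n",
		os.Getenv("GOGC"), os.Getenv("GOMAXPROCS"), os.Getenv("GOTRACEBACK"))
	// With GOMAXPROCS unset, the runtime defaults to the machine's CPU
	// count (12 cores on this node, per the cadvisor Machine line below).
	fmt.Printf("effective GOMAXPROCS=%d\n", runtime.GOMAXPROCS(0))
}
```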
Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902208 4930 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902218 4930 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902228 4930 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902236 4930 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902244 4930 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902254 4930 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902263 4930 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902272 4930 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902280 4930 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902288 4930 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902296 4930 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902305 4930 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902313 4930 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902321 4930 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902329 4930 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902337 4930 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902345 4930 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902353 4930 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902363 4930 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902373 4930 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902381 4930 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902389 4930 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902397 4930 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902405 4930 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902412 4930 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902420 4930 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902428 4930 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902436 4930 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902444 4930 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902451 4930 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902459 4930 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902466 4930 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902486 4930 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902495 4930 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902503 4930 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902511 4930 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902518 4930 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902526 4930 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902534 4930 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902543 4930 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902551 4930 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902560 4930 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902569 4930 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902577 4930 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902585 4930 feature_gate.go:330] unrecognized feature gate: 
NetworkSegmentation Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902596 4930 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902606 4930 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902616 4930 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902624 4930 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902633 4930 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902641 4930 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902651 4930 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902659 4930 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902669 4930 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902677 4930 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902685 4930 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902693 4930 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902701 4930 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902712 4930 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902722 4930 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902730 4930 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902772 4930 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902783 4930 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902792 4930 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902801 4930 feature_gate.go:330] unrecognized feature gate: Example Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902810 4930 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902818 4930 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902827 4930 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.902836 4930 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.902850 4930 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904031 4930 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904096 4930 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904108 4930 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904119 4930 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904127 4930 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904136 4930 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904145 4930 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904155 4930 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904164 4930 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904174 4930 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904182 4930 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904190 4930 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904198 4930 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904206 4930 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904214 4930 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904226 4930 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904235 4930 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904244 4930 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904254 4930 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904263 4930 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904271 4930 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904279 4930 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904288 4930 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904296 4930 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904304 4930 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904311 4930 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904319 4930 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904327 4930 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904335 4930 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904343 4930 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904351 4930 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904359 4930 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904367 4930 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904375 4930 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904390 4930 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904399 4930 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904406 4930 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904414 4930 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904421 4930 feature_gate.go:330] unrecognized feature gate: Example Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904430 4930 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904437 4930 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904447 4930 
feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904455 4930 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904462 4930 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904470 4930 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904478 4930 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904485 4930 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904493 4930 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904501 4930 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904508 4930 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904519 4930 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904557 4930 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904565 4930 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904575 4930 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904585 4930 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904596 4930 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904605 4930 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904614 4930 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904623 4930 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904632 4930 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904640 4930 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904648 4930 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904657 4930 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904665 4930 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904673 4930 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904681 4930 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904689 4930 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904697 4930 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904704 4930 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904712 4930 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 12 05:41:07 crc kubenswrapper[4930]: W1012 05:41:07.904723 4930 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.904756 4930 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.905001 4930 server.go:940] "Client rotation is on, will bootstrap in background" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.911596 4930 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.911728 4930 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
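At this point the kubelet decides no bootstrap is needed (the existing kubeconfig is still valid) and loads its combined client cert/key PEM; the rotation entries that follow schedule a renewal well before expiry. The sketch below covers both steps, assuming the jitter client-go applies lands somewhere in roughly the 70-90% window of the certificate's lifetime (the exact fraction is an assumption for illustration); running it requires the PEM file from the log to exist.

```go
// Sketch of the certificate handling logged around here: load the current
// client cert/key pair, then pick a jittered rotation deadline well before
// NotAfter so that nodes do not all rotate at the same instant.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"math/rand"
	"os"
	"time"
)

func main() {
	// Combined cert+key file, as in the log's certificate_store.go line.
	data, err := os.ReadFile("/var/lib/kubelet/pki/kubelet-client-current.pem")
	if err != nil {
		panic(err)
	}
	var cert *x509.Certificate
	for block, rest := pem.Decode(data); block != nil; block, rest = pem.Decode(rest) {
		if block.Type == "CERTIFICATE" {
			if cert, err = x509.ParseCertificate(block.Bytes); err != nil {
				panic(err)
			}
			break
		}
	}
	if cert == nil {
		panic("no CERTIFICATE block found")
	}

	// Deadline at a random point in [70%, 90%) of the validity window;
	// the fraction is an assumption mirroring client-go's jitter.
	lifetime := cert.NotAfter.Sub(cert.NotBefore)
	jitter := 0.7 + 0.2*rand.Float64()
	deadline := cert.NotBefore.Add(time.Duration(float64(lifetime) * jitter))

	fmt.Printf("expiration %v, rotation deadline %v, waiting %v\n",
		cert.NotAfter, deadline, time.Until(deadline))
}
```

The 688-hour wait logged below is then just the gap between the current time and that jittered deadline.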
Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.913432 4930 server.go:997] "Starting client certificate rotation"
Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.913478 4930 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.914560 4930 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-09 21:53:56.162435378 +0000 UTC
Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.914709 4930 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 688h12m48.24773192s for next certificate rotation
Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.942623 4930 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.945457 4930 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 12 05:41:07 crc kubenswrapper[4930]: I1012 05:41:07.965110 4930 log.go:25] "Validated CRI v1 runtime API"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.002079 4930 log.go:25] "Validated CRI v1 image API"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.004021 4930 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.011227 4930 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-12-05-25-19-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.011314 4930 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.046090 4930 manager.go:217] Machine: {Timestamp:2025-10-12 05:41:08.042067629 +0000 UTC m=+0.584169474 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:42c4af26-8d34-49a0-8413-58384a3ecd2b BootID:208e324a-7c16-4d54-b585-7c9f58265cb2 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:11:79:29 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:11:79:29 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:7a:23:e4 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:8d:4c:04 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:46:be:b5 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:49:86:09 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ea:21:0b:c5:7c:df Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:a6:1d:e2:81:52:4a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.046543 4930 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.046806 4930 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.047446 4930 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.047925 4930 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.047985 4930 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.048327 4930 topology_manager.go:138] "Creating topology manager with none policy"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.048348 4930 container_manager_linux.go:303] "Creating device plugin manager"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.048848 4930 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.048903 4930 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.050330 4930 state_mem.go:36] "Initialized new in-memory state store"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.050482 4930 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.054092 4930 kubelet.go:418] "Attempting to sync node with API server"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.054127 4930 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.054266 4930 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.054310 4930 kubelet.go:324] "Adding apiserver pod source"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.054338 4930 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.059473 4930 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 12 05:41:08 crc kubenswrapper[4930]: W1012 05:41:08.059584 4930 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.111:6443: connect: connection refused
Oct 12 05:41:08 crc kubenswrapper[4930]: E1012 05:41:08.059709 4930 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.111:6443: connect: connection refused" logger="UnhandledError"
Oct 12 05:41:08 crc kubenswrapper[4930]: W1012 05:41:08.059814 4930 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.111:6443: connect: connection refused
Oct 12 05:41:08 crc kubenswrapper[4930]: E1012 05:41:08.059893 4930 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.111:6443: connect: connection refused" logger="UnhandledError"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.060668 4930 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
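Two entries above pin down the node's resource math: manager.go:217 reports MemoryCapacity:33654128640, and the container_manager_linux.go:272 nodeConfig reserves 350Mi of memory for the system plus a 100Mi memory.available hard-eviction floor, with KubeReserved null. Allocatable memory, the figure the scheduler later sees on the Node object, is capacity minus those two. A back-of-the-envelope sketch of that standard formula under the values logged here (an illustration, not kubelet code):

package main

import "fmt"

const mi = 1 << 20 // bytes per MiB

func main() {
	// Values copied from the manager.go:217 and container_manager_linux.go:272 entries above.
	capacity := int64(33654128640)    // MemoryCapacity
	systemReserved := int64(350 * mi) // SystemReserved memory: "350Mi"
	hardEviction := int64(100 * mi)   // HardEvictionThresholds memory.available: "100Mi"
	// KubeReserved is null in this nodeConfig, so it contributes nothing.
	allocatable := capacity - systemReserved - hardEviction
	fmt.Printf("allocatable memory: %d bytes (~%.2f GiB)\n",
		allocatable, float64(allocatable)/(1<<30))
}

With these inputs the sketch prints 33182269440 bytes, roughly 30.90 GiB of the 31.34 GiB capacity.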
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.063012 4930 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.064438 4930 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.064469 4930 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.064480 4930 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.064490 4930 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.064504 4930 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.064513 4930 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.064521 4930 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.064533 4930 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.064544 4930 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.064552 4930 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.064565 4930 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.064572 4930 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.065644 4930 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.066321 4930 server.go:1280] "Started kubelet" Oct 12 05:41:08 crc systemd[1]: Started Kubernetes Kubelet. 
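The reflector and event-write failures on both sides of this point share one cause: the kubelet comes up before anything is listening behind api-int.crc.testing:6443, so every dial to 38.102.83.111:6443 returns connection refused until the control-plane pods come back up, and the client-go reflectors simply back off and retry. A standalone probe that reproduces the same TCP dial (illustration only, not kubelet code):

package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	const endpoint = "api-int.crc.testing:6443" // resolves to 38.102.83.111 in the log above
	for {
		conn, err := net.DialTimeout("tcp", endpoint, 3*time.Second)
		if err != nil {
			fmt.Println("apiserver not ready:", err) // e.g. connect: connection refused
			time.Sleep(2 * time.Second)
			continue
		}
		conn.Close()
		fmt.Println("apiserver endpoint is accepting TCP connections")
		return
	}
}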
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.068705 4930 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.111:6443: connect: connection refused
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.069261 4930 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.069314 4930 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.070259 4930 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.070331 4930 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.070384 4930 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.070409 4930 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 18:49:36.749062962 +0000 UTC
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.070465 4930 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1429h8m28.678603124s for next certificate rotation
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.070478 4930 volume_manager.go:287] "The desired_state_of_world populator starts"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.070496 4930 volume_manager.go:289] "Starting Kubelet Volume Manager"
Oct 12 05:41:08 crc kubenswrapper[4930]: W1012 05:41:08.074908 4930 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.111:6443: connect: connection refused
Oct 12 05:41:08 crc kubenswrapper[4930]: E1012 05:41:08.075002 4930 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.111:6443: connect: connection refused" logger="UnhandledError"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.075386 4930 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Oct 12 05:41:08 crc kubenswrapper[4930]: E1012 05:41:08.075423 4930 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 12 05:41:08 crc kubenswrapper[4930]: E1012 05:41:08.076855 4930 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.111:6443: connect: connection refused" interval="200ms"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.080046 4930 factory.go:153] Registering CRI-O factory
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.080089 4930 factory.go:221] Registration of the crio container factory successfully
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.080204 4930 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.080220 4930 factory.go:55] Registering systemd factory
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.080233 4930 factory.go:221] Registration of the systemd container factory successfully
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.080263 4930 factory.go:103] Registering Raw factory
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.080298 4930 manager.go:1196] Started watching for new ooms in manager
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.080594 4930 server.go:460] "Adding debug handlers to kubelet server"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.081378 4930 manager.go:319] Starting recovery of all containers
Oct 12 05:41:08 crc kubenswrapper[4930]: E1012 05:41:08.081007 4930 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.111:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186da7ed8bcce1d5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-12 05:41:08.066279893 +0000 UTC m=+0.608381658,LastTimestamp:2025-10-12 05:41:08.066279893 +0000 UTC m=+0.608381658,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093506 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093587 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093606 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093619 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093631 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093643 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b"
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093656 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093668 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093683 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093696 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093708 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093721 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093755 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093776 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093789 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093801 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093812 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093824 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093839 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093852 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093866 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093882 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093895 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093909 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093924 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093941 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093970 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093984 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.093998 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094037 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094051 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094064 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094077 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094091 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094106 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094121 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094136 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094150 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094165 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" 
seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094178 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094192 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094205 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094219 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094234 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094248 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094265 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094281 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094295 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094309 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094324 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 
12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094340 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094354 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094375 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094390 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094408 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094421 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094434 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094446 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094457 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094469 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094482 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094499 
4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094512 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094524 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094538 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094551 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094564 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094580 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094595 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094609 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094622 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094636 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094649 4930 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094662 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094675 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094687 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094698 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094715 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094728 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094810 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094824 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094837 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094852 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094866 4930 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094877 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094893 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094907 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094921 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094933 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094946 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094960 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094973 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.094985 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095000 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095013 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095026 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095040 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095051 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095066 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095078 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095091 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095103 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095115 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095129 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095150 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095163 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095178 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095192 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095239 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095255 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095267 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095279 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095291 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095304 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095318 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095383 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095397 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095411 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095423 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095435 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095448 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095465 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095478 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095496 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095510 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095525 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095538 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095552 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095565 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095577 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095590 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095602 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095614 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095627 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095639 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095657 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095671 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095683 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095696 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095707 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095721 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095754 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095774 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095787 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095798 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095810 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095825 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095837 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095849 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095867 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095880 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095891 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095902 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095916 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095928 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095940 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095951 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095965 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095977 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.095987 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.096000 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.096013 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.096026 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.096039 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.096053 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.096063 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.096076 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.096090 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.096102 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.096113 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.096128 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.096140 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.096151 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.096165 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.096177 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.096190 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.096203 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.096215 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.096232 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.096245 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.096260 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098125 4930 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098155 4930 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098169 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098184 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098199 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098212 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098225 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098238 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098252 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098265 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098277 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098288 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098302 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098315 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098327 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098354 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098369 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098389 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098404 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098417 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098429 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098441 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098454 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098466 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098483 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098497 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098509 4930 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098521 4930 reconstruct.go:97] "Volume reconstruction finished" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.098530 4930 reconciler.go:26] "Reconciler: start to sync state" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.112137 4930 manager.go:324] Recovery completed Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.126013 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.129196 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.129242 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.129257 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.130324 4930 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.130365 4930 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.130400 4930 state_mem.go:36] "Initialized new in-memory state store" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.132269 4930 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.133912 4930 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.133994 4930 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.134032 4930 kubelet.go:2335] "Starting kubelet main sync loop" Oct 12 05:41:08 crc kubenswrapper[4930]: W1012 05:41:08.135420 4930 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.111:6443: connect: connection refused Oct 12 05:41:08 crc kubenswrapper[4930]: E1012 05:41:08.135482 4930 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.111:6443: connect: connection refused" logger="UnhandledError" Oct 12 05:41:08 crc kubenswrapper[4930]: E1012 05:41:08.136811 4930 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.150964 4930 policy_none.go:49] "None policy: Start" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.152210 4930 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.152245 4930 state_mem.go:35] "Initializing new in-memory state store" Oct 12 05:41:08 crc kubenswrapper[4930]: E1012 05:41:08.176265 4930 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.219493 4930 manager.go:334] "Starting Device Plugin manager" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.219615 4930 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.219638 4930 server.go:79] "Starting device plugin registration server" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.220372 4930 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.220400 4930 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.220647 4930 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.220810 4930 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.220830 4930 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 12 05:41:08 crc kubenswrapper[4930]: E1012 05:41:08.232041 4930 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.237977 4930 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Oct 12 05:41:08 crc kubenswrapper[4930]: 
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.238073 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.239349 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.239403 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.239424 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.239658 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.239891 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.239977 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.240841 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.240882 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.240898 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.241048 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.241261 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.241320 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.242041 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.242086 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.242104 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.242048 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.242146 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.242165 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.242386 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.242494 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.242535 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.242809 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.242844 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.242861 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.243291 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.243323 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.243335 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.244267 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.244327 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.244345 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.244646 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.244693 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.245261 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.246534 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.246597 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.246622 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.247294 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.247343 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.247364 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.247836 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.247916 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.249218 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.249252 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.249267 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:08 crc kubenswrapper[4930]: E1012 05:41:08.277369 4930 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.111:6443: connect: connection refused" interval="400ms"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.300012 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.300071 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.300105 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.300135 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.300165 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.300194 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
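
The "Failed to ensure lease exists, will retry" entry above is the node-lease client backing off while the API server at api-int.crc.testing:6443 is still refusing connections: the interval is 400ms here and 800ms in a later entry, so the delay doubles per failure. A doubling-backoff loop in that spirit; retryWithBackoff is a hypothetical helper, not the client-go or kubelet implementation, and the cap is an arbitrary choice for the sketch:

// backoff_sketch.go: retry an operation with a doubling delay, as the
// 400ms -> 800ms lease retries above suggest. Illustration only.
package main

import (
	"errors"
	"fmt"
	"time"
)

func retryWithBackoff(op func() error, initial, max time.Duration) {
	interval := initial
	for {
		if err := op(); err == nil {
			return
		} else {
			fmt.Printf("will retry, interval=%s err=%v\n", interval, err)
		}
		time.Sleep(interval)
		if interval < max {
			interval *= 2 // 400ms -> 800ms -> 1.6s -> ...
		}
	}
}

func main() {
	attempts := 0
	retryWithBackoff(func() error {
		attempts++
		if attempts < 3 { // simulate the API server coming back on try 3
			return errors.New("connect: connection refused")
		}
		return nil
	}, 400*time.Millisecond, 7*time.Second) // cap chosen arbitrarily

	fmt.Println("lease ensured after", attempts, "attempts")
}

Without success the waits grow geometrically until the cap, which keeps a dead API endpoint from being hammered at a fixed rate.
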
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.300255 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.300285 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.300313 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.300342 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.300384 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.300412 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.300440 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.300468 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.320751 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.322309 4930 
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.322309 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.322377 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.322398 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.322442 4930 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: E1012 05:41:08.323084 4930 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.111:6443: connect: connection refused" node="crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.401591 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.401654 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.401688 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.401719 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.401797 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.401839 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.401881 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.401924 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.401963 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.402506 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.402560 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.402640 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.402682 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.402704 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.402730 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.402814 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.402817 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.402891 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.402888 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.402944 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.402954 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.402906 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.402965 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.403023 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.403028 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.403083 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.403101 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.403138 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.403141 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.403117 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: E1012 05:41:08.469994 4930 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.111:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186da7ed8bcce1d5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-12 05:41:08.066279893 +0000 UTC m=+0.608381658,LastTimestamp:2025-10-12 05:41:08.066279893 +0000 UTC m=+0.608381658,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.523198 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.524980 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.525059 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.525085 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.525131 4930 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: E1012 05:41:08.525699 4930 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.111:6443: connect: connection refused" node="crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.581876 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.596557 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.622475 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: W1012 05:41:08.648275 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-de5b39af1b224b33b50e582213069ad8f51ea11c0811f42dd03ee0607e47f747 WatchSource:0}: Error finding container de5b39af1b224b33b50e582213069ad8f51ea11c0811f42dd03ee0607e47f747: Status 404 returned error can't find the container with id de5b39af1b224b33b50e582213069ad8f51ea11c0811f42dd03ee0607e47f747
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.649315 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: W1012 05:41:08.649800 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-e22184caa224556788f1444649a3db38ff5b04208d7e88093cc95e847d59b0ed WatchSource:0}: Error finding container e22184caa224556788f1444649a3db38ff5b04208d7e88093cc95e847d59b0ed: Status 404 returned error can't find the container with id e22184caa224556788f1444649a3db38ff5b04208d7e88093cc95e847d59b0ed
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.656040 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Oct 12 05:41:08 crc kubenswrapper[4930]: W1012 05:41:08.659868 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-dcb5d82318752b49b4197e182c332495393fb91cd7dfd92c820180bf6410283c WatchSource:0}: Error finding container dcb5d82318752b49b4197e182c332495393fb91cd7dfd92c820180bf6410283c: Status 404 returned error can't find the container with id dcb5d82318752b49b4197e182c332495393fb91cd7dfd92c820180bf6410283c
Oct 12 05:41:08 crc kubenswrapper[4930]: W1012 05:41:08.668672 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-3d549eb0630b54209717d14cba5a53efcdf864938c0f3893d31de124d167e9ec WatchSource:0}: Error finding container 3d549eb0630b54209717d14cba5a53efcdf864938c0f3893d31de124d167e9ec: Status 404 returned error can't find the container with id 3d549eb0630b54209717d14cba5a53efcdf864938c0f3893d31de124d167e9ec
Oct 12 05:41:08 crc kubenswrapper[4930]: W1012 05:41:08.676548 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-551860cd9f3b22c05757d4a2fe096582a0876f8f45e084cbb32c28f594e487eb WatchSource:0}: Error finding container 551860cd9f3b22c05757d4a2fe096582a0876f8f45e084cbb32c28f594e487eb: Status 404 returned error can't find the container with id 551860cd9f3b22c05757d4a2fe096582a0876f8f45e084cbb32c28f594e487eb
Oct 12 05:41:08 crc kubenswrapper[4930]: E1012 05:41:08.678197 4930 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.111:6443: connect: connection refused" interval="800ms"
Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.926450 4930
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.927904 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.927940 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.927953 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:08 crc kubenswrapper[4930]: I1012 05:41:08.927978 4930 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 12 05:41:08 crc kubenswrapper[4930]: E1012 05:41:08.928372 4930 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.111:6443: connect: connection refused" node="crc" Oct 12 05:41:09 crc kubenswrapper[4930]: W1012 05:41:09.045291 4930 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.111:6443: connect: connection refused Oct 12 05:41:09 crc kubenswrapper[4930]: E1012 05:41:09.045460 4930 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.111:6443: connect: connection refused" logger="UnhandledError" Oct 12 05:41:09 crc kubenswrapper[4930]: W1012 05:41:09.058701 4930 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.111:6443: connect: connection refused Oct 12 05:41:09 crc kubenswrapper[4930]: E1012 05:41:09.058790 4930 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.111:6443: connect: connection refused" logger="UnhandledError" Oct 12 05:41:09 crc kubenswrapper[4930]: I1012 05:41:09.069452 4930 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.111:6443: connect: connection refused Oct 12 05:41:09 crc kubenswrapper[4930]: I1012 05:41:09.143769 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"3d549eb0630b54209717d14cba5a53efcdf864938c0f3893d31de124d167e9ec"} Oct 12 05:41:09 crc kubenswrapper[4930]: I1012 05:41:09.145016 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"dcb5d82318752b49b4197e182c332495393fb91cd7dfd92c820180bf6410283c"} Oct 12 05:41:09 crc kubenswrapper[4930]: I1012 05:41:09.145924 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"de5b39af1b224b33b50e582213069ad8f51ea11c0811f42dd03ee0607e47f747"} Oct 12 05:41:09 crc kubenswrapper[4930]: I1012 05:41:09.147105 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e22184caa224556788f1444649a3db38ff5b04208d7e88093cc95e847d59b0ed"} Oct 12 05:41:09 crc kubenswrapper[4930]: I1012 05:41:09.148214 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"551860cd9f3b22c05757d4a2fe096582a0876f8f45e084cbb32c28f594e487eb"} Oct 12 05:41:09 crc kubenswrapper[4930]: W1012 05:41:09.173069 4930 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.111:6443: connect: connection refused Oct 12 05:41:09 crc kubenswrapper[4930]: E1012 05:41:09.173159 4930 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.111:6443: connect: connection refused" logger="UnhandledError" Oct 12 05:41:09 crc kubenswrapper[4930]: W1012 05:41:09.288214 4930 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.111:6443: connect: connection refused Oct 12 05:41:09 crc kubenswrapper[4930]: E1012 05:41:09.288331 4930 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.111:6443: connect: connection refused" logger="UnhandledError" Oct 12 05:41:09 crc kubenswrapper[4930]: E1012 05:41:09.479097 4930 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.111:6443: connect: connection refused" interval="1.6s" Oct 12 05:41:09 crc kubenswrapper[4930]: I1012 05:41:09.729364 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:09 crc kubenswrapper[4930]: I1012 05:41:09.732435 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:09 crc kubenswrapper[4930]: I1012 05:41:09.732512 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:09 crc kubenswrapper[4930]: I1012 05:41:09.732539 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:09 crc kubenswrapper[4930]: I1012 05:41:09.732588 4930 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 12 05:41:09 crc kubenswrapper[4930]: E1012 05:41:09.733238 4930 kubelet_node_status.go:99] 
"Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.111:6443: connect: connection refused" node="crc" Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.070651 4930 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.111:6443: connect: connection refused Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.154493 4930 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="7c2895c24173ca66c5aa199d3049c9f6b8dcda56ae7f0a91d92dbd7408d46be2" exitCode=0 Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.154580 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"7c2895c24173ca66c5aa199d3049c9f6b8dcda56ae7f0a91d92dbd7408d46be2"} Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.154677 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.156429 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.156490 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.156510 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.156654 4930 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="ecfafe319660c39bb6291f6fdb7c0dd529065b7db2e0dd52cbf8e42a56c58dd4" exitCode=0 Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.156767 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"ecfafe319660c39bb6291f6fdb7c0dd529065b7db2e0dd52cbf8e42a56c58dd4"} Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.156845 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.158212 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.158244 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.158255 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.159699 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39"} Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.159724 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a"} Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.159750 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337"} Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.159760 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00"} Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.159867 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.160972 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.161022 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.161042 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.163666 4930 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87" exitCode=0 Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.163759 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87"} Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.163835 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.164948 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.164989 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.165008 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.166022 4930 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7" exitCode=0 Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.166069 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7"} Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.166139 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.167077 4930 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.167115 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.167136 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.167367 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.168763 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.168785 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:10 crc kubenswrapper[4930]: I1012 05:41:10.168794 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:10 crc kubenswrapper[4930]: W1012 05:41:10.827284 4930 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.111:6443: connect: connection refused Oct 12 05:41:10 crc kubenswrapper[4930]: E1012 05:41:10.827410 4930 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.111:6443: connect: connection refused" logger="UnhandledError" Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.069952 4930 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.111:6443: connect: connection refused Oct 12 05:41:11 crc kubenswrapper[4930]: E1012 05:41:11.080781 4930 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.111:6443: connect: connection refused" interval="3.2s" Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.118597 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.173908 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ebbbfef668586cda6fda820fbbba5432c59b621a94025e2dd6d7d0733a3dee09"} Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.173981 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"82f1df2fa5060e9f2922e06c412de9fe47f1dd0236b2bba1d81783dfd52a68d7"} Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.173996 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"39f8602e4dd7d746c23c7b303c631d40b2bc2c447acd1491b77be0895c1715a1"} Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.174008 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.175058 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.175105 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.175123 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.178034 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28"} Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.178084 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092"} Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.178106 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4"} Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.178123 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584"} Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.180657 4930 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2" exitCode=0 Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.180801 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2"} Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.180835 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.182127 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.182174 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.182186 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.183583 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"46b72209a6e5e34e3ba9a5d17cc19a2817d8e455eea8a0026674589089df2376"} Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.183597 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.183646 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.184852 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.184879 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.184893 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.184880 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.185018 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.185042 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.334204 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.335299 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.335363 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.335376 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:11 crc kubenswrapper[4930]: I1012 05:41:11.335414 4930 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 12 05:41:11 crc kubenswrapper[4930]: E1012 05:41:11.336456 4930 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.111:6443: connect: connection refused" node="crc" Oct 12 05:41:11 crc kubenswrapper[4930]: W1012 05:41:11.397425 4930 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.111:6443: connect: connection refused Oct 12 05:41:11 crc kubenswrapper[4930]: E1012 05:41:11.397796 4930 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.111:6443: connect: connection refused" logger="UnhandledError" Oct 12 05:41:12 crc kubenswrapper[4930]: I1012 05:41:12.188103 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391"} Oct 12 05:41:12 crc kubenswrapper[4930]: I1012 05:41:12.189287 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:12 crc kubenswrapper[4930]: I1012 05:41:12.190707 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:12 crc kubenswrapper[4930]: I1012 05:41:12.190903 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:12 crc kubenswrapper[4930]: I1012 05:41:12.191063 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:12 crc kubenswrapper[4930]: I1012 05:41:12.191359 4930 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210" exitCode=0 Oct 12 05:41:12 crc kubenswrapper[4930]: I1012 05:41:12.191463 4930 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 12 05:41:12 crc kubenswrapper[4930]: I1012 05:41:12.191498 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:12 crc kubenswrapper[4930]: I1012 05:41:12.192198 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:12 crc kubenswrapper[4930]: I1012 05:41:12.192480 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:12 crc kubenswrapper[4930]: I1012 05:41:12.192841 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:12 crc kubenswrapper[4930]: I1012 05:41:12.193096 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210"} Oct 12 05:41:12 crc kubenswrapper[4930]: I1012 05:41:12.193456 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:12 crc kubenswrapper[4930]: I1012 05:41:12.193655 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:12 crc kubenswrapper[4930]: I1012 05:41:12.193826 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:12 crc kubenswrapper[4930]: I1012 05:41:12.193922 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:12 crc kubenswrapper[4930]: I1012 05:41:12.193977 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:12 crc kubenswrapper[4930]: I1012 05:41:12.194116 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:12 crc kubenswrapper[4930]: I1012 05:41:12.194791 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:12 crc kubenswrapper[4930]: I1012 05:41:12.195806 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Oct 12 05:41:12 crc kubenswrapper[4930]: I1012 05:41:12.196261 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:12 crc kubenswrapper[4930]: I1012 05:41:12.196527 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:12 crc kubenswrapper[4930]: I1012 05:41:12.196827 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:12 crc kubenswrapper[4930]: I1012 05:41:12.196852 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:12 crc kubenswrapper[4930]: I1012 05:41:12.831812 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 12 05:41:13 crc kubenswrapper[4930]: I1012 05:41:13.200378 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68"}
Oct 12 05:41:13 crc kubenswrapper[4930]: I1012 05:41:13.200457 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281"}
Oct 12 05:41:13 crc kubenswrapper[4930]: I1012 05:41:13.200500 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a"}
Oct 12 05:41:13 crc kubenswrapper[4930]: I1012 05:41:13.200464 4930 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 12 05:41:13 crc kubenswrapper[4930]: I1012 05:41:13.200553 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 12 05:41:13 crc kubenswrapper[4930]: I1012 05:41:13.200581 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 12 05:41:13 crc kubenswrapper[4930]: I1012 05:41:13.202197 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:13 crc kubenswrapper[4930]: I1012 05:41:13.202212 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:13 crc kubenswrapper[4930]: I1012 05:41:13.202293 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:13 crc kubenswrapper[4930]: I1012 05:41:13.202315 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:13 crc kubenswrapper[4930]: I1012 05:41:13.202257 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:13 crc kubenswrapper[4930]: I1012 05:41:13.202424 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:13 crc kubenswrapper[4930]: I1012 05:41:13.743583 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 12 05:41:14 crc kubenswrapper[4930]: I1012 05:41:14.209290 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2"}
Oct 12 05:41:14 crc kubenswrapper[4930]: I1012 05:41:14.209371 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057"}
Oct 12 05:41:14 crc kubenswrapper[4930]: I1012 05:41:14.209309 4930 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 12 05:41:14 crc kubenswrapper[4930]: I1012 05:41:14.209417 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 12 05:41:14 crc kubenswrapper[4930]: I1012 05:41:14.209439 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 12 05:41:14 crc kubenswrapper[4930]: I1012 05:41:14.210832 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:14 crc kubenswrapper[4930]: I1012 05:41:14.210857 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:14 crc kubenswrapper[4930]: I1012 05:41:14.210888 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:14 crc kubenswrapper[4930]: I1012 05:41:14.210910 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:14 crc kubenswrapper[4930]: I1012 05:41:14.210909 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:14 crc kubenswrapper[4930]: I1012 05:41:14.211053 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:14 crc kubenswrapper[4930]: I1012 05:41:14.231872 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 12 05:41:14 crc kubenswrapper[4930]: I1012 05:41:14.537563 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 12 05:41:14 crc kubenswrapper[4930]: I1012 05:41:14.540357 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:14 crc kubenswrapper[4930]: I1012 05:41:14.540449 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:14 crc kubenswrapper[4930]: I1012 05:41:14.540476 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:14 crc kubenswrapper[4930]: I1012 05:41:14.540538 4930 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 12 05:41:14 crc kubenswrapper[4930]: I1012 05:41:14.940904 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 12 05:41:14 crc kubenswrapper[4930]: I1012 05:41:14.941265 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 12 05:41:14 crc kubenswrapper[4930]: I1012 05:41:14.942841 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:14 crc kubenswrapper[4930]: I1012 05:41:14.942914 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:14 crc kubenswrapper[4930]: I1012 05:41:14.942936 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:15 crc kubenswrapper[4930]: I1012 05:41:15.211774 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:15 crc kubenswrapper[4930]: I1012 05:41:15.211833 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:15 crc kubenswrapper[4930]: I1012 05:41:15.214197 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:15 crc kubenswrapper[4930]: I1012 05:41:15.214435 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:15 crc kubenswrapper[4930]: I1012 05:41:15.214614 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:15 crc kubenswrapper[4930]: I1012 05:41:15.214681 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:15 crc kubenswrapper[4930]: I1012 05:41:15.214709 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:15 crc kubenswrapper[4930]: I1012 05:41:15.214628 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:15 crc kubenswrapper[4930]: I1012 05:41:15.771209 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 05:41:15 crc kubenswrapper[4930]: I1012 05:41:15.771413 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:15 crc kubenswrapper[4930]: I1012 05:41:15.772971 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:15 crc kubenswrapper[4930]: I1012 05:41:15.773038 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:15 crc kubenswrapper[4930]: I1012 05:41:15.773062 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:15 crc kubenswrapper[4930]: I1012 05:41:15.903852 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 05:41:16 crc kubenswrapper[4930]: I1012 05:41:16.215096 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:16 crc kubenswrapper[4930]: I1012 05:41:16.216320 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:16 crc kubenswrapper[4930]: I1012 05:41:16.216380 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:16 crc kubenswrapper[4930]: I1012 05:41:16.216406 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:16 crc kubenswrapper[4930]: I1012 05:41:16.228282 4930 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 12 05:41:16 crc kubenswrapper[4930]: I1012 05:41:16.228465 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:16 crc kubenswrapper[4930]: I1012 05:41:16.229782 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:16 crc kubenswrapper[4930]: I1012 05:41:16.229833 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:16 crc kubenswrapper[4930]: I1012 05:41:16.229859 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:17 crc kubenswrapper[4930]: I1012 05:41:17.941723 4930 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 12 05:41:17 crc kubenswrapper[4930]: I1012 05:41:17.941867 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 12 05:41:18 crc kubenswrapper[4930]: E1012 05:41:18.232245 4930 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 12 05:41:18 crc kubenswrapper[4930]: I1012 05:41:18.819788 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 12 05:41:18 crc kubenswrapper[4930]: I1012 05:41:18.820030 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:18 crc kubenswrapper[4930]: I1012 05:41:18.821630 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:18 crc kubenswrapper[4930]: I1012 05:41:18.821697 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:18 crc kubenswrapper[4930]: I1012 05:41:18.821716 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:19 crc kubenswrapper[4930]: I1012 05:41:19.626563 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 05:41:19 crc kubenswrapper[4930]: I1012 05:41:19.626793 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:19 crc kubenswrapper[4930]: I1012 05:41:19.628249 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:19 crc kubenswrapper[4930]: I1012 05:41:19.628291 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:19 crc kubenswrapper[4930]: I1012 05:41:19.628309 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:19 crc kubenswrapper[4930]: I1012 05:41:19.633197 4930 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 05:41:20 crc kubenswrapper[4930]: I1012 05:41:20.226165 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:20 crc kubenswrapper[4930]: I1012 05:41:20.227527 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:20 crc kubenswrapper[4930]: I1012 05:41:20.227579 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:20 crc kubenswrapper[4930]: I1012 05:41:20.227597 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:20 crc kubenswrapper[4930]: I1012 05:41:20.230596 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 05:41:21 crc kubenswrapper[4930]: I1012 05:41:21.230666 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:21 crc kubenswrapper[4930]: I1012 05:41:21.232962 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:21 crc kubenswrapper[4930]: I1012 05:41:21.233024 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:21 crc kubenswrapper[4930]: I1012 05:41:21.233045 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:22 crc kubenswrapper[4930]: W1012 05:41:22.056487 4930 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 12 05:41:22 crc kubenswrapper[4930]: I1012 05:41:22.056594 4930 trace.go:236] Trace[23256371]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (12-Oct-2025 05:41:12.053) (total time: 10002ms): Oct 12 05:41:22 crc kubenswrapper[4930]: Trace[23256371]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (05:41:22.056) Oct 12 05:41:22 crc kubenswrapper[4930]: Trace[23256371]: [10.002885173s] [10.002885173s] END Oct 12 05:41:22 crc kubenswrapper[4930]: E1012 05:41:22.056617 4930 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 12 05:41:22 crc kubenswrapper[4930]: I1012 05:41:22.070972 4930 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 12 05:41:22 crc kubenswrapper[4930]: W1012 05:41:22.161093 4930 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 12 05:41:22 crc 
kubenswrapper[4930]: I1012 05:41:22.161231 4930 trace.go:236] Trace[290449113]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (12-Oct-2025 05:41:12.159) (total time: 10001ms): Oct 12 05:41:22 crc kubenswrapper[4930]: Trace[290449113]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (05:41:22.161) Oct 12 05:41:22 crc kubenswrapper[4930]: Trace[290449113]: [10.001847833s] [10.001847833s] END Oct 12 05:41:22 crc kubenswrapper[4930]: E1012 05:41:22.161267 4930 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 12 05:41:22 crc kubenswrapper[4930]: I1012 05:41:22.867188 4930 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 12 05:41:22 crc kubenswrapper[4930]: I1012 05:41:22.867241 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 12 05:41:22 crc kubenswrapper[4930]: I1012 05:41:22.873782 4930 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 12 05:41:22 crc kubenswrapper[4930]: I1012 05:41:22.873853 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 12 05:41:23 crc kubenswrapper[4930]: I1012 05:41:23.237945 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 12 05:41:23 crc kubenswrapper[4930]: I1012 05:41:23.240634 4930 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391" exitCode=255 Oct 12 05:41:23 crc kubenswrapper[4930]: I1012 05:41:23.240698 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391"} Oct 12 05:41:23 crc kubenswrapper[4930]: I1012 05:41:23.240982 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:23 crc kubenswrapper[4930]: I1012 05:41:23.242186 4930 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:23 crc kubenswrapper[4930]: I1012 05:41:23.242241 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:23 crc kubenswrapper[4930]: I1012 05:41:23.242259 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:23 crc kubenswrapper[4930]: I1012 05:41:23.243164 4930 scope.go:117] "RemoveContainer" containerID="5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391" Oct 12 05:41:23 crc kubenswrapper[4930]: I1012 05:41:23.752341 4930 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 12 05:41:23 crc kubenswrapper[4930]: [+]log ok Oct 12 05:41:23 crc kubenswrapper[4930]: [+]etcd ok Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/generic-apiserver-start-informers ok Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/priority-and-fairness-filter ok Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/start-apiextensions-informers ok Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/start-apiextensions-controllers ok Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/crd-informer-synced ok Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/start-system-namespaces-controller ok Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 12 05:41:23 crc kubenswrapper[4930]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Oct 12 05:41:23 crc kubenswrapper[4930]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/bootstrap-controller ok Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/start-kube-aggregator-informers ok Oct 12 05:41:23 crc kubenswrapper[4930]: 
Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/apiservice-status-remote-available-controller ok
Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/apiservice-registration-controller ok
Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/apiservice-wait-for-first-sync ok
Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/apiservice-discovery-controller ok
Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/kube-apiserver-autoregistration ok
Oct 12 05:41:23 crc kubenswrapper[4930]: [+]autoregister-completion ok
Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/apiservice-openapi-controller ok
Oct 12 05:41:23 crc kubenswrapper[4930]: [+]poststarthook/apiservice-openapiv3-controller ok
Oct 12 05:41:23 crc kubenswrapper[4930]: livez check failed
Oct 12 05:41:23 crc kubenswrapper[4930]: I1012 05:41:23.754892 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 12 05:41:24 crc kubenswrapper[4930]: I1012 05:41:24.245933 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Oct 12 05:41:24 crc kubenswrapper[4930]: I1012 05:41:24.249392 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a"}
Oct 12 05:41:24 crc kubenswrapper[4930]: I1012 05:41:24.249545 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 12 05:41:24 crc kubenswrapper[4930]: I1012 05:41:24.250657 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:24 crc kubenswrapper[4930]: I1012 05:41:24.250694 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:24 crc kubenswrapper[4930]: I1012 05:41:24.250706 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:26 crc kubenswrapper[4930]: I1012 05:41:26.018608 4930 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Oct 12 05:41:27 crc kubenswrapper[4930]: I1012 05:41:27.665271 4930 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Oct 12 05:41:27 crc kubenswrapper[4930]: E1012 05:41:27.858797 4930 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Oct 12 05:41:27 crc kubenswrapper[4930]: I1012 05:41:27.858903 4930 trace.go:236] Trace[1086625712]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (12-Oct-2025 05:41:16.047) (total time: 11811ms):
Oct 12 05:41:27 crc kubenswrapper[4930]: Trace[1086625712]: ---"Objects listed" error: 11811ms (05:41:27.858)
Oct 12 05:41:27 crc kubenswrapper[4930]: Trace[1086625712]: [11.811224154s] [11.811224154s] END
Oct 12 05:41:27 crc kubenswrapper[4930]: I1012 05:41:27.858945 4930 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Oct 12 05:41:27 crc kubenswrapper[4930]: I1012 05:41:27.860020 4930 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Oct 12 05:41:27 crc kubenswrapper[4930]: I1012 05:41:27.860678 4930 trace.go:236] Trace[1706094439]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (12-Oct-2025 05:41:15.604) (total time: 12256ms):
Oct 12 05:41:27 crc kubenswrapper[4930]: Trace[1706094439]: ---"Objects listed" error: 12256ms (05:41:27.860)
Oct 12 05:41:27 crc kubenswrapper[4930]: Trace[1706094439]: [12.256157713s] [12.256157713s] END
Oct 12 05:41:27 crc kubenswrapper[4930]: I1012 05:41:27.860719 4930 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Oct 12 05:41:27 crc kubenswrapper[4930]: E1012 05:41:27.866211 4930 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Oct 12 05:41:27 crc kubenswrapper[4930]: I1012 05:41:27.942065 4930 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 12 05:41:27 crc kubenswrapper[4930]: I1012 05:41:27.942147 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.062612 4930 apiserver.go:52] "Watching apiserver"
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.064328 4930 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.064559 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"]
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.065350 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.065380 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.065574 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.065743 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.065819 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.065824 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.065850 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.065895 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.066111 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.068759 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.068760 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.068793 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.068776 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.068874 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.068879 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.069195 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.070017 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.073060 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.075910 4930 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.106121 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.116795 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.125896 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.133511 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.143107 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.155062 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.161451 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.161639 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.161770 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.161877 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.161954 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.161978 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162070 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162097 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162120 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162119 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162317 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162422 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162473 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162477 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162509 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162541 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162562 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162586 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162611 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162634 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162657 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162677 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162702 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162724 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162776 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162803 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162827 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162851 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162875 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162900 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162918 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162941 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162963 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.162985 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.163003 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.163019 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.163043 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.163067 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.163089 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.163107 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.163131 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.163157 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.163177 4930 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.163198 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.163220 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.163435 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.163498 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.163523 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.163548 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.163572 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.163595 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.163617 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.164395 4930 
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.164529 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.164775 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.164803 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.164997 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.165007 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.165203 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.165211 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.165203 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.165265 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.165425 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.163640 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.165860 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.165887 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.165909 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.165936 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.165961 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.165984 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166008 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166032 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166054 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166079 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166106 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166129 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166152 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166174 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166197 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166218 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166240 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166265 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166287 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166310 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166331 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166352 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166373 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166394 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166415 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166435 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166456 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166478 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166502 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166527 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166552 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166578 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166601 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166622 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166645 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166671 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166695 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166718 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166769 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166791 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166812 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166833 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166856 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166878 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166900 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166925 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166947 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166969 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166993 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167054 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167082 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167106 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167127 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167209 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167233 4930 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167255 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167275 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167297 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167320 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167362 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167383 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167408 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167430 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167451 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167472 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167493 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167514 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167537 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167559 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167581 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167605 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167627 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167652 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167674 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167696 4930 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167717 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167772 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167795 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167816 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167838 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167859 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167881 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167902 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167924 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167945 4930 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167967 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167990 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168013 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168034 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168056 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168102 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168124 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168146 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168168 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 
05:41:28.168186 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168203 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168227 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168246 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168262 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168278 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168293 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168311 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168326 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168342 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168358 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168373 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168402 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168417 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168434 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168450 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168466 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168481 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168496 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168511 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168528 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168543 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168558 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168574 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168592 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.165881 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168614 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166039 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166185 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168635 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168651 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168667 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168683 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168699 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168717 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168749 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168772 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168790 4930 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168806 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168821 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168837 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168852 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168868 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168883 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168900 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168923 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168941 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168956 4930 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168972 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168989 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169006 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169023 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169039 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169056 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169072 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169089 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169105 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169145 4930 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169170 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169188 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169209 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169228 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169244 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169261 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169278 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169295 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169321 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169344 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169366 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169385 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169403 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169467 4930 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169478 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169488 4930 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169499 4930 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169508 4930 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169518 4930 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169527 4930 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169537 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169548 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169558 4930 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169567 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169577 4930 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169586 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169595 4930 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169604 4930 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169614 4930 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169624 4930 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169634 4930 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169644 4930 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.189477 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166341 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166484 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.203509 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167253 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167423 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167569 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167723 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.203576 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.202453 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.167891 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168290 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168450 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.168599 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.203656 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.203679 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169761 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.169793 4930 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.169901 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.170126 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.170300 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.170422 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.170507 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.170708 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.170787 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.171011 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.171135 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.171263 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.171604 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.171647 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.171681 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.172280 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.172716 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.175063 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.175223 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.175208 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.175712 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.175912 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.175915 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.173131 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.176200 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.176215 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.176458 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.176482 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.176651 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). 
InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.176659 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.176935 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.177175 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.177201 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.177340 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.177529 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.178151 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.179693 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.179951 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.179979 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.181130 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.183569 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.184043 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.184253 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.184267 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.184654 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:41:28.68463303 +0000 UTC m=+21.226734795 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.186158 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.186260 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.187039 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.188993 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.189021 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.189114 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.189562 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.189580 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.189697 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.190042 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.190381 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.190465 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.191467 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). 
InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.191500 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.191778 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.192061 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.192533 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.192724 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.192909 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.192936 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.193201 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.193253 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.193758 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.193632 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.194315 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.194591 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.194610 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.194917 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.194925 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.195256 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.195618 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.195835 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.196372 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.196515 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.196844 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.197118 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.198367 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.201853 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.202163 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.202439 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.202609 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.202780 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.203052 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.203187 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.203241 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.203257 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.203440 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.166904 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.203935 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.204020 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.204027 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). 
InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.203410 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.204078 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.204242 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.205076 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.205087 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.204377 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.204510 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.204496 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.205267 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.204717 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.204886 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.205313 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.207549 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.207918 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.206218 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 05:41:28.705864791 +0000 UTC m=+21.247966596 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.208524 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.208667 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.208834 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.208860 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.209311 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.209429 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.209540 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.209587 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.209800 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.210068 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.210281 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.210929 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.211533 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.211768 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.211775 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.211887 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.212366 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.212696 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.212729 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.213140 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.213564 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.214519 4930 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.214761 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.214930 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-12 05:41:28.71490281 +0000 UTC m=+21.257004575 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.215076 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.215112 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.215317 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.215374 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.216107 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.217235 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.217307 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.217351 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.217415 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.218584 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.218653 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.219410 4930 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.219424 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.219636 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.222429 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.225165 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.231947 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.232039 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.232074 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.232097 4930 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.232182 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-12 05:41:28.732152842 +0000 UTC m=+21.274254807 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.235939 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.236216 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.236279 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.236306 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.236494 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.236525 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.236686 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.236723 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.236806 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.236942 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.237294 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.237320 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.237336 4930 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.237377 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.237403 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-12 05:41:28.737382746 +0000 UTC m=+21.279484511 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.239377 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.239653 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.239757 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.240798 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.242391 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.245475 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.249623 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.251385 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.253013 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.255677 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.256067 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.256560 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.265182 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270122 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270185 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270245 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270256 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270266 4930 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270276 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270287 4930 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270296 4930 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270322 4930 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 
05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270330 4930 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270338 4930 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270346 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270354 4930 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270362 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270369 4930 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270397 4930 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270406 4930 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270414 4930 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270423 4930 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270431 4930 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270439 4930 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270447 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: 
I1012 05:41:28.270470 4930 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270480 4930 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270487 4930 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270496 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270509 4930 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270518 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270526 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270551 4930 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270561 4930 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270569 4930 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270578 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270587 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270595 4930 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270605 4930 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.270923 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271045 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271091 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271391 4930 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271406 4930 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271415 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271423 4930 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271433 4930 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271441 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271467 4930 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271478 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node 
\"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271487 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271496 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271505 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271514 4930 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271523 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271532 4930 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271546 4930 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271555 4930 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271565 4930 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271574 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271584 4930 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271592 4930 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271601 4930 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271610 4930 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271619 4930 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271630 4930 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271638 4930 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271647 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271657 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271665 4930 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271674 4930 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271683 4930 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271691 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271700 4930 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271708 4930 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271717 4930 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271725 4930 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271743 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271751 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271760 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271770 4930 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271778 4930 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271786 4930 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271795 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271804 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271813 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271822 4930 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271831 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271840 4930 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271848 4930 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271860 4930 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271869 4930 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271877 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271887 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271896 4930 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271904 4930 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271912 4930 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271920 4930 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271928 4930 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271937 4930 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271945 4930 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271953 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271961 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271969 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271978 4930 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271986 4930 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.271994 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272002 4930 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272011 4930 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272019 4930 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272041 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272050 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272058 4930 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272067 4930 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272076 4930 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272086 4930 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272094 4930 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272103 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272113 4930 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272123 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272132 4930 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272141 4930 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272151 4930 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272159 4930 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272168 4930 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272178 4930 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272187 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272196 
4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272205 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272213 4930 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272222 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272230 4930 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272239 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272248 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272256 4930 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272265 4930 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272273 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272283 4930 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272291 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272299 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc 
kubenswrapper[4930]: I1012 05:41:28.272312 4930 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272320 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272328 4930 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272335 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272344 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272369 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272378 4930 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272387 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272396 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272404 4930 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272413 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272421 4930 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272429 4930 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc 
kubenswrapper[4930]: I1012 05:41:28.272437 4930 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272445 4930 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272453 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272460 4930 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272469 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272476 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272484 4930 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272492 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272500 4930 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272508 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272517 4930 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272524 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272532 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: 
I1012 05:41:28.272540 4930 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272548 4930 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272556 4930 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272563 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272570 4930 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272578 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272588 4930 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272595 4930 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272603 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272611 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272619 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272627 4930 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272635 4930 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272650 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272657 4930 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.272665 4930 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.278689 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.289753 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.373053 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.391070 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.397334 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.405097 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 12 05:41:28 crc kubenswrapper[4930]: W1012 05:41:28.409651 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-8adcc3f72d2ed23f78aefbd7fe6e8948432d94e1d73596e6e2941a26d9f7e4ab WatchSource:0}: Error finding container 8adcc3f72d2ed23f78aefbd7fe6e8948432d94e1d73596e6e2941a26d9f7e4ab: Status 404 returned error can't find the container with id 8adcc3f72d2ed23f78aefbd7fe6e8948432d94e1d73596e6e2941a26d9f7e4ab Oct 12 05:41:28 crc kubenswrapper[4930]: W1012 05:41:28.420117 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-dc8b2484a7aa153414835ab28d813e0c692defaa4c9ac159082b3254201d290b WatchSource:0}: Error finding container dc8b2484a7aa153414835ab28d813e0c692defaa4c9ac159082b3254201d290b: Status 404 returned error can't find the container with id dc8b2484a7aa153414835ab28d813e0c692defaa4c9ac159082b3254201d290b Oct 12 05:41:28 crc kubenswrapper[4930]: W1012 05:41:28.434401 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-d829e88f123dd055dc32237a80177840c93dfc63c180cd545976729c93407e5b WatchSource:0}: Error finding container d829e88f123dd055dc32237a80177840c93dfc63c180cd545976729c93407e5b: Status 404 returned error can't find the container with id d829e88f123dd055dc32237a80177840c93dfc63c180cd545976729c93407e5b Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.749423 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.750022 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.756689 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.759165 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.763899 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.772554 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.778704 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.778858 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:41:29.778833298 +0000 UTC m=+22.320935103 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.779031 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.779146 4930 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.779213 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 05:41:29.779199745 +0000 UTC m=+22.321301540 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.779629 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.779833 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.779868 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.779888 4930 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.779945 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-12 05:41:29.77992974 +0000 UTC m=+22.322031545 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.779996 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.780033 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.780109 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.780133 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.780142 4930 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.780177 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-12 05:41:29.780167954 +0000 UTC m=+22.322269719 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.780212 4930 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 05:41:28 crc kubenswrapper[4930]: E1012 05:41:28.780304 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-12 05:41:29.780287477 +0000 UTC m=+22.322389282 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.791354 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.807791 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.826245 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.851460 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.854126 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.866457 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.872151 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.874406 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.889609 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.922049 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.942614 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.955594 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.973565 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:28 crc kubenswrapper[4930]: I1012 05:41:28.996818 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 
05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.015366 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\
\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.033491 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.044552 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.071032 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 
05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.084920 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.097535 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.110149 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.122416 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.263333 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d829e88f123dd055dc32237a80177840c93dfc63c180cd545976729c93407e5b"} Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.264510 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba"} Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.264564 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286"} Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.264575 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"dc8b2484a7aa153414835ab28d813e0c692defaa4c9ac159082b3254201d290b"} Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.266549 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d"} Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.266581 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8adcc3f72d2ed23f78aefbd7fe6e8948432d94e1d73596e6e2941a26d9f7e4ab"} Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.282668 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-br2vl"] Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.282971 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-br2vl" Oct 12 05:41:29 crc kubenswrapper[4930]: E1012 05:41:29.283526 4930 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.292463 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.292541 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.292601 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.308746 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.321239 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.336879 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.353623 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.366679 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.381286 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.384782 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/545bf51c-0b04-4166-a984-ec9c1276470a-hosts-file\") pod \"node-resolver-br2vl\" (UID: \"545bf51c-0b04-4166-a984-ec9c1276470a\") " pod="openshift-dns/node-resolver-br2vl" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.384852 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tpks\" (UniqueName: \"kubernetes.io/projected/545bf51c-0b04-4166-a984-ec9c1276470a-kube-api-access-5tpks\") pod \"node-resolver-br2vl\" (UID: \"545bf51c-0b04-4166-a984-ec9c1276470a\") " pod="openshift-dns/node-resolver-br2vl" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.397708 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.414114 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 
05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.439913 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9b
e8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.455392 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.469825 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.483634 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.486044 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tpks\" (UniqueName: \"kubernetes.io/projected/545bf51c-0b04-4166-a984-ec9c1276470a-kube-api-access-5tpks\") pod \"node-resolver-br2vl\" (UID: \"545bf51c-0b04-4166-a984-ec9c1276470a\") " pod="openshift-dns/node-resolver-br2vl" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.486562 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/545bf51c-0b04-4166-a984-ec9c1276470a-hosts-file\") pod \"node-resolver-br2vl\" (UID: \"545bf51c-0b04-4166-a984-ec9c1276470a\") " pod="openshift-dns/node-resolver-br2vl" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.486663 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/545bf51c-0b04-4166-a984-ec9c1276470a-hosts-file\") pod \"node-resolver-br2vl\" (UID: \"545bf51c-0b04-4166-a984-ec9c1276470a\") " pod="openshift-dns/node-resolver-br2vl" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.513564 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.514056 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tpks\" (UniqueName: \"kubernetes.io/projected/545bf51c-0b04-4166-a984-ec9c1276470a-kube-api-access-5tpks\") pod \"node-resolver-br2vl\" (UID: \"545bf51c-0b04-4166-a984-ec9c1276470a\") " pod="openshift-dns/node-resolver-br2vl" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.537026 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.565810 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.589364 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 
05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.604598 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-br2vl" Oct 12 05:41:29 crc kubenswrapper[4930]: W1012 05:41:29.618552 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod545bf51c_0b04_4166_a984_ec9c1276470a.slice/crio-7f0176e4e5ba011654dfa340f47ebe527accb86e89090edf9bacea42d2e9a761 WatchSource:0}: Error finding container 7f0176e4e5ba011654dfa340f47ebe527accb86e89090edf9bacea42d2e9a761: Status 404 returned error can't find the container with id 7f0176e4e5ba011654dfa340f47ebe527accb86e89090edf9bacea42d2e9a761 Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.621047 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.697047 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-tq29s"] Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.697393 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.698379 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-mk4tf"] Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.698908 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.699173 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-vwttt"] Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.699778 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vwttt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.699960 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.700639 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.701178 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 12 05:41:29 crc kubenswrapper[4930]: W1012 05:41:29.701444 4930 reflector.go:561] object-"openshift-machine-config-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Oct 12 05:41:29 crc kubenswrapper[4930]: E1012 05:41:29.701481 4930 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 12 05:41:29 crc kubenswrapper[4930]: W1012 05:41:29.701515 4930 reflector.go:561] object-"openshift-machine-config-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Oct 12 05:41:29 crc kubenswrapper[4930]: E1012 05:41:29.701549 4930 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace 
\"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 12 05:41:29 crc kubenswrapper[4930]: W1012 05:41:29.702143 4930 reflector.go:561] object-"openshift-machine-config-operator"/"proxy-tls": failed to list *v1.Secret: secrets "proxy-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Oct 12 05:41:29 crc kubenswrapper[4930]: E1012 05:41:29.702175 4930 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"proxy-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"proxy-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.702194 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.702273 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.702437 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mdhw6"] Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.702635 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.702659 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.706279 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.706459 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.706486 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.711705 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.711868 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.712456 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.712571 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.712666 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.712713 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.712821 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.734285 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943
ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.754321 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.766763 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.788965 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789044 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789066 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789089 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c1c3ae9e-26ae-418f-b261-eabc4302b332-multus-daemon-config\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789107 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-cnibin\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789121 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-host-var-lib-cni-bin\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789137 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789155 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789171 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1c3ae9e-26ae-418f-b261-eabc4302b332-cni-binary-copy\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789191 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-host-run-k8s-cni-cncf-io\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789206 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-multus-conf-dir\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789221 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s29kh\" (UniqueName: \"kubernetes.io/projected/c1c3ae9e-26ae-418f-b261-eabc4302b332-kube-api-access-s29kh\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789238 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-hostroot\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789252 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-host-run-multus-certs\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789270 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/02f8684c-a3e4-44e8-9741-9f54488d8d8d-mcd-auth-proxy-config\") pod \"machine-config-daemon-mk4tf\" (UID: \"02f8684c-a3e4-44e8-9741-9f54488d8d8d\") " pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789286 4930 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d928520f-ca1d-4cca-b966-c1e6c9168db0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vwttt\" (UID: \"d928520f-ca1d-4cca-b966-c1e6c9168db0\") " pod="openshift-multus/multus-additional-cni-plugins-vwttt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789303 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4m7n\" (UniqueName: \"kubernetes.io/projected/d928520f-ca1d-4cca-b966-c1e6c9168db0-kube-api-access-l4m7n\") pod \"multus-additional-cni-plugins-vwttt\" (UID: \"d928520f-ca1d-4cca-b966-c1e6c9168db0\") " pod="openshift-multus/multus-additional-cni-plugins-vwttt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789322 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d928520f-ca1d-4cca-b966-c1e6c9168db0-os-release\") pod \"multus-additional-cni-plugins-vwttt\" (UID: \"d928520f-ca1d-4cca-b966-c1e6c9168db0\") " pod="openshift-multus/multus-additional-cni-plugins-vwttt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789336 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-host-run-netns\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789352 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d928520f-ca1d-4cca-b966-c1e6c9168db0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vwttt\" (UID: \"d928520f-ca1d-4cca-b966-c1e6c9168db0\") " pod="openshift-multus/multus-additional-cni-plugins-vwttt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789367 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-system-cni-dir\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789382 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-multus-cni-dir\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789396 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/02f8684c-a3e4-44e8-9741-9f54488d8d8d-rootfs\") pod \"machine-config-daemon-mk4tf\" (UID: \"02f8684c-a3e4-44e8-9741-9f54488d8d8d\") " pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789409 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqrgm\" (UniqueName: \"kubernetes.io/projected/02f8684c-a3e4-44e8-9741-9f54488d8d8d-kube-api-access-zqrgm\") pod 
\"machine-config-daemon-mk4tf\" (UID: \"02f8684c-a3e4-44e8-9741-9f54488d8d8d\") " pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789424 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d928520f-ca1d-4cca-b966-c1e6c9168db0-cnibin\") pod \"multus-additional-cni-plugins-vwttt\" (UID: \"d928520f-ca1d-4cca-b966-c1e6c9168db0\") " pod="openshift-multus/multus-additional-cni-plugins-vwttt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789438 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-host-var-lib-cni-multus\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789453 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-etc-kubernetes\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789468 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d928520f-ca1d-4cca-b966-c1e6c9168db0-system-cni-dir\") pod \"multus-additional-cni-plugins-vwttt\" (UID: \"d928520f-ca1d-4cca-b966-c1e6c9168db0\") " pod="openshift-multus/multus-additional-cni-plugins-vwttt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789484 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d928520f-ca1d-4cca-b966-c1e6c9168db0-cni-binary-copy\") pod \"multus-additional-cni-plugins-vwttt\" (UID: \"d928520f-ca1d-4cca-b966-c1e6c9168db0\") " pod="openshift-multus/multus-additional-cni-plugins-vwttt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789500 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-os-release\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789514 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-multus-socket-dir-parent\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789528 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-host-var-lib-kubelet\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.789545 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02f8684c-a3e4-44e8-9741-9f54488d8d8d-proxy-tls\") pod \"machine-config-daemon-mk4tf\" (UID: \"02f8684c-a3e4-44e8-9741-9f54488d8d8d\") " pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 05:41:29 crc kubenswrapper[4930]: E1012 05:41:29.789636 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:41:31.789621012 +0000 UTC m=+24.331722767 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:41:29 crc kubenswrapper[4930]: E1012 05:41:29.789719 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 05:41:29 crc kubenswrapper[4930]: E1012 05:41:29.789731 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 05:41:29 crc kubenswrapper[4930]: E1012 05:41:29.789759 4930 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:41:29 crc kubenswrapper[4930]: E1012 05:41:29.789796 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-12 05:41:31.789789546 +0000 UTC m=+24.331891311 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:41:29 crc kubenswrapper[4930]: E1012 05:41:29.790073 4930 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 05:41:29 crc kubenswrapper[4930]: E1012 05:41:29.790096 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 05:41:31.790089782 +0000 UTC m=+24.332191547 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 05:41:29 crc kubenswrapper[4930]: E1012 05:41:29.790185 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 05:41:29 crc kubenswrapper[4930]: E1012 05:41:29.790195 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 05:41:29 crc kubenswrapper[4930]: E1012 05:41:29.790202 4930 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:41:29 crc kubenswrapper[4930]: E1012 05:41:29.790223 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-12 05:41:31.790217544 +0000 UTC m=+24.332319309 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:41:29 crc kubenswrapper[4930]: E1012 05:41:29.790264 4930 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 05:41:29 crc kubenswrapper[4930]: E1012 05:41:29.790287 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 05:41:31.790278855 +0000 UTC m=+24.332380620 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.808086 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
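
The "Failed to update status for pod" entries that follow carry strategic-merge-patch payloads: the $setElementOrder/conditions directive pins the ordering of the merged conditions list while only the changed entries carry full content. The patches themselves are well-formed; they are rejected because a webhook in front of the API server cannot present a valid certificate. A reduced sketch of how such a patch merges, using the apimachinery strategicpatch helper with condition values cut down from the payload above:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
        "k8s.io/apimachinery/pkg/util/strategicpatch"
    )

    func main() {
        // Stored status, reduced to two conditions.
        original := []byte(`{"status":{"conditions":[
            {"type":"Ready","status":"True"},
            {"type":"PodScheduled","status":"True"}]}}`)

        // A kubelet-style patch: $setElementOrder fixes the list order, and
        // only the Ready condition carries updated fields.
        patch := []byte(`{"status":{
            "$setElementOrder/conditions":[{"type":"Ready"},{"type":"PodScheduled"}],
            "conditions":[{"type":"Ready","status":"False","reason":"ContainersNotReady"}]}}`)

        merged, err := strategicpatch.StrategicMergePatch(original, patch, corev1.Pod{})
        if err != nil {
            panic(err)
        }
        fmt.Println(string(merged))
    }

The merge keys conditions by their type field, so entries the patch does not mention survive untouched.
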
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.824981 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
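
Every one of these status patches dies on the same root cause, repeated verbatim: the pod.network-node-identity.openshift.io webhook at 127.0.0.1:9743 serves a certificate that expired on 2025-08-24, well before the node's current clock of 2025-10-12. The check that fails is ordinary x509 validity; a small standard-library sketch that reproduces it against the same endpoint (the address is taken from the log, everything else is illustrative):

    package main

    import (
        "crypto/tls"
        "fmt"
        "time"
    )

    func main() {
        // Skip chain verification so the handshake completes and we can
        // inspect the presented certificate ourselves.
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            fmt.Println("dial:", err)
            return
        }
        defer conn.Close()

        cert := conn.ConnectionState().PeerCertificates[0]
        now := time.Now().UTC()
        switch {
        case now.After(cert.NotAfter):
            fmt.Printf("certificate has expired: current time %s is after %s\n",
                now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
        case now.Before(cert.NotBefore):
            fmt.Printf("certificate is not yet valid: current time %s is before %s\n",
                now.Format(time.RFC3339), cert.NotBefore.Format(time.RFC3339))
        default:
            fmt.Println("certificate valid until", cert.NotAfter.Format(time.RFC3339))
        }
    }

The six-week gap between expiry and the node clock is consistent with a CRC VM resumed long after its certificates were minted; kubelet keeps retrying, and the status updates go through once the certificates are rotated.
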
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.846563 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 
05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.861259 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.874097 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.887799 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890096 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-etc-kubernetes\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890129 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d928520f-ca1d-4cca-b966-c1e6c9168db0-cni-binary-copy\") pod \"multus-additional-cni-plugins-vwttt\" (UID: \"d928520f-ca1d-4cca-b966-c1e6c9168db0\") " pod="openshift-multus/multus-additional-cni-plugins-vwttt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890147 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-os-release\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890166 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-multus-socket-dir-parent\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890188 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0add0fa2-092f-4dcc-8c72-82881564bf63-ovn-node-metrics-cert\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890206 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/d928520f-ca1d-4cca-b966-c1e6c9168db0-system-cni-dir\") pod \"multus-additional-cni-plugins-vwttt\" (UID: \"d928520f-ca1d-4cca-b966-c1e6c9168db0\") " pod="openshift-multus/multus-additional-cni-plugins-vwttt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890226 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-systemd-units\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890246 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-var-lib-openvswitch\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890265 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-etc-openvswitch\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890285 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-run-ovn\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890304 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-host-var-lib-kubelet\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890254 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-etc-kubernetes\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890322 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d928520f-ca1d-4cca-b966-c1e6c9168db0-system-cni-dir\") pod \"multus-additional-cni-plugins-vwttt\" (UID: \"d928520f-ca1d-4cca-b966-c1e6c9168db0\") " pod="openshift-multus/multus-additional-cni-plugins-vwttt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890352 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02f8684c-a3e4-44e8-9741-9f54488d8d8d-proxy-tls\") pod \"machine-config-daemon-mk4tf\" (UID: \"02f8684c-a3e4-44e8-9741-9f54488d8d8d\") " pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890430 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-os-release\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890455 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-host-var-lib-kubelet\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890467 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-multus-socket-dir-parent\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890547 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c1c3ae9e-26ae-418f-b261-eabc4302b332-multus-daemon-config\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890604 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-kubelet\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890627 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-slash\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890641 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-log-socket\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890683 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-run-ovn-kubernetes\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890702 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0add0fa2-092f-4dcc-8c72-82881564bf63-ovnkube-config\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890717 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/0add0fa2-092f-4dcc-8c72-82881564bf63-ovnkube-script-lib\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890754 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-cnibin\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890784 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-host-var-lib-cni-bin\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890816 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1c3ae9e-26ae-418f-b261-eabc4302b332-cni-binary-copy\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890835 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-host-run-k8s-cni-cncf-io\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890847 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-host-var-lib-cni-bin\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890855 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890819 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-cnibin\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890879 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-multus-conf-dir\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890898 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s29kh\" (UniqueName: \"kubernetes.io/projected/c1c3ae9e-26ae-418f-b261-eabc4302b332-kube-api-access-s29kh\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") 
" pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890915 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-node-log\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890914 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-host-run-k8s-cni-cncf-io\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890932 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-hostroot\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890962 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-hostroot\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890980 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-host-run-multus-certs\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.890981 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-multus-conf-dir\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891004 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-run-netns\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891023 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-host-run-multus-certs\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891097 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-run-systemd\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891149 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/02f8684c-a3e4-44e8-9741-9f54488d8d8d-mcd-auth-proxy-config\") pod \"machine-config-daemon-mk4tf\" (UID: \"02f8684c-a3e4-44e8-9741-9f54488d8d8d\") " pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891185 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4m7n\" (UniqueName: \"kubernetes.io/projected/d928520f-ca1d-4cca-b966-c1e6c9168db0-kube-api-access-l4m7n\") pod \"multus-additional-cni-plugins-vwttt\" (UID: \"d928520f-ca1d-4cca-b966-c1e6c9168db0\") " pod="openshift-multus/multus-additional-cni-plugins-vwttt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891205 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d928520f-ca1d-4cca-b966-c1e6c9168db0-cni-binary-copy\") pod \"multus-additional-cni-plugins-vwttt\" (UID: \"d928520f-ca1d-4cca-b966-c1e6c9168db0\") " pod="openshift-multus/multus-additional-cni-plugins-vwttt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891225 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0add0fa2-092f-4dcc-8c72-82881564bf63-env-overrides\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891262 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c1c3ae9e-26ae-418f-b261-eabc4302b332-multus-daemon-config\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891265 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d928520f-ca1d-4cca-b966-c1e6c9168db0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vwttt\" (UID: \"d928520f-ca1d-4cca-b966-c1e6c9168db0\") " pod="openshift-multus/multus-additional-cni-plugins-vwttt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891301 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-host-run-netns\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891335 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d928520f-ca1d-4cca-b966-c1e6c9168db0-os-release\") pod \"multus-additional-cni-plugins-vwttt\" (UID: \"d928520f-ca1d-4cca-b966-c1e6c9168db0\") " pod="openshift-multus/multus-additional-cni-plugins-vwttt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891373 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-multus-cni-dir\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891384 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-host-run-netns\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891405 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/02f8684c-a3e4-44e8-9741-9f54488d8d8d-rootfs\") pod \"machine-config-daemon-mk4tf\" (UID: \"02f8684c-a3e4-44e8-9741-9f54488d8d8d\") " pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891436 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqrgm\" (UniqueName: \"kubernetes.io/projected/02f8684c-a3e4-44e8-9741-9f54488d8d8d-kube-api-access-zqrgm\") pod \"machine-config-daemon-mk4tf\" (UID: \"02f8684c-a3e4-44e8-9741-9f54488d8d8d\") " pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891466 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/02f8684c-a3e4-44e8-9741-9f54488d8d8d-rootfs\") pod \"machine-config-daemon-mk4tf\" (UID: \"02f8684c-a3e4-44e8-9741-9f54488d8d8d\") " pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891470 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-run-openvswitch\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891506 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d928520f-ca1d-4cca-b966-c1e6c9168db0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vwttt\" (UID: \"d928520f-ca1d-4cca-b966-c1e6c9168db0\") " pod="openshift-multus/multus-additional-cni-plugins-vwttt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891528 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-multus-cni-dir\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891538 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-system-cni-dir\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891444 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d928520f-ca1d-4cca-b966-c1e6c9168db0-os-release\") pod \"multus-additional-cni-plugins-vwttt\" (UID: \"d928520f-ca1d-4cca-b966-c1e6c9168db0\") " pod="openshift-multus/multus-additional-cni-plugins-vwttt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891571 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/c1c3ae9e-26ae-418f-b261-eabc4302b332-cni-binary-copy\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891573 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-cni-bin\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891613 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-cni-netd\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891629 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7hpj\" (UniqueName: \"kubernetes.io/projected/0add0fa2-092f-4dcc-8c72-82881564bf63-kube-api-access-g7hpj\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891639 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-system-cni-dir\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891650 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d928520f-ca1d-4cca-b966-c1e6c9168db0-cnibin\") pod \"multus-additional-cni-plugins-vwttt\" (UID: \"d928520f-ca1d-4cca-b966-c1e6c9168db0\") " pod="openshift-multus/multus-additional-cni-plugins-vwttt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891666 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-host-var-lib-cni-multus\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891780 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c1c3ae9e-26ae-418f-b261-eabc4302b332-host-var-lib-cni-multus\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891784 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/02f8684c-a3e4-44e8-9741-9f54488d8d8d-mcd-auth-proxy-config\") pod \"machine-config-daemon-mk4tf\" (UID: \"02f8684c-a3e4-44e8-9741-9f54488d8d8d\") " pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.891804 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/d928520f-ca1d-4cca-b966-c1e6c9168db0-cnibin\") pod \"multus-additional-cni-plugins-vwttt\" (UID: \"d928520f-ca1d-4cca-b966-c1e6c9168db0\") " pod="openshift-multus/multus-additional-cni-plugins-vwttt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.892011 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d928520f-ca1d-4cca-b966-c1e6c9168db0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vwttt\" (UID: \"d928520f-ca1d-4cca-b966-c1e6c9168db0\") " pod="openshift-multus/multus-additional-cni-plugins-vwttt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.892072 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d928520f-ca1d-4cca-b966-c1e6c9168db0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vwttt\" (UID: \"d928520f-ca1d-4cca-b966-c1e6c9168db0\") " pod="openshift-multus/multus-additional-cni-plugins-vwttt" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.907213 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.908209 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s29kh\" (UniqueName: \"kubernetes.io/projected/c1c3ae9e-26ae-418f-b261-eabc4302b332-kube-api-access-s29kh\") pod \"multus-tq29s\" (UID: \"c1c3ae9e-26ae-418f-b261-eabc4302b332\") " pod="openshift-multus/multus-tq29s" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.914345 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4m7n\" (UniqueName: \"kubernetes.io/projected/d928520f-ca1d-4cca-b966-c1e6c9168db0-kube-api-access-l4m7n\") pod \"multus-additional-cni-plugins-vwttt\" (UID: \"d928520f-ca1d-4cca-b966-c1e6c9168db0\") " pod="openshift-multus/multus-additional-cni-plugins-vwttt" Oct 12 05:41:29 crc 
kubenswrapper[4930]: I1012 05:41:29.939160 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:29 crc kubenswrapper[4930]: I1012 05:41:29.960933 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026126 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-tq29s" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026203 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154e
dc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\
\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\
\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:29Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026323 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-run-openvswitch\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026355 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-cni-bin\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026373 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-cni-netd\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026393 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7hpj\" (UniqueName: \"kubernetes.io/projected/0add0fa2-092f-4dcc-8c72-82881564bf63-kube-api-access-g7hpj\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026411 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0add0fa2-092f-4dcc-8c72-82881564bf63-ovn-node-metrics-cert\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026427 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-run-ovn\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 
crc kubenswrapper[4930]: I1012 05:41:30.026458 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-cni-bin\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026476 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-systemd-units\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026493 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-var-lib-openvswitch\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026507 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-etc-openvswitch\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026521 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-slash\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026537 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-log-socket\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026551 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-cni-netd\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026564 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-kubelet\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026493 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-run-openvswitch\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026582 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-run-ovn-kubernetes\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026597 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0add0fa2-092f-4dcc-8c72-82881564bf63-ovnkube-config\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026604 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-run-ovn\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026611 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0add0fa2-092f-4dcc-8c72-82881564bf63-ovnkube-script-lib\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026628 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026644 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-var-lib-openvswitch\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026659 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-node-log\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026676 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-kubelet\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026676 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-run-netns\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026697 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-run-netns\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026713 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-run-systemd\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026723 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-run-ovn-kubernetes\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026746 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0add0fa2-092f-4dcc-8c72-82881564bf63-env-overrides\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026946 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.027309 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0add0fa2-092f-4dcc-8c72-82881564bf63-ovnkube-config\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.027309 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0add0fa2-092f-4dcc-8c72-82881564bf63-env-overrides\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.027334 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-etc-openvswitch\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.026629 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-systemd-units\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.027364 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-slash\") pod 
\"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.027367 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-node-log\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.027396 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-log-socket\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.027418 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-run-systemd\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.027442 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0add0fa2-092f-4dcc-8c72-82881564bf63-ovnkube-script-lib\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.030178 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0add0fa2-092f-4dcc-8c72-82881564bf63-ovn-node-metrics-cert\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.037943 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vwttt" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.057304 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7hpj\" (UniqueName: \"kubernetes.io/projected/0add0fa2-092f-4dcc-8c72-82881564bf63-kube-api-access-g7hpj\") pod \"ovnkube-node-mdhw6\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.085319 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c
5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:30Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.123765 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:30Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.135603 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:30 crc kubenswrapper[4930]: E1012 05:41:30.135821 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.135864 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.135906 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:41:30 crc kubenswrapper[4930]: E1012 05:41:30.136002 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:41:30 crc kubenswrapper[4930]: E1012 05:41:30.136077 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.139650 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.140081 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:30Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.140175 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.141431 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.142030 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.143613 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.144488 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.145136 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.146146 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.146842 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.147791 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.148317 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.149374 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.149991 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.150516 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.151412 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.151988 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.153422 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.153818 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.156804 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.157371 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.157816 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.158834 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.159266 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.159940 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:30Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.160416 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.160861 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.162023 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.162712 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.163185 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.164122 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.164627 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.165455 4930 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.165553 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.167405 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.173895 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.174548 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.174840 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:30Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.177067 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.180325 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.180917 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.181551 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.182715 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.183172 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.184128 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.187847 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.188458 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.189259 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.189791 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.190624 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.191514 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.194336 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.194837 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.195810 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.196378 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.196938 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.199673 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:30Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.201233 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.217680 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:30Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.229952 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:30Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.262540 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:30Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.271010 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tq29s" event={"ID":"c1c3ae9e-26ae-418f-b261-eabc4302b332","Type":"ContainerStarted","Data":"cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584"} Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.271076 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tq29s" event={"ID":"c1c3ae9e-26ae-418f-b261-eabc4302b332","Type":"ContainerStarted","Data":"8310b5f5187c9316353496a36f6092d92c9af8009f21360a70d4d1f754dc5bce"} Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.273957 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-br2vl" 
event={"ID":"545bf51c-0b04-4166-a984-ec9c1276470a","Type":"ContainerStarted","Data":"e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f"} Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.273983 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-br2vl" event={"ID":"545bf51c-0b04-4166-a984-ec9c1276470a","Type":"ContainerStarted","Data":"7f0176e4e5ba011654dfa340f47ebe527accb86e89090edf9bacea42d2e9a761"} Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.275835 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" event={"ID":"d928520f-ca1d-4cca-b966-c1e6c9168db0","Type":"ContainerStarted","Data":"90e05dfe4ad3597a5a75475eaada007fb15ab0a3e3971c273a798392db14b6ce"} Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.277181 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:30Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.291673 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:30Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.304249 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:30Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.320258 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:30Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.338234 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:30Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.341192 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:30 crc kubenswrapper[4930]: W1012 05:41:30.354614 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0add0fa2_092f_4dcc_8c72_82881564bf63.slice/crio-52edd9d2d5d32cd79f2107a4cbd70e8cfbbf8f5b8fe5c2fc1e883106efee0254 WatchSource:0}: Error finding container 52edd9d2d5d32cd79f2107a4cbd70e8cfbbf8f5b8fe5c2fc1e883106efee0254: Status 404 returned error can't find the container with id 52edd9d2d5d32cd79f2107a4cbd70e8cfbbf8f5b8fe5c2fc1e883106efee0254 Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.363143 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:30Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.383278 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:30Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.400707 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:30Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.423166 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:30Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.436856 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:30Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.451373 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:30Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.478029 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943
ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:30Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.496628 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:30Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.527218 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:30Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:30 crc kubenswrapper[4930]: E1012 05:41:30.891018 4930 secret.go:188] Couldn't get secret openshift-machine-config-operator/proxy-tls: failed to sync secret cache: timed out waiting for the condition Oct 12 05:41:30 crc kubenswrapper[4930]: E1012 05:41:30.891177 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02f8684c-a3e4-44e8-9741-9f54488d8d8d-proxy-tls podName:02f8684c-a3e4-44e8-9741-9f54488d8d8d nodeName:}" failed. No retries permitted until 2025-10-12 05:41:31.391139165 +0000 UTC m=+23.933240930 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/02f8684c-a3e4-44e8-9741-9f54488d8d8d-proxy-tls") pod "machine-config-daemon-mk4tf" (UID: "02f8684c-a3e4-44e8-9741-9f54488d8d8d") : failed to sync secret cache: timed out waiting for the condition Oct 12 05:41:30 crc kubenswrapper[4930]: I1012 05:41:30.903687 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.116416 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.132105 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqrgm\" (UniqueName: \"kubernetes.io/projected/02f8684c-a3e4-44e8-9741-9f54488d8d8d-kube-api-access-zqrgm\") pod \"machine-config-daemon-mk4tf\" (UID: \"02f8684c-a3e4-44e8-9741-9f54488d8d8d\") " pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.236151 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-jd2dw"] Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.236699 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-jd2dw" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.243287 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.243702 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.243960 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/835a1f98-4ae1-499b-b08c-a87dbcf8eaf9-serviceca\") pod \"node-ca-jd2dw\" (UID: \"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\") " pod="openshift-image-registry/node-ca-jd2dw" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.244046 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/835a1f98-4ae1-499b-b08c-a87dbcf8eaf9-host\") pod \"node-ca-jd2dw\" (UID: \"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\") " pod="openshift-image-registry/node-ca-jd2dw" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.243307 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.243663 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.244260 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvkzz\" (UniqueName: \"kubernetes.io/projected/835a1f98-4ae1-499b-b08c-a87dbcf8eaf9-kube-api-access-lvkzz\") pod \"node-ca-jd2dw\" (UID: \"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\") " pod="openshift-image-registry/node-ca-jd2dw" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.261910 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.276096 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.280544 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca"} Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.282807 4930 generic.go:334] "Generic (PLEG): container finished" podID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerID="5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb" exitCode=0 Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.282916 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerDied","Data":"5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb"} Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.282954 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerStarted","Data":"52edd9d2d5d32cd79f2107a4cbd70e8cfbbf8f5b8fe5c2fc1e883106efee0254"} Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.286037 4930 generic.go:334] "Generic (PLEG): container finished" podID="d928520f-ca1d-4cca-b966-c1e6c9168db0" 
containerID="271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8" exitCode=0 Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.286091 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" event={"ID":"d928520f-ca1d-4cca-b966-c1e6c9168db0","Type":"ContainerDied","Data":"271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8"} Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.297545 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.303319 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.312823 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"
name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.331614 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.345333 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/835a1f98-4ae1-499b-b08c-a87dbcf8eaf9-serviceca\") pod \"node-ca-jd2dw\" (UID: \"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\") " pod="openshift-image-registry/node-ca-jd2dw" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.345416 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/835a1f98-4ae1-499b-b08c-a87dbcf8eaf9-host\") pod \"node-ca-jd2dw\" (UID: \"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\") " pod="openshift-image-registry/node-ca-jd2dw" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.345506 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvkzz\" (UniqueName: \"kubernetes.io/projected/835a1f98-4ae1-499b-b08c-a87dbcf8eaf9-kube-api-access-lvkzz\") pod \"node-ca-jd2dw\" (UID: \"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\") " pod="openshift-image-registry/node-ca-jd2dw" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.346999 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/835a1f98-4ae1-499b-b08c-a87dbcf8eaf9-host\") pod \"node-ca-jd2dw\" (UID: 
\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\") " pod="openshift-image-registry/node-ca-jd2dw" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.348862 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/835a1f98-4ae1-499b-b08c-a87dbcf8eaf9-serviceca\") pod \"node-ca-jd2dw\" (UID: \"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\") " pod="openshift-image-registry/node-ca-jd2dw" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.355670 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036c
c2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.372987 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvkzz\" (UniqueName: \"kubernetes.io/projected/835a1f98-4ae1-499b-b08c-a87dbcf8eaf9-kube-api-access-lvkzz\") pod \"node-ca-jd2dw\" (UID: \"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\") " pod="openshift-image-registry/node-ca-jd2dw" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.374981 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.403330 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\
":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.421627 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 
05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.438540 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.446315 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02f8684c-a3e4-44e8-9741-9f54488d8d8d-proxy-tls\") pod \"machine-config-daemon-mk4tf\" (UID: \"02f8684c-a3e4-44e8-9741-9f54488d8d8d\") " pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.449425 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.452884 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02f8684c-a3e4-44e8-9741-9f54488d8d8d-proxy-tls\") pod \"machine-config-daemon-mk4tf\" (UID: \"02f8684c-a3e4-44e8-9741-9f54488d8d8d\") " pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.463170 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.473455 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.483598 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.508945 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943
ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.527874 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.527982 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc 
kubenswrapper[4930]: W1012 05:41:31.553833 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02f8684c_a3e4_44e8_9741_9f54488d8d8d.slice/crio-9bbe7cc4d509ad69afca86d95baed3067e9557a5daaee8a317ef16a4235567d5 WatchSource:0}: Error finding container 9bbe7cc4d509ad69afca86d95baed3067e9557a5daaee8a317ef16a4235567d5: Status 404 returned error can't find the container with id 9bbe7cc4d509ad69afca86d95baed3067e9557a5daaee8a317ef16a4235567d5 Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.554225 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z 
is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.557863 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jd2dw" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.565853 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: W1012 05:41:31.583286 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod835a1f98_4ae1_499b_b08c_a87dbcf8eaf9.slice/crio-4563c0c3ae8c2188924206e860cb2ed486f7f0fe9a9ad5a43adce3b4bb94f9fa WatchSource:0}: Error finding container 4563c0c3ae8c2188924206e860cb2ed486f7f0fe9a9ad5a43adce3b4bb94f9fa: Status 404 returned error can't find the container with id 4563c0c3ae8c2188924206e860cb2ed486f7f0fe9a9ad5a43adce3b4bb94f9fa Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.585047 4930 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.603953 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.620409 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.640448 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.660003 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.672330 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.687764 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.698218 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.735702 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.780148 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:31Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.849807 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:41:31 crc kubenswrapper[4930]: E1012 05:41:31.850035 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:41:35.849997949 +0000 UTC m=+28.392099714 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.850102 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.850155 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.850196 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:41:31 crc kubenswrapper[4930]: I1012 05:41:31.850216 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:31 crc kubenswrapper[4930]: E1012 05:41:31.850253 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 05:41:31 crc kubenswrapper[4930]: E1012 05:41:31.850274 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 05:41:31 crc kubenswrapper[4930]: E1012 05:41:31.850286 4930 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:41:31 crc kubenswrapper[4930]: E1012 05:41:31.850338 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-12 05:41:35.850322626 +0000 UTC m=+28.392424391 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:41:31 crc kubenswrapper[4930]: E1012 05:41:31.850423 4930 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 05:41:31 crc kubenswrapper[4930]: E1012 05:41:31.850452 4930 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 05:41:31 crc kubenswrapper[4930]: E1012 05:41:31.850500 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 05:41:35.850492269 +0000 UTC m=+28.392594024 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 05:41:31 crc kubenswrapper[4930]: E1012 05:41:31.850517 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 05:41:35.85050868 +0000 UTC m=+28.392610445 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 05:41:31 crc kubenswrapper[4930]: E1012 05:41:31.850575 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 05:41:31 crc kubenswrapper[4930]: E1012 05:41:31.850620 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 05:41:31 crc kubenswrapper[4930]: E1012 05:41:31.850635 4930 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:41:31 crc kubenswrapper[4930]: E1012 05:41:31.850714 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-10-12 05:41:35.850691313 +0000 UTC m=+28.392793078 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.135315 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:41:32 crc kubenswrapper[4930]: E1012 05:41:32.136040 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.135361 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:41:32 crc kubenswrapper[4930]: E1012 05:41:32.136152 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.135337 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:32 crc kubenswrapper[4930]: E1012 05:41:32.136237 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.300037 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerStarted","Data":"e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec"} Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.300113 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerStarted","Data":"2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701"} Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.300131 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerStarted","Data":"9bbe7cc4d509ad69afca86d95baed3067e9557a5daaee8a317ef16a4235567d5"} Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.306077 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerStarted","Data":"a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235"} Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.306137 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerStarted","Data":"1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914"} Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.306151 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerStarted","Data":"48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3"} Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.306164 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerStarted","Data":"7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164"} Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.306176 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerStarted","Data":"b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a"} Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.306193 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerStarted","Data":"c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef"} Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.307642 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jd2dw" event={"ID":"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9","Type":"ContainerStarted","Data":"a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd"} Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.307676 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jd2dw" 
event={"ID":"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9","Type":"ContainerStarted","Data":"4563c0c3ae8c2188924206e860cb2ed486f7f0fe9a9ad5a43adce3b4bb94f9fa"} Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.310301 4930 generic.go:334] "Generic (PLEG): container finished" podID="d928520f-ca1d-4cca-b966-c1e6c9168db0" containerID="2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a" exitCode=0 Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.310355 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" event={"ID":"d928520f-ca1d-4cca-b966-c1e6c9168db0","Type":"ContainerDied","Data":"2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a"} Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.321868 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.337256 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.352173 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.364384 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.388142 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441e
cd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.404299 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc 
kubenswrapper[4930]: I1012 05:41:32.430701 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"Po
dInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.446446 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 
05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.461437 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.480359 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.493198 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.506487 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.519480 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.533214 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.546703 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.561176 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.575589 4930 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.590824 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.605049 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.624579 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943
ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.642262 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-
12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.665856 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z 
is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.695913 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.766363 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.792111 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.814959 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.851068 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:32 crc kubenswrapper[4930]: I1012 05:41:32.891768 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:32Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:33 crc kubenswrapper[4930]: I1012 05:41:33.317755 4930 generic.go:334] "Generic (PLEG): container finished" podID="d928520f-ca1d-4cca-b966-c1e6c9168db0" containerID="a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea" exitCode=0 Oct 12 05:41:33 crc kubenswrapper[4930]: I1012 05:41:33.317878 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" event={"ID":"d928520f-ca1d-4cca-b966-c1e6c9168db0","Type":"ContainerDied","Data":"a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea"} Oct 12 05:41:33 crc kubenswrapper[4930]: I1012 05:41:33.347838 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943
ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:33Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:33 crc kubenswrapper[4930]: I1012 05:41:33.366699 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:33Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:33 crc kubenswrapper[4930]: I1012 05:41:33.393043 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:33Z 
is after 2025-08-24T17:21:41Z" Oct 12 05:41:33 crc kubenswrapper[4930]: I1012 05:41:33.407590 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:33Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:33 crc kubenswrapper[4930]: I1012 05:41:33.425297 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:33Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:33 crc kubenswrapper[4930]: I1012 05:41:33.443562 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:33Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:33 crc kubenswrapper[4930]: I1012 05:41:33.460067 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed 
certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\
\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:33Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:33 crc kubenswrapper[4930]: I1012 05:41:33.476313 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:33Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:33 crc kubenswrapper[4930]: I1012 05:41:33.486780 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:33Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:33 crc kubenswrapper[4930]: I1012 05:41:33.501758 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:33Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:33 crc kubenswrapper[4930]: I1012 05:41:33.515594 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:33Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:33 crc kubenswrapper[4930]: I1012 05:41:33.527515 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:33Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:33 crc kubenswrapper[4930]: I1012 05:41:33.541720 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:33Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:33 crc kubenswrapper[4930]: I1012 05:41:33.557100 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:33Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.134864 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:41:34 crc kubenswrapper[4930]: E1012 05:41:34.135042 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.135509 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.135586 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:41:34 crc kubenswrapper[4930]: E1012 05:41:34.135700 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:41:34 crc kubenswrapper[4930]: E1012 05:41:34.135836 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.238713 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.264353 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.266891 4930 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.270858 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.270923 4930 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.270942 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.271042 4930 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.280884 4930 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.281281 4930 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.283137 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.283188 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.283206 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.283234 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.283254 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:34Z","lastTransitionTime":"2025-10-12T05:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.285587 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: E1012 05:41:34.300445 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.304732 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.304860 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.304883 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.304906 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.304925 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:34Z","lastTransitionTime":"2025-10-12T05:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.307600 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubel
et\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.323917 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: E1012 05:41:34.326048 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.327535 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerStarted","Data":"dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666"} Oct 12 05:41:34 
crc kubenswrapper[4930]: I1012 05:41:34.330327 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.330365 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.330377 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.330397 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.330412 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:34Z","lastTransitionTime":"2025-10-12T05:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.331980 4930 generic.go:334] "Generic (PLEG): container finished" podID="d928520f-ca1d-4cca-b966-c1e6c9168db0" containerID="9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470" exitCode=0 Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.332054 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" event={"ID":"d928520f-ca1d-4cca-b966-c1e6c9168db0","Type":"ContainerDied","Data":"9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470"} Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.359296 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943
ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: E1012 05:41:34.359216 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.364000 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.364038 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.364048 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.364071 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.364084 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:34Z","lastTransitionTime":"2025-10-12T05:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:34 crc kubenswrapper[4930]: E1012 05:41:34.382536 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.384659 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.389373 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.389430 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.389451 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.389478 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.389499 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:34Z","lastTransitionTime":"2025-10-12T05:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:34 crc kubenswrapper[4930]: E1012 05:41:34.416754 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: E1012 05:41:34.416913 4930 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.421756 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.421795 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.421808 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.421831 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.421848 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:34Z","lastTransitionTime":"2025-10-12T05:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.441914 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z 
is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.464920 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.488779 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.511434 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.521854 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.524579 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.524645 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.524657 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.524680 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.524692 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:34Z","lastTransitionTime":"2025-10-12T05:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.533055 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc 
kubenswrapper[4930]: I1012 05:41:34.548946 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.562329 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.573376 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.590535 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.603555 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.616900 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cn
i/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.628039 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.628088 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.628098 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.628129 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.628140 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:34Z","lastTransitionTime":"2025-10-12T05:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.630438 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.649919 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.664697 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.687386 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z 
is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.708396 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.726202 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.733382 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.733417 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.733429 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.733449 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.733461 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:34Z","lastTransitionTime":"2025-10-12T05:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.742839 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.759800 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.771691 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.785763 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.835803 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.835843 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.835853 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.835871 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.835882 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:34Z","lastTransitionTime":"2025-10-12T05:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.938389 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.938421 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.938430 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.938445 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.938455 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:34Z","lastTransitionTime":"2025-10-12T05:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.945836 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.952350 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.959068 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.970617 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:34 crc kubenswrapper[4930]: I1012 05:41:34.982590 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.001170 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:34Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.018882 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.042066 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.042440 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.042450 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.042467 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.042482 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:35Z","lastTransitionTime":"2025-10-12T05:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.059784 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.089336 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.118632 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z 
is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.132853 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.145283 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.145369 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.145394 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.145425 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.145444 4930 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:35Z","lastTransitionTime":"2025-10-12T05:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.150919 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.166597 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.187633 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.207035 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.226653 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.240721 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.247545 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.247592 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.247608 4930 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.247628 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.247641 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:35Z","lastTransitionTime":"2025-10-12T05:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.261762 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/
lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.279289 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-releas
e\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.320902 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z 
is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.338932 4930 generic.go:334] "Generic (PLEG): container finished" podID="d928520f-ca1d-4cca-b966-c1e6c9168db0" containerID="eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1" exitCode=0 Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.338984 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" event={"ID":"d928520f-ca1d-4cca-b966-c1e6c9168db0","Type":"ContainerDied","Data":"eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1"} Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.350064 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.350102 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.350114 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.350131 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.350146 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:35Z","lastTransitionTime":"2025-10-12T05:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.361359 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: E1012 05:41:35.375832 4930 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.414868 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.453594 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.453642 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.453654 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.453674 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.453689 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:35Z","lastTransitionTime":"2025-10-12T05:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.454225 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.496050 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.534928 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.556359 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.556400 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.556412 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.556431 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.556444 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:35Z","lastTransitionTime":"2025-10-12T05:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.575075 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.616995 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.657258 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.660050 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.660107 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.660119 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.660140 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.660177 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:35Z","lastTransitionTime":"2025-10-12T05:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.701058 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.734418 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.763512 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.763541 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.763550 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.763565 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.763574 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:35Z","lastTransitionTime":"2025-10-12T05:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.774706 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.818571 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.857356 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.868290 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.868344 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.868355 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.868375 4930 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.868635 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:35Z","lastTransitionTime":"2025-10-12T05:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.896234 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.896440 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.896495 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.896540 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.896575 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:35 crc kubenswrapper[4930]: E1012 05:41:35.896902 4930 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 05:41:35 crc kubenswrapper[4930]: E1012 05:41:35.896989 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 05:41:43.896962162 +0000 UTC m=+36.439063937 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 05:41:35 crc kubenswrapper[4930]: E1012 05:41:35.897085 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:41:43.897075894 +0000 UTC m=+36.439177679 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:41:35 crc kubenswrapper[4930]: E1012 05:41:35.897167 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 05:41:35 crc kubenswrapper[4930]: E1012 05:41:35.897183 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 05:41:35 crc kubenswrapper[4930]: E1012 05:41:35.897202 4930 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:41:35 crc kubenswrapper[4930]: E1012 05:41:35.897244 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-12 05:41:43.897231877 +0000 UTC m=+36.439333662 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:41:35 crc kubenswrapper[4930]: E1012 05:41:35.897300 4930 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 05:41:35 crc kubenswrapper[4930]: E1012 05:41:35.897334 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 05:41:43.897325959 +0000 UTC m=+36.439427744 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 05:41:35 crc kubenswrapper[4930]: E1012 05:41:35.897398 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 05:41:35 crc kubenswrapper[4930]: E1012 05:41:35.897412 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 05:41:35 crc kubenswrapper[4930]: E1012 05:41:35.897423 4930 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:41:35 crc kubenswrapper[4930]: E1012 05:41:35.897449 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-12 05:41:43.897440951 +0000 UTC m=+36.439542736 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.902651 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.945533 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.972514 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.972596 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.972618 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.972655 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.972678 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:35Z","lastTransitionTime":"2025-10-12T05:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:35 crc kubenswrapper[4930]: I1012 05:41:35.980989 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:35Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.033359 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.067638 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.077181 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.077245 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.077264 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.077295 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.077315 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:36Z","lastTransitionTime":"2025-10-12T05:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.114339 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff2
08168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.135287 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.135454 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.135828 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:41:36 crc kubenswrapper[4930]: E1012 05:41:36.135824 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:41:36 crc kubenswrapper[4930]: E1012 05:41:36.136139 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:41:36 crc kubenswrapper[4930]: E1012 05:41:36.136234 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.144231 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.181604 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.181996 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.182083 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.182103 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.182158 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.182179 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:36Z","lastTransitionTime":"2025-10-12T05:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.220187 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.262485 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.285601 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.285666 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.285680 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.285703 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.285719 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:36Z","lastTransitionTime":"2025-10-12T05:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.297771 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.336403 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.360257 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" event={"ID":"d928520f-ca1d-4cca-b966-c1e6c9168db0","Type":"ContainerDied","Data":"5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf"} Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.360175 4930 generic.go:334] "Generic (PLEG): container finished" podID="d928520f-ca1d-4cca-b966-c1e6c9168db0" containerID="5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf" exitCode=0 Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.394783 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.397424 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.397491 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.397514 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.397545 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.397566 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:36Z","lastTransitionTime":"2025-10-12T05:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.423785 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.463998 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.498516 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.500423 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.500486 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.500499 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.500521 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.500534 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:36Z","lastTransitionTime":"2025-10-12T05:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.538152 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.576371 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.603682 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.603715 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.603725 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.603772 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.603784 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:36Z","lastTransitionTime":"2025-10-12T05:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.617032 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.654424 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.697073 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.707198 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.707254 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.707270 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.707292 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.707308 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:36Z","lastTransitionTime":"2025-10-12T05:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.738398 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.779411 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.810725 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.810829 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.810873 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.810900 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.810914 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:36Z","lastTransitionTime":"2025-10-12T05:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.814059 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.856841 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.896516 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.919028 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.919097 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.919117 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.919145 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.919165 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:36Z","lastTransitionTime":"2025-10-12T05:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.957884 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd
/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\
"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:36 crc kubenswrapper[4930]: I1012 05:41:36.979341 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.020664 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:37Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.022299 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.022368 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.022388 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.022417 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.022439 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:37Z","lastTransitionTime":"2025-10-12T05:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.126604 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.126672 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.126691 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.126717 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.126732 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:37Z","lastTransitionTime":"2025-10-12T05:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.230002 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.230057 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.230071 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.230093 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.230110 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:37Z","lastTransitionTime":"2025-10-12T05:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.334276 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.334355 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.334380 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.334421 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.334447 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:37Z","lastTransitionTime":"2025-10-12T05:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.371231 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerStarted","Data":"6016f66ca9ca63c708c040eb6f9cdd047860fc43bc2877294360b3ea8b0a3289"} Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.371620 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.376144 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" event={"ID":"d928520f-ca1d-4cca-b966-c1e6c9168db0","Type":"ContainerStarted","Data":"083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf"} Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.448876 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:37Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.455110 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.455160 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.455172 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.455190 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.455201 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:37Z","lastTransitionTime":"2025-10-12T05:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.475560 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:37Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.478427 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.488130 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:37Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.500966 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:37Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.521140 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:37Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.539600 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:37Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.553604 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:37Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.557856 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.557911 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.557930 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.557955 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.557970 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:37Z","lastTransitionTime":"2025-10-12T05:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.569789 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:37Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.585114 4930 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:37Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.600559 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:37Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.621520 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:37Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.640611 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:37Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.660656 4930 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.660708 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.660718 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.660758 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.660772 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:37Z","lastTransitionTime":"2025-10-12T05:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.671653 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6016f66ca9ca63c708c040eb6f9cdd047860fc43
bc2877294360b3ea8b0a3289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:37Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.699772 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943
ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:37Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.720538 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:37Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.738981 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:37Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.755212 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:37Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.763877 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.763943 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.763963 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.763994 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.764013 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:37Z","lastTransitionTime":"2025-10-12T05:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.773679 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:37Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.797186 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:37Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.815933 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:37Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.861201 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:37Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.867097 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.867170 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.867185 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.867206 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.867218 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:37Z","lastTransitionTime":"2025-10-12T05:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.896471 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:37Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.937271 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:37Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.970680 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.970727 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.970777 4930 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.970800 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.970812 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:37Z","lastTransitionTime":"2025-10-12T05:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:37 crc kubenswrapper[4930]: I1012 05:41:37.976222 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:37Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.022241 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.056054 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.073905 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.073970 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.073988 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.074017 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.074037 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:38Z","lastTransitionTime":"2025-10-12T05:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.102443 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.135195 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.135233 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.135198 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:41:38 crc kubenswrapper[4930]: E1012 05:41:38.135444 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:41:38 crc kubenswrapper[4930]: E1012 05:41:38.135604 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:41:38 crc kubenswrapper[4930]: E1012 05:41:38.135702 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.157303 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.177805 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.177865 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.177880 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.177901 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.177916 4930 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:38Z","lastTransitionTime":"2025-10-12T05:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.180649 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.233877 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6016f66ca9ca63c708c040eb6f9cdd047860fc43bc2877294360b3ea8b0a3289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"
mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.264668 4930 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\
\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.282943 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.283035 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.283060 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.283093 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.283127 4930 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:38Z","lastTransitionTime":"2025-10-12T05:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.303808 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.339505 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.378471 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.383809 4930 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.385231 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.387188 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.387995 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.388032 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.388058 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.388077 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:38Z","lastTransitionTime":"2025-10-12T05:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.418306 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.422371 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.453081 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.493064 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.493120 4930 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.493137 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.493164 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.493182 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:38Z","lastTransitionTime":"2025-10-12T05:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.502023 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.539774 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.581530 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.596775 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.596830 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.596841 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.596860 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.596873 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:38Z","lastTransitionTime":"2025-10-12T05:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.616046 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.660877 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.697174 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.699344 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.699394 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.699412 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.699437 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.699455 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:38Z","lastTransitionTime":"2025-10-12T05:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.755718 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd
/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\
"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.785029 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.802571 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.802634 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:38 crc 
kubenswrapper[4930]: I1012 05:41:38.802647 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.802667 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.802680 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:38Z","lastTransitionTime":"2025-10-12T05:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.832044 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6016f66ca9ca63c708c040eb6f9cdd047860fc43
bc2877294360b3ea8b0a3289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.861297 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.905588 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.905647 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.905664 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.905688 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.905706 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:38Z","lastTransitionTime":"2025-10-12T05:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.906414 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.939768 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:38 crc kubenswrapper[4930]: I1012 05:41:38.972675 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.008832 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.008933 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.008962 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.008994 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.009020 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:39Z","lastTransitionTime":"2025-10-12T05:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.034729 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd
/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\
"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:39Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.062918 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:39Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.112797 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.112859 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:39 crc 
kubenswrapper[4930]: I1012 05:41:39.112878 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.112907 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.112928 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:39Z","lastTransitionTime":"2025-10-12T05:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.112476 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6016f66ca9ca63c708c040eb6f9cdd047860fc43
bc2877294360b3ea8b0a3289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:39Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.136324 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:39Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.181027 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:39Z is after 2025-08-24T17:21:41Z"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.216609 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.216663 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.216674 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.216697 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.216709 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:39Z","lastTransitionTime":"2025-10-12T05:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.219808 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:39Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.257785 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:39Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.291940 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:39Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.322018 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.322081 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.322102 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.322127 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.322145 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:39Z","lastTransitionTime":"2025-10-12T05:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.338073 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:39Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.383514 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:39Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.387406 4930 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.420986 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:39Z is after 2025-08-24T17:21:41Z"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.426455 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.426523 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.426543 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.426567 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.426580 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:39Z","lastTransitionTime":"2025-10-12T05:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.531208 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.531266 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.531285 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.531309 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.531328 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:39Z","lastTransitionTime":"2025-10-12T05:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.635014 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.635092 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.635111 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.635141 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.635159 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:39Z","lastTransitionTime":"2025-10-12T05:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.738675 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.738780 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.738805 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.738833 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.738850 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:39Z","lastTransitionTime":"2025-10-12T05:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.841861 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.841937 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.841963 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.841990 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.842007 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:39Z","lastTransitionTime":"2025-10-12T05:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.945155 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.945232 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.945252 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.945281 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:41:39 crc kubenswrapper[4930]: I1012 05:41:39.945301 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:39Z","lastTransitionTime":"2025-10-12T05:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.048807 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.048855 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.048865 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.048882 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.048892 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:40Z","lastTransitionTime":"2025-10-12T05:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.134885 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.135023 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 12 05:41:40 crc kubenswrapper[4930]: E1012 05:41:40.135127 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.135284 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 12 05:41:40 crc kubenswrapper[4930]: E1012 05:41:40.135497 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 12 05:41:40 crc kubenswrapper[4930]: E1012 05:41:40.135817 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.156696 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.156820 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.156900 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.156939 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.156968 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:40Z","lastTransitionTime":"2025-10-12T05:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.260612 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.260700 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.260719 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.260785 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.260806 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:40Z","lastTransitionTime":"2025-10-12T05:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.364880 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.364962 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.364983 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.365012 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.365032 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:40Z","lastTransitionTime":"2025-10-12T05:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.392870 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mdhw6_0add0fa2-092f-4dcc-8c72-82881564bf63/ovnkube-controller/0.log" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.395551 4930 generic.go:334] "Generic (PLEG): container finished" podID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerID="6016f66ca9ca63c708c040eb6f9cdd047860fc43bc2877294360b3ea8b0a3289" exitCode=1 Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.395621 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerDied","Data":"6016f66ca9ca63c708c040eb6f9cdd047860fc43bc2877294360b3ea8b0a3289"} Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.396966 4930 scope.go:117] "RemoveContainer" containerID="6016f66ca9ca63c708c040eb6f9cdd047860fc43bc2877294360b3ea8b0a3289" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.422999 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:40Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.442682 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:40Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.463136 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:40Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.467928 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.467963 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.467972 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.467986 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.467996 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:40Z","lastTransitionTime":"2025-10-12T05:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.479033 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:40Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.502470 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:40Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.523139 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:40Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.547846 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:40Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.566477 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:40Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.570360 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.570390 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:40 crc 
kubenswrapper[4930]: I1012 05:41:40.570405 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.570423 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.570435 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:40Z","lastTransitionTime":"2025-10-12T05:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.596032 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6016f66ca9ca63c708c040eb6f9cdd047860fc43
bc2877294360b3ea8b0a3289\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6016f66ca9ca63c708c040eb6f9cdd047860fc43bc2877294360b3ea8b0a3289\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:41:39Z\\\",\\\"message\\\":\\\"ce event handler 5 for removal\\\\nI1012 05:41:39.557967 6216 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1012 05:41:39.557981 6216 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1012 05:41:39.557989 6216 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 05:41:39.558149 6216 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 05:41:39.559241 6216 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1012 05:41:39.559356 6216 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1012 05:41:39.559823 6216 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1012 05:41:39.559854 6216 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1012 05:41:39.559907 6216 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1012 05:41:39.559942 6216 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1012 05:41:39.559987 6216 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1012 05:41:39.560008 6216 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1012 05:41:39.560021 6216 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1012 05:41:39.561837 6216 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:40Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.610082 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:40Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.622031 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:40Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.635668 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:40Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.648755 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:40Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.668980 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:40Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.672904 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.672932 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.672944 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.672978 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.672990 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:40Z","lastTransitionTime":"2025-10-12T05:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.687006 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:40Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.776305 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.776371 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.776388 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.776411 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.776436 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:40Z","lastTransitionTime":"2025-10-12T05:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.879294 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.879371 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.879389 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.879417 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.879435 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:40Z","lastTransitionTime":"2025-10-12T05:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.982288 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.982342 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.982355 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.982376 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:40 crc kubenswrapper[4930]: I1012 05:41:40.982393 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:40Z","lastTransitionTime":"2025-10-12T05:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.085502 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.085592 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.085623 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.085657 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.085686 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:41Z","lastTransitionTime":"2025-10-12T05:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.189839 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.189910 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.189932 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.189961 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.189983 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:41Z","lastTransitionTime":"2025-10-12T05:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.292325 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.292395 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.292418 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.292450 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.292478 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:41Z","lastTransitionTime":"2025-10-12T05:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.396195 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.396264 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.396308 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.396338 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.396362 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:41Z","lastTransitionTime":"2025-10-12T05:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.410064 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mdhw6_0add0fa2-092f-4dcc-8c72-82881564bf63/ovnkube-controller/0.log" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.415028 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerStarted","Data":"264bb5418af04a677f569add84f3eb35c3760f20db19a03595723d261202e317"} Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.415236 4930 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.440426 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc
167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:41Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.461015 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:41Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.479612 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:41Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.501420 4930 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.501478 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.501491 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.501521 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.501538 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:41Z","lastTransitionTime":"2025-10-12T05:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.501580 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:41Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.521096 4930 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:41Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.542707 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:41Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.574626 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:41Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.590886 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:41Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.604504 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.604587 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:41 crc 
kubenswrapper[4930]: I1012 05:41:41.604614 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.604648 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.604673 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:41Z","lastTransitionTime":"2025-10-12T05:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.622843 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://264bb5418af04a677f569add84f3eb35c3760f20
db19a03595723d261202e317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6016f66ca9ca63c708c040eb6f9cdd047860fc43bc2877294360b3ea8b0a3289\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:41:39Z\\\",\\\"message\\\":\\\"ce event handler 5 for removal\\\\nI1012 05:41:39.557967 6216 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1012 05:41:39.557981 6216 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1012 05:41:39.557989 6216 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 05:41:39.558149 6216 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 05:41:39.559241 6216 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1012 05:41:39.559356 6216 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1012 05:41:39.559823 6216 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1012 05:41:39.559854 6216 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1012 05:41:39.559907 6216 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1012 05:41:39.559942 6216 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1012 05:41:39.559987 6216 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1012 05:41:39.560008 6216 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1012 05:41:39.560021 6216 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1012 05:41:39.561837 6216 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:41Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.641789 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:41Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.662423 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:41Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.677525 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:41Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.693281 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:41Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.707361 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.707396 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.707410 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.707432 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.707446 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:41Z","lastTransitionTime":"2025-10-12T05:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.708050 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:41Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.723500 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:41Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.810794 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.810938 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.811007 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.811043 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.811102 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:41Z","lastTransitionTime":"2025-10-12T05:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.915441 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.915528 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.915553 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.915587 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:41 crc kubenswrapper[4930]: I1012 05:41:41.915612 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:41Z","lastTransitionTime":"2025-10-12T05:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.019056 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.019101 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.019111 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.019125 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.019136 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:42Z","lastTransitionTime":"2025-10-12T05:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.122425 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.122490 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.122507 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.122533 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.122556 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:42Z","lastTransitionTime":"2025-10-12T05:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.135136 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.135153 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:42 crc kubenswrapper[4930]: E1012 05:41:42.135354 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:41:42 crc kubenswrapper[4930]: E1012 05:41:42.135475 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.135516 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:41:42 crc kubenswrapper[4930]: E1012 05:41:42.135666 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.155804 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.225977 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.226057 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.226076 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.226102 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.226122 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:42Z","lastTransitionTime":"2025-10-12T05:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.329244 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.329388 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.329405 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.329429 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.329446 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:42Z","lastTransitionTime":"2025-10-12T05:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.421913 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mdhw6_0add0fa2-092f-4dcc-8c72-82881564bf63/ovnkube-controller/1.log" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.422728 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mdhw6_0add0fa2-092f-4dcc-8c72-82881564bf63/ovnkube-controller/0.log" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.429805 4930 generic.go:334] "Generic (PLEG): container finished" podID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerID="264bb5418af04a677f569add84f3eb35c3760f20db19a03595723d261202e317" exitCode=1 Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.429878 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerDied","Data":"264bb5418af04a677f569add84f3eb35c3760f20db19a03595723d261202e317"} Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.429965 4930 scope.go:117] "RemoveContainer" containerID="6016f66ca9ca63c708c040eb6f9cdd047860fc43bc2877294360b3ea8b0a3289" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.433267 4930 scope.go:117] "RemoveContainer" containerID="264bb5418af04a677f569add84f3eb35c3760f20db19a03595723d261202e317" Oct 12 05:41:42 crc kubenswrapper[4930]: E1012 05:41:42.434206 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-mdhw6_openshift-ovn-kubernetes(0add0fa2-092f-4dcc-8c72-82881564bf63)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.443054 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.443131 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.443159 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.443198 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.443225 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:42Z","lastTransitionTime":"2025-10-12T05:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.462045 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:42Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.485067 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:42Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.505783 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:42Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.526045 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:42Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.546869 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.546914 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.546943 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.546969 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.546988 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:42Z","lastTransitionTime":"2025-10-12T05:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.548672 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:42Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.570908 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:42Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.605446 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:42Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.630438 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:42Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.649603 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.649661 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:42 crc 
kubenswrapper[4930]: I1012 05:41:42.649676 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.649698 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.649709 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:42Z","lastTransitionTime":"2025-10-12T05:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.662662 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://264bb5418af04a677f569add84f3eb35c3760f20
db19a03595723d261202e317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6016f66ca9ca63c708c040eb6f9cdd047860fc43bc2877294360b3ea8b0a3289\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:41:39Z\\\",\\\"message\\\":\\\"ce event handler 5 for removal\\\\nI1012 05:41:39.557967 6216 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1012 05:41:39.557981 6216 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1012 05:41:39.557989 6216 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 05:41:39.558149 6216 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 05:41:39.559241 6216 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1012 05:41:39.559356 6216 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1012 05:41:39.559823 6216 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1012 05:41:39.559854 6216 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1012 05:41:39.559907 6216 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1012 05:41:39.559942 6216 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1012 05:41:39.559987 6216 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1012 05:41:39.560008 6216 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1012 05:41:39.560021 6216 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1012 05:41:39.561837 6216 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://264bb5418af04a677f569add84f3eb35c3760f20db19a03595723d261202e317\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:41:41Z\\\",\\\"message\\\":\\\"ster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1012 05:41:41.453945 6374 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:41Z is after 
2025-08-24T17:21:41Z]\\\\nI1012 05:41:41.453849 6374 services_controller.go:451] Built service openshift-dns-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\
\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:42Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.681235 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:42Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.698527 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:42Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.717544 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:42Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.720704 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8"] Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.721283 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.723581 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.728076 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.741598 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:42Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.753895 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.753951 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.753970 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.753996 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.754013 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:42Z","lastTransitionTime":"2025-10-12T05:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.763306 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:42Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.784074 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:42Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.808773 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c21ff03a-29e7-467c-93e2-45f3a6cef5af-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-fjsw8\" (UID: \"c21ff03a-29e7-467c-93e2-45f3a6cef5af\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.808680 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:42Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.808872 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk8z2\" (UniqueName: \"kubernetes.io/projected/c21ff03a-29e7-467c-93e2-45f3a6cef5af-kube-api-access-dk8z2\") pod \"ovnkube-control-plane-749d76644c-fjsw8\" (UID: \"c21ff03a-29e7-467c-93e2-45f3a6cef5af\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.809098 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c21ff03a-29e7-467c-93e2-45f3a6cef5af-env-overrides\") pod \"ovnkube-control-plane-749d76644c-fjsw8\" (UID: \"c21ff03a-29e7-467c-93e2-45f3a6cef5af\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.809205 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c21ff03a-29e7-467c-93e2-45f3a6cef5af-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-fjsw8\" (UID: \"c21ff03a-29e7-467c-93e2-45f3a6cef5af\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.829065 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:42Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.848729 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:42Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.857349 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.857414 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.857437 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.857466 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.857488 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:42Z","lastTransitionTime":"2025-10-12T05:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.867889 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:42Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.883622 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:42Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.899522 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:42Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.912272 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c21ff03a-29e7-467c-93e2-45f3a6cef5af-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-fjsw8\" (UID: \"c21ff03a-29e7-467c-93e2-45f3a6cef5af\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.912409 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk8z2\" (UniqueName: \"kubernetes.io/projected/c21ff03a-29e7-467c-93e2-45f3a6cef5af-kube-api-access-dk8z2\") pod \"ovnkube-control-plane-749d76644c-fjsw8\" (UID: \"c21ff03a-29e7-467c-93e2-45f3a6cef5af\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.912457 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c21ff03a-29e7-467c-93e2-45f3a6cef5af-env-overrides\") pod \"ovnkube-control-plane-749d76644c-fjsw8\" (UID: \"c21ff03a-29e7-467c-93e2-45f3a6cef5af\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.912549 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c21ff03a-29e7-467c-93e2-45f3a6cef5af-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-fjsw8\" (UID: \"c21ff03a-29e7-467c-93e2-45f3a6cef5af\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.919661 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c21ff03a-29e7-467c-93e2-45f3a6cef5af-env-overrides\") pod \"ovnkube-control-plane-749d76644c-fjsw8\" (UID: \"c21ff03a-29e7-467c-93e2-45f3a6cef5af\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.920166 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c21ff03a-29e7-467c-93e2-45f3a6cef5af-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-fjsw8\" (UID: \"c21ff03a-29e7-467c-93e2-45f3a6cef5af\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.925922 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c21ff03a-29e7-467c-93e2-45f3a6cef5af-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-fjsw8\" (UID: \"c21ff03a-29e7-467c-93e2-45f3a6cef5af\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.938293 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:42Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.966650 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk8z2\" (UniqueName: \"kubernetes.io/projected/c21ff03a-29e7-467c-93e2-45f3a6cef5af-kube-api-access-dk8z2\") pod \"ovnkube-control-plane-749d76644c-fjsw8\" (UID: \"c21ff03a-29e7-467c-93e2-45f3a6cef5af\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.969028 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.969066 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.969082 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.969104 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.969122 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:42Z","lastTransitionTime":"2025-10-12T05:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:42 crc kubenswrapper[4930]: I1012 05:41:42.979209 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:42Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.002270 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:42Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.016246 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.030682 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cn
i/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.041555 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.044790 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" Oct 12 05:41:43 crc kubenswrapper[4930]: W1012 05:41:43.056976 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc21ff03a_29e7_467c_93e2_45f3a6cef5af.slice/crio-c4b54fdd355cc2785aa821425273ecc02e291639ab870eeffcc17137e94aedb5 WatchSource:0}: Error finding container c4b54fdd355cc2785aa821425273ecc02e291639ab870eeffcc17137e94aedb5: Status 404 returned error can't find the container with id c4b54fdd355cc2785aa821425273ecc02e291639ab870eeffcc17137e94aedb5 Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.062584 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943
ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.080816 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.080868 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.080881 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.081113 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.081136 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:43Z","lastTransitionTime":"2025-10-12T05:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.083618 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.102919 4930 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://264bb5418af04a677f569add84f3eb35c3760f20db19a03595723d261202e317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6016f66ca9ca63c708c040eb6f9cdd047860fc43bc2877294360b3ea8b0a3289\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:41:39Z\\\",\\\"message\\\":\\\"ce event handler 5 for removal\\\\nI1012 05:41:39.557967 6216 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1012 05:41:39.557981 6216 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1012 05:41:39.557989 6216 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1012 05:41:39.558149 6216 handler.go:208] Removed *v1.Node event handler 2\\\\nI1012 05:41:39.559241 6216 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1012 05:41:39.559356 6216 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1012 05:41:39.559823 6216 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1012 05:41:39.559854 6216 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1012 05:41:39.559907 6216 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1012 05:41:39.559942 6216 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1012 05:41:39.559987 6216 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1012 05:41:39.560008 6216 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1012 05:41:39.560021 6216 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1012 05:41:39.561837 6216 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://264bb5418af04a677f569add84f3eb35c3760f20db19a03595723d261202e317\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:41:41Z\\\",\\\"message\\\":\\\"ster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1012 05:41:41.453945 6374 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:41Z is after 2025-08-24T17:21:41Z]\\\\nI1012 05:41:41.453849 6374 services_controller.go:451] Built service openshift-dns-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6
938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.116691 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21ff03a-29e7-467c-93e2-45f3a6cef5af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjsw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.184123 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:43 crc 
kubenswrapper[4930]: I1012 05:41:43.184171 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.184186 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.184205 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.184221 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:43Z","lastTransitionTime":"2025-10-12T05:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.287651 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.287809 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.287830 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.287894 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.287916 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:43Z","lastTransitionTime":"2025-10-12T05:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.391364 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.391415 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.391427 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.391446 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.391459 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:43Z","lastTransitionTime":"2025-10-12T05:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.438064 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mdhw6_0add0fa2-092f-4dcc-8c72-82881564bf63/ovnkube-controller/1.log" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.443658 4930 scope.go:117] "RemoveContainer" containerID="264bb5418af04a677f569add84f3eb35c3760f20db19a03595723d261202e317" Oct 12 05:41:43 crc kubenswrapper[4930]: E1012 05:41:43.444003 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-mdhw6_openshift-ovn-kubernetes(0add0fa2-092f-4dcc-8c72-82881564bf63)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.445218 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" event={"ID":"c21ff03a-29e7-467c-93e2-45f3a6cef5af","Type":"ContainerStarted","Data":"bae41878203d4a2974f46d284f4fc05e705fb9fc9b1a3e8159db00a4c9d70762"} Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.445292 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" event={"ID":"c21ff03a-29e7-467c-93e2-45f3a6cef5af","Type":"ContainerStarted","Data":"df55e443aa226c07478c488ee54785f35fbb1d6285775e82abcbf924aaf304fd"} Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.445317 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" event={"ID":"c21ff03a-29e7-467c-93e2-45f3a6cef5af","Type":"ContainerStarted","Data":"c4b54fdd355cc2785aa821425273ecc02e291639ab870eeffcc17137e94aedb5"} Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.473221 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943
ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.493414 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.493476 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.493493 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.493520 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.493538 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:43Z","lastTransitionTime":"2025-10-12T05:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.494686 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.530572 4930 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://264bb5418af04a677f569add84f3eb35c3760f20db19a03595723d261202e317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://264bb5418af04a677f569add84f3eb35c3760f20db19a03595723d261202e317\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:41:41Z\\\",\\\"message\\\":\\\"ster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1012 05:41:41.453945 6374 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:41Z is after 2025-08-24T17:21:41Z]\\\\nI1012 05:41:41.453849 6374 services_controller.go:451] Built service openshift-dns-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mdhw6_openshift-ovn-kubernetes(0add0fa2-092f-4dcc-8c72-82881564bf63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.545530 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21ff03a-29e7-467c-93e2-45f3a6cef5af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjsw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.569052 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.589600 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.596226 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.596292 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.596312 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.596342 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.596364 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:43Z","lastTransitionTime":"2025-10-12T05:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.623920 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.642127 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.653592 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.666519 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.685239 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.699629 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.699831 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.699920 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.699931 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.699963 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.699974 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:43Z","lastTransitionTime":"2025-10-12T05:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.721824 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.738364 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.753804 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.764702 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.779321 4930 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6
ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.792400 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.805257 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.809919 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.810125 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.810336 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.810430 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.810507 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:43Z","lastTransitionTime":"2025-10-12T05:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.822284 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.834381 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.847064 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.859622 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21ff03a-29e7-467c-93e2-45f3a6cef5af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df55e443aa226c07478c488ee54785f35fbb1d6285775e82abcbf924aaf304fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bae41878203d4a2974f46d284f4fc05e705fb9fc9b1a3e8159db00a4c9d70762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjsw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 
05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.884649 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.901539 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.912513 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.912541 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:43 crc 
kubenswrapper[4930]: I1012 05:41:43.912552 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.912567 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.912578 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:43Z","lastTransitionTime":"2025-10-12T05:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.921990 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://264bb5418af04a677f569add84f3eb35c3760f20
db19a03595723d261202e317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://264bb5418af04a677f569add84f3eb35c3760f20db19a03595723d261202e317\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:41:41Z\\\",\\\"message\\\":\\\"ster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1012 05:41:41.453945 6374 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:41Z is after 2025-08-24T17:21:41Z]\\\\nI1012 05:41:41.453849 6374 services_controller.go:451] Built service openshift-dns-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mdhw6_openshift-ovn-kubernetes(0add0fa2-092f-4dcc-8c72-82881564bf63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.922609 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:41:43 crc kubenswrapper[4930]: E1012 05:41:43.922694 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:41:59.922674405 +0000 UTC m=+52.464776180 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.922936 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.923041 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.923119 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.923196 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:43 crc kubenswrapper[4930]: E1012 05:41:43.923341 4930 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 05:41:43 crc kubenswrapper[4930]: E1012 05:41:43.923429 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 05:41:59.92342032 +0000 UTC m=+52.465522085 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 05:41:43 crc kubenswrapper[4930]: E1012 05:41:43.923751 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 05:41:43 crc kubenswrapper[4930]: E1012 05:41:43.923848 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 05:41:43 crc kubenswrapper[4930]: E1012 05:41:43.924009 4930 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:41:43 crc kubenswrapper[4930]: E1012 05:41:43.924065 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 05:41:43 crc kubenswrapper[4930]: E1012 05:41:43.924136 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 05:41:43 crc kubenswrapper[4930]: E1012 05:41:43.924170 4930 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:41:43 crc kubenswrapper[4930]: E1012 05:41:43.923928 4930 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 05:41:43 crc kubenswrapper[4930]: E1012 05:41:43.924113 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-12 05:41:59.924104383 +0000 UTC m=+52.466206138 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:41:43 crc kubenswrapper[4930]: E1012 05:41:43.924409 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-12 05:41:59.924398859 +0000 UTC m=+52.466500624 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:41:43 crc kubenswrapper[4930]: E1012 05:41:43.924484 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 05:41:59.924476471 +0000 UTC m=+52.466578236 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.934838 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.944115 4930 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-br2vl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.953151 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.964396 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.974291 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:43 crc kubenswrapper[4930]: I1012 05:41:43.984065 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:43Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.014941 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.015000 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.015019 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.015043 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.015061 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:44Z","lastTransitionTime":"2025-10-12T05:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.118506 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.118636 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.118658 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.119052 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.119075 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:44Z","lastTransitionTime":"2025-10-12T05:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.134387 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.134476 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.134783 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:41:44 crc kubenswrapper[4930]: E1012 05:41:44.135009 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:41:44 crc kubenswrapper[4930]: E1012 05:41:44.134927 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:41:44 crc kubenswrapper[4930]: E1012 05:41:44.135347 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.223305 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.223380 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.223398 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.223859 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.223902 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:44Z","lastTransitionTime":"2025-10-12T05:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.244961 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-7cjzn"] Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.246175 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:41:44 crc kubenswrapper[4930]: E1012 05:41:44.246456 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.268028 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:44Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.287666 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:44Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.307896 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:44Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.325226 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:44Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.327386 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ntq5\" (UniqueName: \"kubernetes.io/projected/dda08509-105f-4935-a8a9-ff852e73c3ce-kube-api-access-2ntq5\") pod \"network-metrics-daemon-7cjzn\" (UID: \"dda08509-105f-4935-a8a9-ff852e73c3ce\") " pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.327493 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs\") pod \"network-metrics-daemon-7cjzn\" (UID: \"dda08509-105f-4935-a8a9-ff852e73c3ce\") " pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.327534 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.327583 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.327606 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.327635 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.327656 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:44Z","lastTransitionTime":"2025-10-12T05:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.343497 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:44Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.362801 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:44Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.399919 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b6
8d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:44Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.424000 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:44Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.428250 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ntq5\" (UniqueName: \"kubernetes.io/projected/dda08509-105f-4935-a8a9-ff852e73c3ce-kube-api-access-2ntq5\") pod \"network-metrics-daemon-7cjzn\" (UID: \"dda08509-105f-4935-a8a9-ff852e73c3ce\") " 
pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.428342 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs\") pod \"network-metrics-daemon-7cjzn\" (UID: \"dda08509-105f-4935-a8a9-ff852e73c3ce\") " pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:41:44 crc kubenswrapper[4930]: E1012 05:41:44.428487 4930 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 05:41:44 crc kubenswrapper[4930]: E1012 05:41:44.428551 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs podName:dda08509-105f-4935-a8a9-ff852e73c3ce nodeName:}" failed. No retries permitted until 2025-10-12 05:41:44.928529831 +0000 UTC m=+37.470631626 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs") pod "network-metrics-daemon-7cjzn" (UID: "dda08509-105f-4935-a8a9-ff852e73c3ce") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.431300 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.431705 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.431955 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.432167 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.432363 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:44Z","lastTransitionTime":"2025-10-12T05:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.454349 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://264bb5418af04a677f569add84f3eb35c3760f20db19a03595723d261202e317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://264bb5418af04a677f569add84f3eb35c3760f20db19a03595723d261202e317\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:41:41Z\\\",\\\"message\\\":\\\"ster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1012 05:41:41.453945 6374 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:41Z is after 2025-08-24T17:21:41Z]\\\\nI1012 05:41:41.453849 6374 services_controller.go:451] Built service openshift-dns-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s 
restarting failed container=ovnkube-controller pod=ovnkube-node-mdhw6_openshift-ovn-kubernetes(0add0fa2-092f-4dcc-8c72-82881564bf63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:44Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.460327 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ntq5\" (UniqueName: \"kubernetes.io/projected/dda08509-105f-4935-a8a9-ff852e73c3ce-kube-api-access-2ntq5\") pod \"network-metrics-daemon-7cjzn\" (UID: \"dda08509-105f-4935-a8a9-ff852e73c3ce\") " pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.469339 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.469433 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.469460 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.469490 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.469511 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:44Z","lastTransitionTime":"2025-10-12T05:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.477422 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21ff03a-29e7-467c-93e2-45f3a6cef5af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df55e443aa226c07478c488ee54785f35fbb1d6285775e82abcbf924aaf304fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bae41878203d4a2974f46d284f4fc05e705fb9fc9b1a3e8159db00a4c9d70762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjsw8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:44Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:44 crc kubenswrapper[4930]: E1012 05:41:44.489513 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:44Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.494368 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cjzn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dda08509-105f-4935-a8a9-ff852e73c3ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cjzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:44Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.495162 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.495371 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.495528 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.495900 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.496090 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:44Z","lastTransitionTime":"2025-10-12T05:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.518182 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:44Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:44 crc kubenswrapper[4930]: E1012 05:41:44.522507 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:44Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.529070 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.529124 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.529138 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.529163 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.529181 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:44Z","lastTransitionTime":"2025-10-12T05:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.541440 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:44Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:44 crc kubenswrapper[4930]: E1012 05:41:44.548447 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:44Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.553363 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.553440 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.553459 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.553488 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.553507 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:44Z","lastTransitionTime":"2025-10-12T05:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.562516 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:44Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:44 crc kubenswrapper[4930]: E1012 05:41:44.569103 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:44Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.574105 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.574183 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.574223 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.574245 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.574262 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:44Z","lastTransitionTime":"2025-10-12T05:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.581217 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:44Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:44 crc kubenswrapper[4930]: E1012 05:41:44.593869 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:44Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:44 crc kubenswrapper[4930]: E1012 05:41:44.593997 4930 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.596720 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.597063 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.597361 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.597684 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.598550 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:44Z","lastTransitionTime":"2025-10-12T05:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.599476 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:44Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.613795 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:44Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.702613 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.703043 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.703273 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.703519 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.703786 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:44Z","lastTransitionTime":"2025-10-12T05:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.807040 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.807090 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.807107 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.807131 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.807149 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:44Z","lastTransitionTime":"2025-10-12T05:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.910503 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.910559 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.910576 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.910601 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.910619 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:44Z","lastTransitionTime":"2025-10-12T05:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:44 crc kubenswrapper[4930]: I1012 05:41:44.934951 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs\") pod \"network-metrics-daemon-7cjzn\" (UID: \"dda08509-105f-4935-a8a9-ff852e73c3ce\") " pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:41:44 crc kubenswrapper[4930]: E1012 05:41:44.935167 4930 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 05:41:44 crc kubenswrapper[4930]: E1012 05:41:44.935287 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs podName:dda08509-105f-4935-a8a9-ff852e73c3ce nodeName:}" failed. No retries permitted until 2025-10-12 05:41:45.935251365 +0000 UTC m=+38.477353190 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs") pod "network-metrics-daemon-7cjzn" (UID: "dda08509-105f-4935-a8a9-ff852e73c3ce") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.014633 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.014695 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.014714 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.014774 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.014794 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:45Z","lastTransitionTime":"2025-10-12T05:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.118101 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.118168 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.118196 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.118221 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.118238 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:45Z","lastTransitionTime":"2025-10-12T05:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.222262 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.222331 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.222350 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.222375 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.222394 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:45Z","lastTransitionTime":"2025-10-12T05:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.325958 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.326013 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.326031 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.326054 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.326072 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:45Z","lastTransitionTime":"2025-10-12T05:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.429274 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.429330 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.429347 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.429371 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.429389 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:45Z","lastTransitionTime":"2025-10-12T05:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.532374 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.532437 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.532458 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.532481 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.532499 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:45Z","lastTransitionTime":"2025-10-12T05:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.635842 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.636069 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.636091 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.636118 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.636141 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:45Z","lastTransitionTime":"2025-10-12T05:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.739829 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.740210 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.740355 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.740517 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.740648 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:45Z","lastTransitionTime":"2025-10-12T05:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.844007 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.844064 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.844084 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.844108 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.844127 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:45Z","lastTransitionTime":"2025-10-12T05:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.945854 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs\") pod \"network-metrics-daemon-7cjzn\" (UID: \"dda08509-105f-4935-a8a9-ff852e73c3ce\") " pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:41:45 crc kubenswrapper[4930]: E1012 05:41:45.946074 4930 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 05:41:45 crc kubenswrapper[4930]: E1012 05:41:45.946192 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs podName:dda08509-105f-4935-a8a9-ff852e73c3ce nodeName:}" failed. No retries permitted until 2025-10-12 05:41:47.94616076 +0000 UTC m=+40.488262565 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs") pod "network-metrics-daemon-7cjzn" (UID: "dda08509-105f-4935-a8a9-ff852e73c3ce") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.947006 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.947056 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.947075 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.947100 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:45 crc kubenswrapper[4930]: I1012 05:41:45.947121 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:45Z","lastTransitionTime":"2025-10-12T05:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.050042 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.050103 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.050121 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.050145 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.050162 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:46Z","lastTransitionTime":"2025-10-12T05:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.134321 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.134384 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.134418 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:41:46 crc kubenswrapper[4930]: E1012 05:41:46.134531 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:41:46 crc kubenswrapper[4930]: E1012 05:41:46.134698 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.134839 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:41:46 crc kubenswrapper[4930]: E1012 05:41:46.134845 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:41:46 crc kubenswrapper[4930]: E1012 05:41:46.135055 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.153491 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.153561 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.153585 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.153614 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.153636 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:46Z","lastTransitionTime":"2025-10-12T05:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.256442 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.256490 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.256508 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.256533 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.256550 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:46Z","lastTransitionTime":"2025-10-12T05:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.359568 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.360100 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.360247 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.360380 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.360503 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:46Z","lastTransitionTime":"2025-10-12T05:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.463699 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.463796 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.463816 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.463844 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.463861 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:46Z","lastTransitionTime":"2025-10-12T05:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.567238 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.567286 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.567302 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.567324 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.567342 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:46Z","lastTransitionTime":"2025-10-12T05:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.670391 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.670841 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.671071 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.671269 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.671448 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:46Z","lastTransitionTime":"2025-10-12T05:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.774214 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.774269 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.774287 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.774317 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.774333 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:46Z","lastTransitionTime":"2025-10-12T05:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.877853 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.877917 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.877935 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.877962 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.877987 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:46Z","lastTransitionTime":"2025-10-12T05:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.981141 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.981232 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.981281 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.981308 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:46 crc kubenswrapper[4930]: I1012 05:41:46.981326 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:46Z","lastTransitionTime":"2025-10-12T05:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.084939 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.085293 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.085479 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.085633 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.085803 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:47Z","lastTransitionTime":"2025-10-12T05:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.189286 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.189341 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.189359 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.189382 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.189400 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:47Z","lastTransitionTime":"2025-10-12T05:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.291877 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.291951 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.291970 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.292000 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.292020 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:47Z","lastTransitionTime":"2025-10-12T05:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.395326 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.395390 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.395408 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.395436 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.395456 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:47Z","lastTransitionTime":"2025-10-12T05:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.498559 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.499049 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.499214 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.499359 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.499497 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:47Z","lastTransitionTime":"2025-10-12T05:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.603144 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.603866 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.603885 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.603909 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.603923 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:47Z","lastTransitionTime":"2025-10-12T05:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.707254 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.707339 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.707363 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.707396 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.707414 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:47Z","lastTransitionTime":"2025-10-12T05:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.809668 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.809707 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.809718 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.809758 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.809771 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:47Z","lastTransitionTime":"2025-10-12T05:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.912075 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.912461 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.912601 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.912720 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.912857 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:47Z","lastTransitionTime":"2025-10-12T05:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:47 crc kubenswrapper[4930]: I1012 05:41:47.972597 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs\") pod \"network-metrics-daemon-7cjzn\" (UID: \"dda08509-105f-4935-a8a9-ff852e73c3ce\") " pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:41:47 crc kubenswrapper[4930]: E1012 05:41:47.972846 4930 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 05:41:47 crc kubenswrapper[4930]: E1012 05:41:47.972918 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs podName:dda08509-105f-4935-a8a9-ff852e73c3ce nodeName:}" failed. No retries permitted until 2025-10-12 05:41:51.972896398 +0000 UTC m=+44.514998193 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs") pod "network-metrics-daemon-7cjzn" (UID: "dda08509-105f-4935-a8a9-ff852e73c3ce") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.017017 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.017081 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.017102 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.017129 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.017148 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:48Z","lastTransitionTime":"2025-10-12T05:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.119634 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.119709 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.119782 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.119833 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.119859 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:48Z","lastTransitionTime":"2025-10-12T05:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.134493 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:41:48 crc kubenswrapper[4930]: E1012 05:41:48.134676 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.134777 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:48 crc kubenswrapper[4930]: E1012 05:41:48.134930 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.135050 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:41:48 crc kubenswrapper[4930]: E1012 05:41:48.135203 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.135240 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:41:48 crc kubenswrapper[4930]: E1012 05:41:48.135374 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.163698 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a73
87bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:48Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.199916 4930 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32
fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://264bb5418af04a677f569add84f3eb35c3760f20db19a03595723d261202e317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://264bb5418af04a677f569add84f3eb35c3760f20db19a03595723d261202e317\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:41:41Z\\\",\\\"message\\\":\\\"ster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1012 05:41:41.453945 6374 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:41Z is after 2025-08-24T17:21:41Z]\\\\nI1012 05:41:41.453849 6374 services_controller.go:451] Built service openshift-dns-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mdhw6_openshift-ovn-kubernetes(0add0fa2-092f-4dcc-8c72-82881564bf63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:48Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.221360 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21ff03a-29e7-467c-93e2-45f3a6cef5af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df55e443aa226c07478c488ee54785f35fbb1d6285775e82abcbf924aaf304fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bae41878203d4a2974f46d284f4fc05e705fb9fc9b1a3e8159db00a4c9d70762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjsw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:48Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.222565 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.222634 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.222658 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.222682 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.222700 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:48Z","lastTransitionTime":"2025-10-12T05:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.247525 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cjzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dda08509-105f-4935-a8a9-ff852e73c3ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cjzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:48Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.282238 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943
ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:48Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.302412 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:48Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.324283 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:48Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.326705 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.326797 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.326817 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.326844 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.326864 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:48Z","lastTransitionTime":"2025-10-12T05:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.345689 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:48Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.364947 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:48Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.388987 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:48Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.413629 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:48Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.431160 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.431275 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.431333 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.431367 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.431424 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:48Z","lastTransitionTime":"2025-10-12T05:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.436390 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:48Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.458555 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:48Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.479937 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:48Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.500466 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:48Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.518358 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:48Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.536204 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.536699 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.536972 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.537184 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.537323 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:48Z","lastTransitionTime":"2025-10-12T05:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.536796 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:48Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.641025 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.641073 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.641092 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.641118 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.641137 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:48Z","lastTransitionTime":"2025-10-12T05:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.744650 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.744783 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.744804 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.744857 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.744877 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:48Z","lastTransitionTime":"2025-10-12T05:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.847971 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.848050 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.848074 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.848108 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.848134 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:48Z","lastTransitionTime":"2025-10-12T05:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.951661 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.951721 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.951780 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.951808 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:48 crc kubenswrapper[4930]: I1012 05:41:48.951827 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:48Z","lastTransitionTime":"2025-10-12T05:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.055230 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.055543 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.055682 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.056070 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.056223 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:49Z","lastTransitionTime":"2025-10-12T05:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.159690 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.159783 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.159804 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.159828 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.159847 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:49Z","lastTransitionTime":"2025-10-12T05:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.262962 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.263016 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.263035 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.263057 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.263074 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:49Z","lastTransitionTime":"2025-10-12T05:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.366567 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.367236 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.367263 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.367294 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.367313 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:49Z","lastTransitionTime":"2025-10-12T05:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.470583 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.470637 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.470655 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.470679 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.470702 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:49Z","lastTransitionTime":"2025-10-12T05:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.574164 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.574224 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.574242 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.574267 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.574285 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:49Z","lastTransitionTime":"2025-10-12T05:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.678432 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.678488 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.678503 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.678527 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.678549 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:49Z","lastTransitionTime":"2025-10-12T05:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.782375 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.782479 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.782501 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.782534 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.782558 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:49Z","lastTransitionTime":"2025-10-12T05:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.886061 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.886143 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.886162 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.886193 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.886215 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:49Z","lastTransitionTime":"2025-10-12T05:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.989644 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.989721 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.989781 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.989815 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:49 crc kubenswrapper[4930]: I1012 05:41:49.989839 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:49Z","lastTransitionTime":"2025-10-12T05:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.092791 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.092868 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.092896 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.092914 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.092929 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:50Z","lastTransitionTime":"2025-10-12T05:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.134996 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.135058 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:50 crc kubenswrapper[4930]: E1012 05:41:50.135160 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:41:50 crc kubenswrapper[4930]: E1012 05:41:50.135299 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.135428 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:41:50 crc kubenswrapper[4930]: E1012 05:41:50.135585 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.135837 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:41:50 crc kubenswrapper[4930]: E1012 05:41:50.135975 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.196851 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.196905 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.196926 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.196954 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.196976 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:50Z","lastTransitionTime":"2025-10-12T05:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.299899 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.299962 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.299980 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.300004 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.300023 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:50Z","lastTransitionTime":"2025-10-12T05:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.402876 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.402940 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.402960 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.402986 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.403008 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:50Z","lastTransitionTime":"2025-10-12T05:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.506230 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.506583 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.506775 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.506937 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.507079 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:50Z","lastTransitionTime":"2025-10-12T05:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.610045 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.610124 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.610141 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.610161 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.610176 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:50Z","lastTransitionTime":"2025-10-12T05:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.713109 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.713176 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.713202 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.713230 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.713252 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:50Z","lastTransitionTime":"2025-10-12T05:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.815725 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.816080 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.816298 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.816525 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.816792 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:50Z","lastTransitionTime":"2025-10-12T05:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.920540 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.920598 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.920616 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.920643 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:50 crc kubenswrapper[4930]: I1012 05:41:50.920662 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:50Z","lastTransitionTime":"2025-10-12T05:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.024731 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.024843 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.024863 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.024888 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.024904 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:51Z","lastTransitionTime":"2025-10-12T05:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.127902 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.127980 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.128016 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.128037 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.128055 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:51Z","lastTransitionTime":"2025-10-12T05:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.230935 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.230995 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.231013 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.231038 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.231055 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:51Z","lastTransitionTime":"2025-10-12T05:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.334177 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.334239 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.334261 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.334290 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.334311 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:51Z","lastTransitionTime":"2025-10-12T05:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.438393 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.438460 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.438478 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.438504 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.438524 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:51Z","lastTransitionTime":"2025-10-12T05:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.541875 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.542022 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.542106 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.542140 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.542161 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:51Z","lastTransitionTime":"2025-10-12T05:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.646216 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.646290 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.646309 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.646336 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.646357 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:51Z","lastTransitionTime":"2025-10-12T05:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.749776 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.749839 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.749859 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.749886 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.749907 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:51Z","lastTransitionTime":"2025-10-12T05:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.853924 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.854143 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.854241 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.854365 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.854483 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:51Z","lastTransitionTime":"2025-10-12T05:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.957422 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.957480 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.957498 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.957521 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:51 crc kubenswrapper[4930]: I1012 05:41:51.957537 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:51Z","lastTransitionTime":"2025-10-12T05:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.016171 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs\") pod \"network-metrics-daemon-7cjzn\" (UID: \"dda08509-105f-4935-a8a9-ff852e73c3ce\") " pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:41:52 crc kubenswrapper[4930]: E1012 05:41:52.016352 4930 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 05:41:52 crc kubenswrapper[4930]: E1012 05:41:52.016464 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs podName:dda08509-105f-4935-a8a9-ff852e73c3ce nodeName:}" failed. No retries permitted until 2025-10-12 05:42:00.016431862 +0000 UTC m=+52.558533657 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs") pod "network-metrics-daemon-7cjzn" (UID: "dda08509-105f-4935-a8a9-ff852e73c3ce") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.060462 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.060516 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.060535 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.060559 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.060578 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:52Z","lastTransitionTime":"2025-10-12T05:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.134420 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:41:52 crc kubenswrapper[4930]: E1012 05:41:52.134797 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.135050 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.135063 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:41:52 crc kubenswrapper[4930]: E1012 05:41:52.135201 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.135212 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:41:52 crc kubenswrapper[4930]: E1012 05:41:52.135369 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:41:52 crc kubenswrapper[4930]: E1012 05:41:52.135493 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.162993 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.163047 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.163066 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.163092 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.163111 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:52Z","lastTransitionTime":"2025-10-12T05:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.265537 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.265601 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.265619 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.265647 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.265667 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:52Z","lastTransitionTime":"2025-10-12T05:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.368336 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.368390 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.368406 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.368428 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.368446 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:52Z","lastTransitionTime":"2025-10-12T05:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.472108 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.472159 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.472174 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.472197 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.472213 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:52Z","lastTransitionTime":"2025-10-12T05:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.574804 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.574886 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.574905 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.574931 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.574949 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:52Z","lastTransitionTime":"2025-10-12T05:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.678332 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.678397 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.678409 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.678432 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.678447 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:52Z","lastTransitionTime":"2025-10-12T05:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.781848 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.781923 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.781940 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.781968 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.781989 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:52Z","lastTransitionTime":"2025-10-12T05:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.885300 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.885379 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.885403 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.885437 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.885463 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:52Z","lastTransitionTime":"2025-10-12T05:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.989117 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.989177 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.989195 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.989220 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:52 crc kubenswrapper[4930]: I1012 05:41:52.989241 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:52Z","lastTransitionTime":"2025-10-12T05:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.092424 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.092486 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.092503 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.092529 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.092547 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:53Z","lastTransitionTime":"2025-10-12T05:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.196150 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.196217 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.196236 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.196261 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.196279 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:53Z","lastTransitionTime":"2025-10-12T05:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.299630 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.299715 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.299745 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.299829 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.299852 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:53Z","lastTransitionTime":"2025-10-12T05:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.403075 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.403165 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.403184 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.403243 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.403273 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:53Z","lastTransitionTime":"2025-10-12T05:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.507065 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.507149 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.507175 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.507210 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.507232 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:53Z","lastTransitionTime":"2025-10-12T05:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.611335 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.611456 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.611475 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.611505 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.611527 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:53Z","lastTransitionTime":"2025-10-12T05:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.715046 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.715115 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.715134 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.715157 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.715802 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:53Z","lastTransitionTime":"2025-10-12T05:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.818547 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.818620 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.818645 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.818677 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.818814 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:53Z","lastTransitionTime":"2025-10-12T05:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.922037 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.922102 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.922127 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.922155 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:53 crc kubenswrapper[4930]: I1012 05:41:53.922176 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:53Z","lastTransitionTime":"2025-10-12T05:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.025660 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.025721 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.025776 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.025811 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.025835 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:54Z","lastTransitionTime":"2025-10-12T05:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.129696 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.129790 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.129809 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.129834 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.129866 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:54Z","lastTransitionTime":"2025-10-12T05:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.134999 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.135069 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:41:54 crc kubenswrapper[4930]: E1012 05:41:54.135153 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.135188 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:41:54 crc kubenswrapper[4930]: E1012 05:41:54.135272 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.135382 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:54 crc kubenswrapper[4930]: E1012 05:41:54.135558 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:41:54 crc kubenswrapper[4930]: E1012 05:41:54.135819 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.232165 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.232210 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.232226 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.232249 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.232266 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:54Z","lastTransitionTime":"2025-10-12T05:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.335409 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.335474 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.335497 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.335534 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.335555 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:54Z","lastTransitionTime":"2025-10-12T05:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.438081 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.438143 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.438164 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.438191 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.438211 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:54Z","lastTransitionTime":"2025-10-12T05:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.542510 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.542577 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.542599 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.542631 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.542654 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:54Z","lastTransitionTime":"2025-10-12T05:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.646857 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.646920 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.646932 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.646963 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.646978 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:54Z","lastTransitionTime":"2025-10-12T05:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.749558 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.749659 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.749682 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.750281 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.750610 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:54Z","lastTransitionTime":"2025-10-12T05:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.854652 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.854716 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.854730 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.854787 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.854803 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:54Z","lastTransitionTime":"2025-10-12T05:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.932025 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.932117 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.932137 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.932169 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.932189 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:54Z","lastTransitionTime":"2025-10-12T05:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:54 crc kubenswrapper[4930]: E1012 05:41:54.954782 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:54Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.960391 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.960448 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.960467 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.960494 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.960513 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:54Z","lastTransitionTime":"2025-10-12T05:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:54 crc kubenswrapper[4930]: E1012 05:41:54.981277 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:54Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.988311 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.988371 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.988392 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.988415 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:54 crc kubenswrapper[4930]: I1012 05:41:54.988430 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:54Z","lastTransitionTime":"2025-10-12T05:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:55 crc kubenswrapper[4930]: E1012 05:41:55.016395 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:55Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.023370 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.023443 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.023461 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.023490 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.023507 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:55Z","lastTransitionTime":"2025-10-12T05:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:55 crc kubenswrapper[4930]: E1012 05:41:55.038967 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[...],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:55Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.044288 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.044351 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.044365 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.044388 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.044403 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:55Z","lastTransitionTime":"2025-10-12T05:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:55 crc kubenswrapper[4930]: E1012 05:41:55.057358 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:41:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[...],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:55Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:55 crc kubenswrapper[4930]: E1012 05:41:55.057598 4930 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.059885 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasSufficientMemory" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.059938 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.059950 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.059978 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.059993 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:55Z","lastTransitionTime":"2025-10-12T05:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.163507 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.163577 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.163591 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.163615 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.163631 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:55Z","lastTransitionTime":"2025-10-12T05:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.267215 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.267268 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.267282 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.267302 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.267315 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:55Z","lastTransitionTime":"2025-10-12T05:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.371503 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.371571 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.371590 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.371620 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.371638 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:55Z","lastTransitionTime":"2025-10-12T05:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.475378 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.475438 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.475456 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.475484 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.475503 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:55Z","lastTransitionTime":"2025-10-12T05:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.578500 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.578572 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.578582 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.578603 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.578617 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:55Z","lastTransitionTime":"2025-10-12T05:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.681059 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.681128 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.681146 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.681176 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.681196 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:55Z","lastTransitionTime":"2025-10-12T05:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.784419 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.784508 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.784536 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.784572 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.784596 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:55Z","lastTransitionTime":"2025-10-12T05:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.888291 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.888402 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.888431 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.888479 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.888504 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:55Z","lastTransitionTime":"2025-10-12T05:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.991930 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.991983 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.991993 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.992013 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:55 crc kubenswrapper[4930]: I1012 05:41:55.992027 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:55Z","lastTransitionTime":"2025-10-12T05:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.095883 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.095961 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.095981 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.096015 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.096038 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:56Z","lastTransitionTime":"2025-10-12T05:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.134639 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.134693 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.134667 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.134839 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:56 crc kubenswrapper[4930]: E1012 05:41:56.134986 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:41:56 crc kubenswrapper[4930]: E1012 05:41:56.135176 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:41:56 crc kubenswrapper[4930]: E1012 05:41:56.135439 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:41:56 crc kubenswrapper[4930]: E1012 05:41:56.135541 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.198453 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.198530 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.198557 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.198587 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.198613 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:56Z","lastTransitionTime":"2025-10-12T05:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.301570 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.301640 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.301661 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.301689 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.301709 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:56Z","lastTransitionTime":"2025-10-12T05:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.405418 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.405508 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.405537 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.405572 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.405596 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:56Z","lastTransitionTime":"2025-10-12T05:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.508567 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.508657 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.508684 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.508720 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.508779 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:56Z","lastTransitionTime":"2025-10-12T05:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.612167 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.612243 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.612270 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.612301 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.612319 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:56Z","lastTransitionTime":"2025-10-12T05:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.715542 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.715594 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.715612 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.715803 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.715824 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:56Z","lastTransitionTime":"2025-10-12T05:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.819096 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.819154 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.819170 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.819194 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.819210 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:56Z","lastTransitionTime":"2025-10-12T05:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.922504 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.922609 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.922629 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.922654 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:56 crc kubenswrapper[4930]: I1012 05:41:56.922674 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:56Z","lastTransitionTime":"2025-10-12T05:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.026295 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.026357 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.026368 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.026391 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.026817 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:57Z","lastTransitionTime":"2025-10-12T05:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.130027 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.130070 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.130080 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.130096 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.130108 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:57Z","lastTransitionTime":"2025-10-12T05:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.136397 4930 scope.go:117] "RemoveContainer" containerID="264bb5418af04a677f569add84f3eb35c3760f20db19a03595723d261202e317" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.232720 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.232836 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.232857 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.232884 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.232906 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:57Z","lastTransitionTime":"2025-10-12T05:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.335611 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.335688 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.335702 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.335723 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.335754 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:57Z","lastTransitionTime":"2025-10-12T05:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.443101 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.443169 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.443182 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.443203 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.443214 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:57Z","lastTransitionTime":"2025-10-12T05:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.547393 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.547450 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.547460 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.547480 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.547492 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:57Z","lastTransitionTime":"2025-10-12T05:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.553691 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mdhw6_0add0fa2-092f-4dcc-8c72-82881564bf63/ovnkube-controller/1.log" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.557843 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerStarted","Data":"3921189ea146b5643c2898b528d1e76b32effa405e5e61dc58db80200fb1223c"} Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.559273 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.579206 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1
156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:57Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.600559 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:57Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.622726 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:57Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.648126 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:57Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.650868 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.650923 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.650935 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.650952 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.650965 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:57Z","lastTransitionTime":"2025-10-12T05:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.664792 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:57Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:57 crc 
kubenswrapper[4930]: I1012 05:41:57.678717 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:57Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.708104 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:57Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.733636 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:57Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.749712 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:57Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.753997 4930 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.754057 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.754071 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.754089 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.754103 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:57Z","lastTransitionTime":"2025-10-12T05:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.765846 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:57Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.779547 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:57Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.789525 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:57Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.806576 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943
ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:57Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.820799 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"
reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473
a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:57Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.840496 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3921189ea146b5643c2898b528d1e76b32effa40
5e5e61dc58db80200fb1223c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://264bb5418af04a677f569add84f3eb35c3760f20db19a03595723d261202e317\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:41:41Z\\\",\\\"message\\\":\\\"ster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1012 05:41:41.453945 6374 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:41Z is after 2025-08-24T17:21:41Z]\\\\nI1012 05:41:41.453849 6374 services_controller.go:451] Built service openshift-dns-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", 
UUID:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"
containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:57Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.852844 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21ff03a-29e7-467c-93e2-45f3a6cef5af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df55e443aa226c07478c488ee54785f35fbb1d6285775e82abcbf924aaf304fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bae41878203d4a2974f46d284f4fc05e705fb9fc9b1a3e8159db00a4c9d70762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjsw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:57Z is after 2025-08-24T17:21:41Z" Oct 12 
05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.857027 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.857066 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.857083 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.857103 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.857116 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:57Z","lastTransitionTime":"2025-10-12T05:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.866608 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cjzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dda08509-105f-4935-a8a9-ff852e73c3ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cjzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:57Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.960691 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.960799 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.960818 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.960845 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:57 crc kubenswrapper[4930]: I1012 05:41:57.960863 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:57Z","lastTransitionTime":"2025-10-12T05:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.064265 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.064340 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.064358 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.064387 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.064405 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:58Z","lastTransitionTime":"2025-10-12T05:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.135349 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.135432 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.135470 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:41:58 crc kubenswrapper[4930]: E1012 05:41:58.135543 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.135633 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:41:58 crc kubenswrapper[4930]: E1012 05:41:58.135883 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:41:58 crc kubenswrapper[4930]: E1012 05:41:58.136016 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:41:58 crc kubenswrapper[4930]: E1012 05:41:58.136256 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.169454 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.169812 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.169974 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.170118 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.170275 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:58Z","lastTransitionTime":"2025-10-12T05:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.175606 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.203457 4930 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.221097 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.244633 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.273013 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.273317 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.273685 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.274038 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.274184 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:58Z","lastTransitionTime":"2025-10-12T05:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.279454 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.305212 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.336009 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3921189ea146b5643c2898b528d1e76b32effa405e5e61dc58db80200fb1223c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://264bb5418af04a677f569add84f3eb35c3760f20db19a03595723d261202e317\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:41:41Z\\\",\\\"message\\\":\\\"ster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1012 05:41:41.453945 6374 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:41Z is after 2025-08-24T17:21:41Z]\\\\nI1012 05:41:41.453849 6374 services_controller.go:451] Built service openshift-dns-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", 
UUID:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"
containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.353711 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21ff03a-29e7-467c-93e2-45f3a6cef5af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df55e443aa226c07478c488ee54785f35fbb1d6285775e82abcbf924aaf304fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bae41878203d4a2974f46d284f4fc05e705fb9fc9b1a3e8159db00a4c9d70762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjsw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 
05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.368177 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cjzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dda08509-105f-4935-a8a9-ff852e73c3ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cjzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.376899 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.376959 4930 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.376980 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.377007 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.377028 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:58Z","lastTransitionTime":"2025-10-12T05:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.387384 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.408918 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\
\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.426999 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.449776 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.469246 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.481453 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.481513 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.481531 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.481558 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.481580 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:58Z","lastTransitionTime":"2025-10-12T05:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.487382 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc 
kubenswrapper[4930]: I1012 05:41:58.510189 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.529515 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.564294 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mdhw6_0add0fa2-092f-4dcc-8c72-82881564bf63/ovnkube-controller/2.log" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.565531 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mdhw6_0add0fa2-092f-4dcc-8c72-82881564bf63/ovnkube-controller/1.log" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.569049 4930 generic.go:334] "Generic (PLEG): container finished" podID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerID="3921189ea146b5643c2898b528d1e76b32effa405e5e61dc58db80200fb1223c" exitCode=1 Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.569104 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerDied","Data":"3921189ea146b5643c2898b528d1e76b32effa405e5e61dc58db80200fb1223c"} Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.569157 4930 scope.go:117] "RemoveContainer" containerID="264bb5418af04a677f569add84f3eb35c3760f20db19a03595723d261202e317" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.570434 4930 scope.go:117] "RemoveContainer" containerID="3921189ea146b5643c2898b528d1e76b32effa405e5e61dc58db80200fb1223c" Oct 12 05:41:58 crc kubenswrapper[4930]: E1012 05:41:58.570802 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-mdhw6_openshift-ovn-kubernetes(0add0fa2-092f-4dcc-8c72-82881564bf63)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.584003 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.584052 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.584070 
4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.584094 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.584113 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:58Z","lastTransitionTime":"2025-10-12T05:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.594804 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode
\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.618962 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3921189ea146b5643c2898b528d1e76b32effa40
5e5e61dc58db80200fb1223c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://264bb5418af04a677f569add84f3eb35c3760f20db19a03595723d261202e317\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:41:41Z\\\",\\\"message\\\":\\\"ster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1012 05:41:41.453945 6374 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:41Z is after 2025-08-24T17:21:41Z]\\\\nI1012 05:41:41.453849 6374 services_controller.go:451] Built service openshift-dns-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3921189ea146b5643c2898b528d1e76b32effa405e5e61dc58db80200fb1223c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:41:58Z\\\",\\\"message\\\":\\\"controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1012 05:41:58.201492 6569 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nI1012 05:41:58.201552 6569 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 16.497974ms\\\\nI1012 05:41:58.201573 6569 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1012 05:41:58.201644 6569 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1012 05:41:58.201741 6569 factory.go:1336] Added *v1.Node event handler 7\\\\nI1012 05:41:58.201827 6569 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1012 05:41:58.202337 6569 factory.go:1336] Added *v1.EgressFirewall event handler 
9\\\\nI1012 05:41:58.202433 6569 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1012 05:41:58.202487 6569 ovnkube.go:599] Stopped ovnkube\\\\nI1012 05:41:58.202539 6569 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1012 05:41:58.202635 6569 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.635536 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21ff03a-29e7-467c-93e2-45f3a6cef5af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df55e443aa226c07478c488ee54785f35fbb1d6285775e82abcbf924aaf304fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bae41878203d4a2974f46d284f4fc05e705fb9fc9b1a3e8159db00a4c9d70762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjsw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 
05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.649284 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cjzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dda08509-105f-4935-a8a9-ff852e73c3ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cjzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.678769 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943
ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.687494 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.687686 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.687697 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.687753 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.687765 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:58Z","lastTransitionTime":"2025-10-12T05:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.693562 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.708540 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.725271 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.744233 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.758656 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.778060 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.791103 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.791163 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.791187 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.791218 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.791240 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:58Z","lastTransitionTime":"2025-10-12T05:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.794429 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.813054 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.833288 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.855512 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.873641 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.890568 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:58Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.894736 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.894803 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.894814 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.894832 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.894845 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:58Z","lastTransitionTime":"2025-10-12T05:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.997944 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.998039 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.998069 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.998107 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:58 crc kubenswrapper[4930]: I1012 05:41:58.998133 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:58Z","lastTransitionTime":"2025-10-12T05:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.101455 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.101533 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.101556 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.101589 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.101611 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:59Z","lastTransitionTime":"2025-10-12T05:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.204651 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.204712 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.204730 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.204791 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.204842 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:59Z","lastTransitionTime":"2025-10-12T05:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.308504 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.308572 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.308591 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.308616 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.308634 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:59Z","lastTransitionTime":"2025-10-12T05:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.412120 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.412184 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.412207 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.412234 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.412255 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:59Z","lastTransitionTime":"2025-10-12T05:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.516058 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.516126 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.516146 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.516173 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.516198 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:59Z","lastTransitionTime":"2025-10-12T05:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.575879 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mdhw6_0add0fa2-092f-4dcc-8c72-82881564bf63/ovnkube-controller/2.log" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.582249 4930 scope.go:117] "RemoveContainer" containerID="3921189ea146b5643c2898b528d1e76b32effa405e5e61dc58db80200fb1223c" Oct 12 05:41:59 crc kubenswrapper[4930]: E1012 05:41:59.582510 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-mdhw6_openshift-ovn-kubernetes(0add0fa2-092f-4dcc-8c72-82881564bf63)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.601421 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21ff03a-29e7-467c-93e2-45f3a6cef5af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df55e443aa226c07478c488ee54785f35fbb1d6285775e82abcbf924aaf304fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bae41878203d4a2974f46d284f4fc05e705fb9fc9b1a3e8159db00a4c9d70762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjsw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:59Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.618977 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cjzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dda08509-105f-4935-a8a9-ff852e73c3ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cjzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:59Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.619436 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.619509 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.619534 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.619559 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.619578 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:59Z","lastTransitionTime":"2025-10-12T05:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.655529 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:59Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.679690 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:59Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.713218 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3921189ea146b5643c2898b528d1e76b32effa405e5e61dc58db80200fb1223c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3921189ea146b5643c2898b528d1e76b32effa405e5e61dc58db80200fb1223c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:41:58Z\\\",\\\"message\\\":\\\"controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1012 05:41:58.201492 6569 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nI1012 05:41:58.201552 6569 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 16.497974ms\\\\nI1012 05:41:58.201573 6569 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1012 05:41:58.201644 6569 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1012 05:41:58.201741 6569 factory.go:1336] Added *v1.Node event handler 7\\\\nI1012 05:41:58.201827 6569 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1012 05:41:58.202337 6569 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1012 05:41:58.202433 6569 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1012 05:41:58.202487 6569 ovnkube.go:599] Stopped ovnkube\\\\nI1012 05:41:58.202539 6569 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1012 05:41:58.202635 6569 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mdhw6_openshift-ovn-kubernetes(0add0fa2-092f-4dcc-8c72-82881564bf63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:59Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.722995 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.723084 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.723107 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.723141 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.723165 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:59Z","lastTransitionTime":"2025-10-12T05:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.730598 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:59Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.746607 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:59Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.761423 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:59Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.782971 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:59Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.802281 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:59Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.820198 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:59Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.827240 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.827288 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.827307 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.827330 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.827350 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:59Z","lastTransitionTime":"2025-10-12T05:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.841119 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:59Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.859804 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:59Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.878244 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:59Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.898562 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:59Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.916244 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:59Z is after 2025-08-24T17:21:41Z" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.930368 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.930563 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.930677 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.930704 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.930861 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:41:59Z","lastTransitionTime":"2025-10-12T05:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:41:59 crc kubenswrapper[4930]: I1012 05:41:59.940309 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:41:59Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.017176 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.017355 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:42:00 crc kubenswrapper[4930]: E1012 05:42:00.017390 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:42:32.017357188 +0000 UTC m=+84.559458993 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.017471 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.017531 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.017571 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:00 crc kubenswrapper[4930]: E1012 05:42:00.017580 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.017608 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs\") pod \"network-metrics-daemon-7cjzn\" (UID: \"dda08509-105f-4935-a8a9-ff852e73c3ce\") " pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:00 crc kubenswrapper[4930]: E1012 05:42:00.017629 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 05:42:00 crc kubenswrapper[4930]: E1012 05:42:00.017651 4930 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:42:00 crc kubenswrapper[4930]: E1012 05:42:00.017661 4930 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 05:42:00 crc kubenswrapper[4930]: E1012 05:42:00.017718 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 12 05:42:00 crc kubenswrapper[4930]: E1012 05:42:00.017730 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-12 05:42:32.017703307 +0000 UTC m=+84.559805102 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:42:00 crc kubenswrapper[4930]: E1012 05:42:00.017779 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 12 05:42:00 crc kubenswrapper[4930]: E1012 05:42:00.017801 4930 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:42:00 crc kubenswrapper[4930]: E1012 05:42:00.017837 4930 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 05:42:00 crc kubenswrapper[4930]: E1012 05:42:00.017802 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-12 05:42:32.017788229 +0000 UTC m=+84.559890024 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 12 05:42:00 crc kubenswrapper[4930]: E1012 05:42:00.017847 4930 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 05:42:00 crc kubenswrapper[4930]: E1012 05:42:00.017885 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-12 05:42:32.017863181 +0000 UTC m=+84.559964986 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 12 05:42:00 crc kubenswrapper[4930]: E1012 05:42:00.017913 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs podName:dda08509-105f-4935-a8a9-ff852e73c3ce nodeName:}" failed. No retries permitted until 2025-10-12 05:42:16.017896901 +0000 UTC m=+68.559998706 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs") pod "network-metrics-daemon-7cjzn" (UID: "dda08509-105f-4935-a8a9-ff852e73c3ce") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 05:42:00 crc kubenswrapper[4930]: E1012 05:42:00.017933 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 05:42:32.017923342 +0000 UTC m=+84.560025147 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.034371 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.034454 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.034479 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.034510 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.034534 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:00Z","lastTransitionTime":"2025-10-12T05:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.135108 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.135236 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:00 crc kubenswrapper[4930]: E1012 05:42:00.135434 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.135497 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.135527 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:00 crc kubenswrapper[4930]: E1012 05:42:00.135657 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:42:00 crc kubenswrapper[4930]: E1012 05:42:00.136632 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:42:00 crc kubenswrapper[4930]: E1012 05:42:00.136930 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.138313 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.138371 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.138398 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.138425 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.138451 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:00Z","lastTransitionTime":"2025-10-12T05:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.241797 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.241873 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.241898 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.241929 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.242028 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:00Z","lastTransitionTime":"2025-10-12T05:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.345977 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.346035 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.346056 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.346085 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.346107 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:00Z","lastTransitionTime":"2025-10-12T05:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.448995 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.449053 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.449073 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.449098 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.449115 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:00Z","lastTransitionTime":"2025-10-12T05:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.552135 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.552203 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.552224 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.552251 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.552273 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:00Z","lastTransitionTime":"2025-10-12T05:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.655706 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.655804 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.655829 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.655860 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.655882 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:00Z","lastTransitionTime":"2025-10-12T05:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.758882 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.758941 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.758958 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.758980 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.758997 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:00Z","lastTransitionTime":"2025-10-12T05:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.861695 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.862051 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.862072 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.862097 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.862117 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:00Z","lastTransitionTime":"2025-10-12T05:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.965163 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.965255 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.965282 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.965316 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:00 crc kubenswrapper[4930]: I1012 05:42:00.965340 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:00Z","lastTransitionTime":"2025-10-12T05:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.069095 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.069206 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.069225 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.069252 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.069272 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:01Z","lastTransitionTime":"2025-10-12T05:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.172693 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.172799 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.172822 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.172849 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.172868 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:01Z","lastTransitionTime":"2025-10-12T05:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.276370 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.276426 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.276445 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.276471 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.276489 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:01Z","lastTransitionTime":"2025-10-12T05:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.380087 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.380151 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.380170 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.380195 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.380214 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:01Z","lastTransitionTime":"2025-10-12T05:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.482960 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.483037 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.483057 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.483082 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.483100 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:01Z","lastTransitionTime":"2025-10-12T05:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.586087 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.586154 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.586173 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.586198 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.586216 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:01Z","lastTransitionTime":"2025-10-12T05:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.689724 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.689878 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.689905 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.689936 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.689958 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:01Z","lastTransitionTime":"2025-10-12T05:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.792584 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.792704 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.792728 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.792800 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.792817 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:01Z","lastTransitionTime":"2025-10-12T05:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.895635 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.895711 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.895765 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.895805 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.895828 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:01Z","lastTransitionTime":"2025-10-12T05:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.998649 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.998701 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.998718 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.998769 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:01 crc kubenswrapper[4930]: I1012 05:42:01.998792 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:01Z","lastTransitionTime":"2025-10-12T05:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.102029 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.102071 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.102085 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.102105 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.102122 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:02Z","lastTransitionTime":"2025-10-12T05:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.134801 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.134842 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:02 crc kubenswrapper[4930]: E1012 05:42:02.134986 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.135066 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.135104 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:02 crc kubenswrapper[4930]: E1012 05:42:02.135236 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:42:02 crc kubenswrapper[4930]: E1012 05:42:02.135479 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:42:02 crc kubenswrapper[4930]: E1012 05:42:02.135661 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.205146 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.205226 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.205248 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.205279 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.205301 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:02Z","lastTransitionTime":"2025-10-12T05:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.308031 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.308091 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.308107 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.308132 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.308149 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:02Z","lastTransitionTime":"2025-10-12T05:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.415429 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.415494 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.415512 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.415539 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.415559 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:02Z","lastTransitionTime":"2025-10-12T05:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.520538 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.520601 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.520616 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.520637 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.520652 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:02Z","lastTransitionTime":"2025-10-12T05:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.623248 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.623299 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.623311 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.623330 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.623343 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:02Z","lastTransitionTime":"2025-10-12T05:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.726334 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.726386 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.726397 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.726419 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.726432 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:02Z","lastTransitionTime":"2025-10-12T05:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.829362 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.829414 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.829425 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.829444 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.829458 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:02Z","lastTransitionTime":"2025-10-12T05:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.838578 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.852727 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.858794 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:02Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.874394 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:02Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.886991 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:02Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.902876 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:02Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.917637 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21ff03a-29e7-467c-93e2-45f3a6cef5af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df55e443aa226c07478c488ee54785f35fbb1d6285775e82abcbf924aaf304fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bae41878203d4a2974f46d284f4fc05e705fb9fc9b1a3e8159db00a4c9d70762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjsw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:02Z is after 2025-08-24T17:21:41Z" Oct 12 
05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.928775 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cjzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dda08509-105f-4935-a8a9-ff852e73c3ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cjzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:02Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.934086 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.934310 4930 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.934490 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.934648 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.934831 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:02Z","lastTransitionTime":"2025-10-12T05:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.950893 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:02Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.966649 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:02Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:02 crc kubenswrapper[4930]: I1012 05:42:02.994983 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3921189ea146b5643c2898b528d1e76b32effa405e5e61dc58db80200fb1223c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3921189ea146b5643c2898b528d1e76b32effa405e5e61dc58db80200fb1223c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:41:58Z\\\",\\\"message\\\":\\\"controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1012 05:41:58.201492 6569 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nI1012 05:41:58.201552 6569 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 16.497974ms\\\\nI1012 05:41:58.201573 6569 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1012 05:41:58.201644 6569 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1012 05:41:58.201741 6569 factory.go:1336] Added *v1.Node event handler 7\\\\nI1012 05:41:58.201827 6569 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1012 05:41:58.202337 6569 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1012 05:41:58.202433 6569 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1012 05:41:58.202487 6569 ovnkube.go:599] Stopped ovnkube\\\\nI1012 05:41:58.202539 6569 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1012 05:41:58.202635 6569 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mdhw6_openshift-ovn-kubernetes(0add0fa2-092f-4dcc-8c72-82881564bf63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:02Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.012842 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:03Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.028962 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:03Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.038307 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.038388 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.038412 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.038437 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.038457 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:03Z","lastTransitionTime":"2025-10-12T05:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.044151 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:03Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.061600 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:03Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.079654 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:03Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.093632 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:03Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.111502 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:03Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.128184 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:03Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.144064 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.144158 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.144178 4930 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.144209 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.144232 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:03Z","lastTransitionTime":"2025-10-12T05:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.246979 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.247052 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.247069 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.247093 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.247111 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:03Z","lastTransitionTime":"2025-10-12T05:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.350674 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.350787 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.350814 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.350848 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.350870 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:03Z","lastTransitionTime":"2025-10-12T05:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.453668 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.454126 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.454285 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.454438 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.454583 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:03Z","lastTransitionTime":"2025-10-12T05:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.557814 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.557890 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.557914 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.557976 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.557995 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:03Z","lastTransitionTime":"2025-10-12T05:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.660148 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.660401 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.660517 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.660612 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.660703 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:03Z","lastTransitionTime":"2025-10-12T05:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.764561 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.764594 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.764603 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.764615 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.764623 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:03Z","lastTransitionTime":"2025-10-12T05:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.868477 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.868546 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.868567 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.868592 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.868609 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:03Z","lastTransitionTime":"2025-10-12T05:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.971302 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.971357 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.971369 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.971393 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:03 crc kubenswrapper[4930]: I1012 05:42:03.971407 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:03Z","lastTransitionTime":"2025-10-12T05:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:04 crc kubenswrapper[4930]: I1012 05:42:04.074810 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:04 crc kubenswrapper[4930]: I1012 05:42:04.074862 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:04 crc kubenswrapper[4930]: I1012 05:42:04.074871 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:04 crc kubenswrapper[4930]: I1012 05:42:04.074888 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:04 crc kubenswrapper[4930]: I1012 05:42:04.074901 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:04Z","lastTransitionTime":"2025-10-12T05:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:04 crc kubenswrapper[4930]: I1012 05:42:04.135201 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:04 crc kubenswrapper[4930]: I1012 05:42:04.135367 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:04 crc kubenswrapper[4930]: E1012 05:42:04.135404 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:42:04 crc kubenswrapper[4930]: I1012 05:42:04.135224 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:42:04 crc kubenswrapper[4930]: I1012 05:42:04.135444 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:04 crc kubenswrapper[4930]: E1012 05:42:04.135606 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:42:04 crc kubenswrapper[4930]: E1012 05:42:04.135751 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:42:04 crc kubenswrapper[4930]: E1012 05:42:04.135881 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:42:04 crc kubenswrapper[4930]: I1012 05:42:04.177786 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:04 crc kubenswrapper[4930]: I1012 05:42:04.177855 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:04 crc kubenswrapper[4930]: I1012 05:42:04.177873 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:04 crc kubenswrapper[4930]: I1012 05:42:04.177900 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:04 crc kubenswrapper[4930]: I1012 05:42:04.177919 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:04Z","lastTransitionTime":"2025-10-12T05:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:04 crc kubenswrapper[4930]: I1012 05:42:04.281439 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:04 crc kubenswrapper[4930]: I1012 05:42:04.281541 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:04 crc kubenswrapper[4930]: I1012 05:42:04.281572 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:04 crc kubenswrapper[4930]: I1012 05:42:04.281612 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:04 crc kubenswrapper[4930]: I1012 05:42:04.281640 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:04Z","lastTransitionTime":"2025-10-12T05:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.320480 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.320578 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.320603 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.320632 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.320651 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:05Z","lastTransitionTime":"2025-10-12T05:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.324217 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.324522 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.324706 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.324958 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.325147 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:05Z","lastTransitionTime":"2025-10-12T05:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:05 crc kubenswrapper[4930]: E1012 05:42:05.349017 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:05Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.356003 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.356091 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.356115 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.356147 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.356172 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:05Z","lastTransitionTime":"2025-10-12T05:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:05 crc kubenswrapper[4930]: E1012 05:42:05.375836 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:05Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.381381 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.381447 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.381466 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.381498 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.381518 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:05Z","lastTransitionTime":"2025-10-12T05:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:05 crc kubenswrapper[4930]: E1012 05:42:05.403371 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:05Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.409625 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.409702 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.409721 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.409775 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.409797 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:05Z","lastTransitionTime":"2025-10-12T05:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:05 crc kubenswrapper[4930]: E1012 05:42:05.426995 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:05Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.433465 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.433528 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.433547 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.433577 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.433597 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:05Z","lastTransitionTime":"2025-10-12T05:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:05 crc kubenswrapper[4930]: E1012 05:42:05.455941 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[...],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:05Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:05 crc kubenswrapper[4930]: E1012 05:42:05.456170 4930 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.458506 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasSufficientMemory" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.458553 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.458571 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.458599 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.458616 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:05Z","lastTransitionTime":"2025-10-12T05:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.562835 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.562905 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.562921 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.562951 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.562971 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:05Z","lastTransitionTime":"2025-10-12T05:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.666981 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.667056 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.667075 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.667119 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.667146 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:05Z","lastTransitionTime":"2025-10-12T05:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.770593 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.770654 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.770671 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.770695 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.770713 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:05Z","lastTransitionTime":"2025-10-12T05:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.877337 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.877418 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.877441 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.877473 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.877501 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:05Z","lastTransitionTime":"2025-10-12T05:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.980646 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.980732 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.980797 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.980828 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:05 crc kubenswrapper[4930]: I1012 05:42:05.980852 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:05Z","lastTransitionTime":"2025-10-12T05:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.084306 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.084374 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.084392 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.084424 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.084445 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:06Z","lastTransitionTime":"2025-10-12T05:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.134704 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.134774 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.134816 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:06 crc kubenswrapper[4930]: E1012 05:42:06.135189 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.135358 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:06 crc kubenswrapper[4930]: E1012 05:42:06.135396 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:42:06 crc kubenswrapper[4930]: E1012 05:42:06.135497 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:42:06 crc kubenswrapper[4930]: E1012 05:42:06.135604 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.187901 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.187971 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.188036 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.188110 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.188181 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:06Z","lastTransitionTime":"2025-10-12T05:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.291692 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.291777 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.291795 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.291818 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.291835 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:06Z","lastTransitionTime":"2025-10-12T05:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.395407 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.395490 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.395507 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.395532 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.395552 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:06Z","lastTransitionTime":"2025-10-12T05:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.498369 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.498424 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.498441 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.498472 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.498489 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:06Z","lastTransitionTime":"2025-10-12T05:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.601803 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.601874 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.601894 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.601930 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.601970 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:06Z","lastTransitionTime":"2025-10-12T05:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.705159 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.705230 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.705253 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.705282 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.705302 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:06Z","lastTransitionTime":"2025-10-12T05:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.808870 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.808917 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.808934 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.808956 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.808974 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:06Z","lastTransitionTime":"2025-10-12T05:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.912108 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.912180 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.912204 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.912233 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:06 crc kubenswrapper[4930]: I1012 05:42:06.912256 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:06Z","lastTransitionTime":"2025-10-12T05:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.016574 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.016645 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.016663 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.016692 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.016711 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:07Z","lastTransitionTime":"2025-10-12T05:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.119420 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.119462 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.119471 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.119486 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.119496 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:07Z","lastTransitionTime":"2025-10-12T05:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.222600 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.222659 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.222677 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.222701 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.222717 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:07Z","lastTransitionTime":"2025-10-12T05:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.325941 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.326027 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.326050 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.326083 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.326108 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:07Z","lastTransitionTime":"2025-10-12T05:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.430236 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.430308 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.430325 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.430354 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.430373 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:07Z","lastTransitionTime":"2025-10-12T05:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.534229 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.534302 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.534321 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.534356 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.534379 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:07Z","lastTransitionTime":"2025-10-12T05:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.638254 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.638327 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.638349 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.638383 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.638419 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:07Z","lastTransitionTime":"2025-10-12T05:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.742254 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.742332 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.742359 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.742390 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.742411 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:07Z","lastTransitionTime":"2025-10-12T05:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.845870 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.845972 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.845993 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.846021 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.846042 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:07Z","lastTransitionTime":"2025-10-12T05:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.949934 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.950031 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.950049 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.950072 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:07 crc kubenswrapper[4930]: I1012 05:42:07.950090 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:07Z","lastTransitionTime":"2025-10-12T05:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.053186 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.053293 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.053317 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.053347 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.053383 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:08Z","lastTransitionTime":"2025-10-12T05:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.134495 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.134631 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.134495 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.134645 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:08 crc kubenswrapper[4930]: E1012 05:42:08.134874 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:42:08 crc kubenswrapper[4930]: E1012 05:42:08.135046 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:42:08 crc kubenswrapper[4930]: E1012 05:42:08.135200 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:42:08 crc kubenswrapper[4930]: E1012 05:42:08.136385 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.152918 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f56867a3-bcac-4478-b32f-0869b7e330a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39f8602e4dd7d746c23c7b303c631d40b2bc2c447acd1491b77be0895c1715a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f1df2fa5060e9f2922e06c412de9fe47f1dd0236b2bba1d81783dfd52a68d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebbbfef668586cda6fda820fbbba5432c59b621a94025e2dd6d7d0733a3dee09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecfafe319660c39bb6291f6fdb7c0dd529065b7db2e0dd52cbf8e42a56c58dd4\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecfafe319660c39bb6291f6fdb7c0dd529065b7db2e0dd52cbf8e42a56c58dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:08Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.155968 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.156050 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.156078 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.156111 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.156135 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:08Z","lastTransitionTime":"2025-10-12T05:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.178255 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:08Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.194351 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:08Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.209540 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:08Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.224161 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:08Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.255574 4930 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:08Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.259467 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.259673 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.259860 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.260049 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.260186 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:08Z","lastTransitionTime":"2025-10-12T05:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.271886 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:08Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.305391 4930 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3921189ea146b5643c2898b528d1e76b32effa405e5e61dc58db80200fb1223c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3921189ea146b5643c2898b528d1e76b32effa405e5e61dc58db80200fb1223c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:41:58Z\\\",\\\"message\\\":\\\"controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1012 05:41:58.201492 6569 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nI1012 05:41:58.201552 6569 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 16.497974ms\\\\nI1012 05:41:58.201573 6569 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1012 05:41:58.201644 6569 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1012 05:41:58.201741 6569 factory.go:1336] Added *v1.Node event handler 7\\\\nI1012 05:41:58.201827 6569 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1012 05:41:58.202337 6569 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1012 05:41:58.202433 6569 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1012 05:41:58.202487 6569 ovnkube.go:599] Stopped ovnkube\\\\nI1012 05:41:58.202539 6569 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1012 05:41:58.202635 6569 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mdhw6_openshift-ovn-kubernetes(0add0fa2-092f-4dcc-8c72-82881564bf63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:08Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.321435 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21ff03a-29e7-467c-93e2-45f3a6cef5af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df55e443aa226c07478c488ee54785f35fbb1d6285775e82abcbf924aaf304fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bae41878203d4a2974f46d284f4fc05e705fb9fc9b1a3e8159db00a4c9d70762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjsw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:08Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.335645 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cjzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dda08509-105f-4935-a8a9-ff852e73c3ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cjzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:08Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.353224 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:08Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.363902 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.363996 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.364017 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.364090 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.364111 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:08Z","lastTransitionTime":"2025-10-12T05:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.373921 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:08Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.393418 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:08Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.410796 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:08Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.424218 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:08Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.439778 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:08Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.458805 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:08Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.472790 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.472840 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.472849 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.472871 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.472884 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:08Z","lastTransitionTime":"2025-10-12T05:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.479200 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:08Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.575732 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 
05:42:08.575879 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.575900 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.575927 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.575947 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:08Z","lastTransitionTime":"2025-10-12T05:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.678454 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.678717 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.678916 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.679074 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.679213 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:08Z","lastTransitionTime":"2025-10-12T05:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.782259 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.782562 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.782575 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.782592 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.782605 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:08Z","lastTransitionTime":"2025-10-12T05:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.885719 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.885817 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.885835 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.885859 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.885877 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:08Z","lastTransitionTime":"2025-10-12T05:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.988898 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.988964 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.988988 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.989017 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:08 crc kubenswrapper[4930]: I1012 05:42:08.989037 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:08Z","lastTransitionTime":"2025-10-12T05:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.091942 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.092008 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.092027 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.092054 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.092073 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:09Z","lastTransitionTime":"2025-10-12T05:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.195415 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.195684 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.195884 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.196029 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.196173 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:09Z","lastTransitionTime":"2025-10-12T05:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.299494 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.299812 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.299832 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.299856 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.299873 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:09Z","lastTransitionTime":"2025-10-12T05:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.403238 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.403297 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.403316 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.403340 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.403358 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:09Z","lastTransitionTime":"2025-10-12T05:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.506475 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.506561 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.506585 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.506616 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.506637 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:09Z","lastTransitionTime":"2025-10-12T05:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.609925 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.609989 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.610006 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.610033 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.610052 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:09Z","lastTransitionTime":"2025-10-12T05:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.712994 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.713053 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.713071 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.713095 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.713112 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:09Z","lastTransitionTime":"2025-10-12T05:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.816494 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.816548 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.816564 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.816586 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.816602 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:09Z","lastTransitionTime":"2025-10-12T05:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.920105 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.920481 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.920639 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.920831 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:09 crc kubenswrapper[4930]: I1012 05:42:09.920987 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:09Z","lastTransitionTime":"2025-10-12T05:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:10 crc kubenswrapper[4930]: I1012 05:42:10.024690 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:10 crc kubenswrapper[4930]: I1012 05:42:10.025142 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:10 crc kubenswrapper[4930]: I1012 05:42:10.025419 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:10 crc kubenswrapper[4930]: I1012 05:42:10.025657 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:10 crc kubenswrapper[4930]: I1012 05:42:10.025932 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:10Z","lastTransitionTime":"2025-10-12T05:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:10 crc kubenswrapper[4930]: I1012 05:42:10.129263 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:10 crc kubenswrapper[4930]: I1012 05:42:10.129627 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:10 crc kubenswrapper[4930]: I1012 05:42:10.129801 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:10 crc kubenswrapper[4930]: I1012 05:42:10.129942 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:10 crc kubenswrapper[4930]: I1012 05:42:10.130091 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:10Z","lastTransitionTime":"2025-10-12T05:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:10 crc kubenswrapper[4930]: I1012 05:42:10.134803 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:42:10 crc kubenswrapper[4930]: I1012 05:42:10.134851 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:10 crc kubenswrapper[4930]: I1012 05:42:10.134876 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:10 crc kubenswrapper[4930]: E1012 05:42:10.134988 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:42:10 crc kubenswrapper[4930]: I1012 05:42:10.135142 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:10 crc kubenswrapper[4930]: E1012 05:42:10.135346 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:42:10 crc kubenswrapper[4930]: E1012 05:42:10.135411 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:42:10 crc kubenswrapper[4930]: E1012 05:42:10.136001 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:42:10 crc kubenswrapper[4930]: I1012 05:42:10.136417 4930 scope.go:117] "RemoveContainer" containerID="3921189ea146b5643c2898b528d1e76b32effa405e5e61dc58db80200fb1223c" Oct 12 05:42:10 crc kubenswrapper[4930]: E1012 05:42:10.137389 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-mdhw6_openshift-ovn-kubernetes(0add0fa2-092f-4dcc-8c72-82881564bf63)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" Oct 12 05:42:10 crc kubenswrapper[4930]: I1012 05:42:10.233370 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:10 crc kubenswrapper[4930]: I1012 05:42:10.233422 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:10 crc kubenswrapper[4930]: I1012 05:42:10.233440 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:10 crc kubenswrapper[4930]: I1012 05:42:10.233464 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:10 crc kubenswrapper[4930]: I1012 05:42:10.233480 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:10Z","lastTransitionTime":"2025-10-12T05:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:10 crc kubenswrapper[4930]: I1012 05:42:10.336464 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:10 crc kubenswrapper[4930]: I1012 05:42:10.336506 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:10 crc kubenswrapper[4930]: I1012 05:42:10.336523 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:10 crc kubenswrapper[4930]: I1012 05:42:10.336545 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:10 crc kubenswrapper[4930]: I1012 05:42:10.336561 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:10Z","lastTransitionTime":"2025-10-12T05:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 12 05:42:11 crc kubenswrapper[4930]: I1012 05:42:11.066812 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:42:11 crc kubenswrapper[4930]: I1012 05:42:11.066883 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:42:11 crc kubenswrapper[4930]: I1012 05:42:11.066900 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:42:11 crc kubenswrapper[4930]: I1012 05:42:11.066923 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:42:11 crc kubenswrapper[4930]: I1012 05:42:11.066939 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:11Z","lastTransitionTime":"2025-10-12T05:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 05:42:12 crc kubenswrapper[4930]: I1012 05:42:12.100812 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:42:12 crc kubenswrapper[4930]: I1012 05:42:12.100842 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:42:12 crc kubenswrapper[4930]: I1012 05:42:12.100851 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:42:12 crc kubenswrapper[4930]: I1012 05:42:12.100865 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:42:12 crc kubenswrapper[4930]: I1012 05:42:12.100874 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:12Z","lastTransitionTime":"2025-10-12T05:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 05:42:12 crc kubenswrapper[4930]: I1012 05:42:12.134900 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 12 05:42:12 crc kubenswrapper[4930]: E1012 05:42:12.134989 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 12 05:42:12 crc kubenswrapper[4930]: I1012 05:42:12.135063 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn"
Oct 12 05:42:12 crc kubenswrapper[4930]: I1012 05:42:12.135077 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 12 05:42:12 crc kubenswrapper[4930]: I1012 05:42:12.135298 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 12 05:42:12 crc kubenswrapper[4930]: E1012 05:42:12.135363 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce"
Oct 12 05:42:12 crc kubenswrapper[4930]: E1012 05:42:12.135562 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 12 05:42:12 crc kubenswrapper[4930]: E1012 05:42:12.135827 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 12 05:42:13 crc kubenswrapper[4930]: I1012 05:42:13.030426 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:42:13 crc kubenswrapper[4930]: I1012 05:42:13.030476 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:42:13 crc kubenswrapper[4930]: I1012 05:42:13.030490 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:42:13 crc kubenswrapper[4930]: I1012 05:42:13.030514 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:42:13 crc kubenswrapper[4930]: I1012 05:42:13.030528 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:13Z","lastTransitionTime":"2025-10-12T05:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 05:42:14 crc kubenswrapper[4930]: I1012 05:42:14.055883 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:42:14 crc kubenswrapper[4930]: I1012 05:42:14.055910 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:42:14 crc kubenswrapper[4930]: I1012 05:42:14.055919 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:42:14 crc kubenswrapper[4930]: I1012 05:42:14.055932 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:42:14 crc kubenswrapper[4930]: I1012 05:42:14.055941 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:14Z","lastTransitionTime":"2025-10-12T05:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 05:42:14 crc kubenswrapper[4930]: I1012 05:42:14.134644 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn"
Oct 12 05:42:14 crc kubenswrapper[4930]: E1012 05:42:14.134755 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce"
Oct 12 05:42:14 crc kubenswrapper[4930]: I1012 05:42:14.134786 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 12 05:42:14 crc kubenswrapper[4930]: I1012 05:42:14.134874 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 12 05:42:14 crc kubenswrapper[4930]: I1012 05:42:14.134902 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 12 05:42:14 crc kubenswrapper[4930]: E1012 05:42:14.135021 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 12 05:42:14 crc kubenswrapper[4930]: E1012 05:42:14.135058 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 12 05:42:14 crc kubenswrapper[4930]: E1012 05:42:14.135105 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.084641 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.084722 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.084775 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.084807 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.084830 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:15Z","lastTransitionTime":"2025-10-12T05:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?"} Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.292848 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.292920 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.292939 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.292965 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.292985 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:15Z","lastTransitionTime":"2025-10-12T05:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.396395 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.396457 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.396474 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.396498 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.396516 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:15Z","lastTransitionTime":"2025-10-12T05:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.500002 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.500073 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.500090 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.500114 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.500133 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:15Z","lastTransitionTime":"2025-10-12T05:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.603065 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.603118 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.603135 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.603158 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.603175 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:15Z","lastTransitionTime":"2025-10-12T05:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.705799 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.705862 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.705885 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.705911 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.705928 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:15Z","lastTransitionTime":"2025-10-12T05:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.717372 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.717432 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.717449 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.717473 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.717494 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:15Z","lastTransitionTime":"2025-10-12T05:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
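The cycles above show the kubelet re-recording the same four node conditions roughly every 100 ms while the Ready condition stays False: the container runtime network is not ready because no CNI configuration file exists in /etc/kubernetes/cni/net.d/. When triaging a journal like this, it can help to collapse the repetition into a tally of distinct conditions. A minimal Python sketch, assuming the journal has been exported as plain text with one entry per line (for example via journalctl -u kubelet); the script name and regex are illustrative, not part of the kubelet:

    #!/usr/bin/env python3
    """Tally repeated kubelet "Node became not ready" journal entries.

    Illustrative only: expects a plain-text journal with one entry per
    line, e.g. produced by `journalctl -u kubelet > kubelet.log`.
    """
    import re
    import sys
    from collections import Counter

    # Matches the condition payload that kubelet logs from setters.go.
    PATTERN = re.compile(
        r'"Node became not ready" node="(?P<node>[^"]+)" '
        r'condition=\{.*?"reason":"(?P<reason>[^"]+)","message":"(?P<message>[^"]+)"'
    )

    def summarize(lines):
        """Count distinct (node, reason, truncated message) triples."""
        counts = Counter()
        for line in lines:
            m = PATTERN.search(line)
            if m:
                counts[(m["node"], m["reason"], m["message"][:72])] += 1
        return counts

    if __name__ == "__main__":
        for (node, reason, message), n in summarize(sys.stdin).most_common():
            print(f"{n:5d}x node={node} reason={reason} message={message}")

Run as `python3 tally_not_ready.py < kubelet.log` (the file name is hypothetical); on this journal it would report a single (crc, KubeletNotReady, container runtime network not ready...) triple with the repeat count, rather than one line per occurrence.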
Oct 12 05:42:15 crc kubenswrapper[4930]: E1012 05:42:15.735451 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:15Z is after 2025-08-24T17:21:41Z"
Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.746346 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.746408 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
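The patch attempt above fails before it ever reaches the API server's storage: the node.network-node-identity.openshift.io validating webhook at https://127.0.0.1:9743 serves a TLS certificate that expired on 2025-08-24T17:21:41Z, while the journal's current time is 2025-10-12T05:42:15Z. A minimal Python sketch for confirming the served certificate's validity window from the affected node; it assumes Python 3 plus the third-party cryptography package (version 42 or later for the *_utc accessors), with host and port taken from the webhook URL in the log:

    #!/usr/bin/env python3
    """Print the validity window of the certificate served on a webhook port.

    Illustrative only: host and port come from the webhook URL in the log;
    requires the third-party `cryptography` package (>= 42).
    """
    import socket
    import ssl
    from datetime import datetime, timezone

    from cryptography import x509

    HOST, PORT = "127.0.0.1", 9743  # from https://127.0.0.1:9743/node?timeout=10s

    # Disable verification: the point is to read the expired certificate,
    # not to trust it.
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE

    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            der = tls.getpeercert(binary_form=True)  # raw DER, even unverified

    cert = x509.load_der_x509_certificate(der)
    now = datetime.now(timezone.utc)
    print("subject:  ", cert.subject.rfc4514_string())
    print("notBefore:", cert.not_valid_before_utc)
    print("notAfter: ", cert.not_valid_after_utc)
    print("expired:  ", now > cert.not_valid_after_utc)

The kubelet then retries the status patch, and the identical error repeats on each retry below.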
event="NodeHasNoDiskPressure" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.746426 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.746450 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.746470 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:15Z","lastTransitionTime":"2025-10-12T05:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:15 crc kubenswrapper[4930]: E1012 05:42:15.762404 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:15Z is after 2025-08-24T17:21:41Z"
Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.766517 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.766603 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
event="NodeHasNoDiskPressure" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.766621 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.766674 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.766690 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:15Z","lastTransitionTime":"2025-10-12T05:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:15 crc kubenswrapper[4930]: E1012 05:42:15.789633 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:15Z is after 2025-08-24T17:21:41Z"
Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.794731 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.794806 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
event="NodeHasNoDiskPressure" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.794825 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.794883 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.794903 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:15Z","lastTransitionTime":"2025-10-12T05:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:15 crc kubenswrapper[4930]: E1012 05:42:15.811939 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:15Z is after 2025-08-24T17:21:41Z"
Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.816390 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.816455 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
event="NodeHasNoDiskPressure" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.816475 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.816501 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.816519 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:15Z","lastTransitionTime":"2025-10-12T05:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:15 crc kubenswrapper[4930]: E1012 05:42:15.835885 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:15Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:15 crc kubenswrapper[4930]: E1012 05:42:15.836148 4930 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.838443 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.838509 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.838528 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.838553 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.838571 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:15Z","lastTransitionTime":"2025-10-12T05:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.941550 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.941620 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.941638 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.941663 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:15 crc kubenswrapper[4930]: I1012 05:42:15.941682 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:15Z","lastTransitionTime":"2025-10-12T05:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.045376 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.045449 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.045472 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.045502 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.045527 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:16Z","lastTransitionTime":"2025-10-12T05:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.107588 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs\") pod \"network-metrics-daemon-7cjzn\" (UID: \"dda08509-105f-4935-a8a9-ff852e73c3ce\") " pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:16 crc kubenswrapper[4930]: E1012 05:42:16.107818 4930 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 05:42:16 crc kubenswrapper[4930]: E1012 05:42:16.107897 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs podName:dda08509-105f-4935-a8a9-ff852e73c3ce nodeName:}" failed. No retries permitted until 2025-10-12 05:42:48.107875527 +0000 UTC m=+100.649977322 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs") pod "network-metrics-daemon-7cjzn" (UID: "dda08509-105f-4935-a8a9-ff852e73c3ce") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.135137 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.135150 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.135235 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.135261 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:16 crc kubenswrapper[4930]: E1012 05:42:16.135492 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:42:16 crc kubenswrapper[4930]: E1012 05:42:16.135762 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:42:16 crc kubenswrapper[4930]: E1012 05:42:16.135917 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:42:16 crc kubenswrapper[4930]: E1012 05:42:16.136079 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.148205 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.148255 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.148272 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.148295 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.148311 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:16Z","lastTransitionTime":"2025-10-12T05:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.251159 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.251224 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.251241 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.251268 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.251287 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:16Z","lastTransitionTime":"2025-10-12T05:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.354395 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.354466 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.354491 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.354517 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.354534 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:16Z","lastTransitionTime":"2025-10-12T05:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.456725 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.456813 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.456830 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.456854 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.456875 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:16Z","lastTransitionTime":"2025-10-12T05:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.560270 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.560331 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.560346 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.560370 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.560387 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:16Z","lastTransitionTime":"2025-10-12T05:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.662160 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.662204 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.662215 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.662232 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.662244 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:16Z","lastTransitionTime":"2025-10-12T05:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.765668 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.765772 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.765796 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.765823 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.765844 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:16Z","lastTransitionTime":"2025-10-12T05:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.868192 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.868231 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.868241 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.868255 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.868264 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:16Z","lastTransitionTime":"2025-10-12T05:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.970393 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.970432 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.970443 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.970457 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:16 crc kubenswrapper[4930]: I1012 05:42:16.970468 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:16Z","lastTransitionTime":"2025-10-12T05:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.072519 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.072582 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.072601 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.072628 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.072647 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:17Z","lastTransitionTime":"2025-10-12T05:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.175248 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.175285 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.175296 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.175309 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.175321 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:17Z","lastTransitionTime":"2025-10-12T05:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.278285 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.278333 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.278342 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.278352 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.278362 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:17Z","lastTransitionTime":"2025-10-12T05:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.381061 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.381122 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.381142 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.381167 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.381184 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:17Z","lastTransitionTime":"2025-10-12T05:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.483991 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.484047 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.484070 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.484094 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.484111 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:17Z","lastTransitionTime":"2025-10-12T05:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.586941 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.587002 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.587018 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.587041 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.587061 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:17Z","lastTransitionTime":"2025-10-12T05:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.656518 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tq29s_c1c3ae9e-26ae-418f-b261-eabc4302b332/kube-multus/0.log" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.656582 4930 generic.go:334] "Generic (PLEG): container finished" podID="c1c3ae9e-26ae-418f-b261-eabc4302b332" containerID="cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584" exitCode=1 Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.656643 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tq29s" event={"ID":"c1c3ae9e-26ae-418f-b261-eabc4302b332","Type":"ContainerDied","Data":"cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584"} Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.657070 4930 scope.go:117] "RemoveContainer" containerID="cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.680334 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:17Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.689726 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.689790 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.689803 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.689820 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.689831 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:17Z","lastTransitionTime":"2025-10-12T05:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.695095 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:17Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.711643 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:42:17Z\\\",\\\"message\\\":\\\"2025-10-12T05:41:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9dcab761-8478-4ed2-bae4-530da435f8cb\\\\n2025-10-12T05:41:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9dcab761-8478-4ed2-bae4-530da435f8cb to /host/opt/cni/bin/\\\\n2025-10-12T05:41:32Z [verbose] multus-daemon started\\\\n2025-10-12T05:41:32Z [verbose] Readiness Indicator file check\\\\n2025-10-12T05:42:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:17Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.722466 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:17Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.735211 4930 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f56867a3-bcac-4478-b32f-0869b7e330a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39f8602e4dd7d746c23c7b303c631d40b2bc2c447acd1491b77be0895c1715a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f1df2fa5060e9f2922e06c412de9fe47f1dd0236b2bba1d81783dfd52a68d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebbbfef668586cda6fda820fbbba5432c59b621a94025e2dd6d7d0733a3dee09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://ecfafe319660c39bb6291f6fdb7c0dd529065b7db2e0dd52cbf8e42a56c58dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecfafe319660c39bb6291f6fdb7c0dd529065b7db2e0dd52cbf8e42a56c58dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:17Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.745984 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:17Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.758655 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:17Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.767407 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cjzn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dda08509-105f-4935-a8a9-ff852e73c3ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cjzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:17Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.792090 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.792134 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.792146 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.792162 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.792174 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:17Z","lastTransitionTime":"2025-10-12T05:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.792225 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\
\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":
\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:17Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.813329 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:17Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.841823 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3921189ea146b5643c2898b528d1e76b32effa405e5e61dc58db80200fb1223c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3921189ea146b5643c2898b528d1e76b32effa405e5e61dc58db80200fb1223c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:41:58Z\\\",\\\"message\\\":\\\"controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1012 05:41:58.201492 6569 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nI1012 05:41:58.201552 6569 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 16.497974ms\\\\nI1012 05:41:58.201573 6569 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1012 05:41:58.201644 6569 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1012 05:41:58.201741 6569 factory.go:1336] Added *v1.Node event handler 7\\\\nI1012 05:41:58.201827 6569 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1012 05:41:58.202337 6569 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1012 05:41:58.202433 6569 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1012 05:41:58.202487 6569 ovnkube.go:599] Stopped ovnkube\\\\nI1012 05:41:58.202539 6569 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1012 05:41:58.202635 6569 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mdhw6_openshift-ovn-kubernetes(0add0fa2-092f-4dcc-8c72-82881564bf63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:17Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.859110 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21ff03a-29e7-467c-93e2-45f3a6cef5af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df55e443aa226c07478c488ee54785f35fbb1d6285775e82abcbf924aaf304fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bae41878203d4a2974f46d284f4fc05e705fb9fc9b1a3e8159db00a4c9d70762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjsw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:17Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.873763 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:17Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.887503 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:17Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.895284 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.895318 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.895330 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.895349 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.895364 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:17Z","lastTransitionTime":"2025-10-12T05:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.908957 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:17Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.926848 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:17Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.948530 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:17Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.963549 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:17Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.998773 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.998824 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.998834 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.998854 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:17 crc kubenswrapper[4930]: I1012 05:42:17.998862 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:17Z","lastTransitionTime":"2025-10-12T05:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.101135 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.101515 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.101671 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.101848 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.101997 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:18Z","lastTransitionTime":"2025-10-12T05:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.134506 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.134767 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.134848 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:18 crc kubenswrapper[4930]: E1012 05:42:18.134926 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.134592 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:18 crc kubenswrapper[4930]: E1012 05:42:18.135102 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:42:18 crc kubenswrapper[4930]: E1012 05:42:18.135176 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:42:18 crc kubenswrapper[4930]: E1012 05:42:18.136241 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.152529 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.166899 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.180154 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:42:17Z\\\",\\\"message\\\":\\\"2025-10-12T05:41:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9dcab761-8478-4ed2-bae4-530da435f8cb\\\\n2025-10-12T05:41:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9dcab761-8478-4ed2-bae4-530da435f8cb to /host/opt/cni/bin/\\\\n2025-10-12T05:41:32Z [verbose] multus-daemon started\\\\n2025-10-12T05:41:32Z [verbose] Readiness Indicator file check\\\\n2025-10-12T05:42:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.190933 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.204312 4930 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.204361 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.204378 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.204400 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.204418 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:18Z","lastTransitionTime":"2025-10-12T05:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.205084 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f56867a3-bcac-4478-b32f-0869b7e330a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39f8602e4dd7d746c23c7b303c631d40b2bc2c447acd1491b77be0895c1715a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f1df2fa5060e9f2922e06c412de9fe47f1dd0236b2bba1d81783dfd52a68d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebbbfef668586cda6fda820fbbba5432c59b621a94025e2dd6d7d0733a3dee09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecfafe319660c39bb6291f6fdb7c0dd529065b7db2e0dd52cbf8e42a56c58dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecfafe319660c39bb6291f6fdb7c0dd529065b7db2e0dd52cbf8e42a56c58dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.254167 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.283358 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3921189ea146b5643c2898b528d1e76b32effa405e5e61dc58db80200fb1223c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3921189ea146b5643c2898b528d1e76b32effa405e5e61dc58db80200fb1223c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:41:58Z\\\",\\\"message\\\":\\\"controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1012 05:41:58.201492 6569 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nI1012 05:41:58.201552 6569 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 16.497974ms\\\\nI1012 05:41:58.201573 6569 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1012 05:41:58.201644 6569 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1012 05:41:58.201741 6569 factory.go:1336] Added *v1.Node event handler 7\\\\nI1012 05:41:58.201827 6569 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1012 05:41:58.202337 6569 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1012 05:41:58.202433 6569 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1012 05:41:58.202487 6569 ovnkube.go:599] Stopped ovnkube\\\\nI1012 05:41:58.202539 6569 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1012 05:41:58.202635 6569 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mdhw6_openshift-ovn-kubernetes(0add0fa2-092f-4dcc-8c72-82881564bf63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.300632 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21ff03a-29e7-467c-93e2-45f3a6cef5af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df55e443aa226c07478c488ee54785f35fbb1d6285775e82abcbf924aaf304fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bae41878203d4a2974f46d284f4fc05e705fb9fc9b1a3e8159db00a4c9d70762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjsw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.307472 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.307535 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.307550 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.307579 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.307598 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:18Z","lastTransitionTime":"2025-10-12T05:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.315923 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cjzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dda08509-105f-4935-a8a9-ff852e73c3ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cjzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.348948 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943
ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.364603 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.382626 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.398343 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.411001 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.411046 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.411063 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.411088 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.411106 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:18Z","lastTransitionTime":"2025-10-12T05:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.414817 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc 
kubenswrapper[4930]: I1012 05:42:18.428208 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.455087 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.471947 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/
ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.487633 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-m
anager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.514381 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.514449 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.514469 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.514499 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.514517 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:18Z","lastTransitionTime":"2025-10-12T05:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.617786 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.617837 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.617850 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.617873 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.617888 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:18Z","lastTransitionTime":"2025-10-12T05:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.663181 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tq29s_c1c3ae9e-26ae-418f-b261-eabc4302b332/kube-multus/0.log" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.663254 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tq29s" event={"ID":"c1c3ae9e-26ae-418f-b261-eabc4302b332","Type":"ContainerStarted","Data":"6697bf685228a837d46450ea695e9579b3bda51686b6b5bcd8338e66757476cd"} Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.680240 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.701223 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
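The status patch kubelet tried to send is embedded in each journal entry as a doubly-escaped JSON string between failed to patch status \" and \" for pod. A sketch for recovering it as structured data, assuming the journal has been saved with one entry per line (the helper name and file handling are illustrative):

    import json
    import re

    PATCH_RE = re.compile(r'failed to patch status \\"(.*?)\\" for pod')

    def extract_patch(entry: str) -> dict:
        """Recover the pod-status patch from one 'Failed to update status' entry."""
        m = PATCH_RE.search(entry)
        if not m:
            raise ValueError("no status patch in this entry")
        s = m.group(1)
        # Two levels of quoting were applied when the patch was logged
        # (klog quotes err="...", which already contained a quoted string),
        # so undo two rounds of backslash escaping before parsing.
        s = s.encode().decode("unicode_escape")
        s = s.encode().decode("unicode_escape")
        return json.loads(s)

Applied to the network-check-target-xd92c entry above, this yields the conditions and containerStatuses blocks as plain JSON: restartCount 3 and state.waiting.reason ContainerCreating.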
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.718470 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.722259 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.722492 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.722648 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.722845 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.723050 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:18Z","lastTransitionTime":"2025-10-12T05:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.735191 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc 
kubenswrapper[4930]: I1012 05:42:18.752549 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.770243 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.808367 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/
ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.826583 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.826627 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.826644 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.826671 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.826689 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:18Z","lastTransitionTime":"2025-10-12T05:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
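Alongside the webhook failures, kubelet keeps re-recording NodeNotReady because no CNI configuration file exists in /etc/kubernetes/cni/net.d/, and the multus entries further down show it waiting for ovn-kubernetes to write its readiness indicator. A sketch of the corresponding filesystem check run directly on the node (the /host prefix in the multus message is that daemonset's mount of the host root, so the host-side path assumed here drops it):

    from pathlib import Path

    # Paths taken from the log entries.
    cni_dir = Path("/etc/kubernetes/cni/net.d")                       # kubelet's CNI conf dir
    indicator = Path("/run/multus/cni/net.d/10-ovn-kubernetes.conf")  # multus readiness file

    configs = sorted(p.name for p in cni_dir.glob("*")) if cni_dir.is_dir() else []
    print("CNI configs:", configs or "none")
    print("ovn-kubernetes readiness indicator:", "present" if indicator.exists() else "missing")

While both come up empty, the Ready condition stays False with reason KubeletNotReady, as the repeated setters.go entries record.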
Has your network provider started?"} Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.830580 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.864594 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.879517 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.891518 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6697bf685228a837d46450ea695e9579b3bda51686b6b5bcd8338e66757476cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:42:17Z\\\",\\\"message\\\":\\\"2025-10-12T05:41:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9dcab761-8478-4ed2-bae4-530da435f8cb\\\\n2025-10-12T05:41:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9dcab761-8478-4ed2-bae4-530da435f8cb to /host/opt/cni/bin/\\\\n2025-10-12T05:41:32Z [verbose] multus-daemon started\\\\n2025-10-12T05:41:32Z [verbose] Readiness Indicator file check\\\\n2025-10-12T05:42:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
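By this point the same few signatures repeat for every pod on the node. A quick tally over a saved journal dump separates the one cause (the expired webhook certificate) from its symptoms (rejected status patches, NotReady, the multus restart); the file name is hypothetical:

    import re
    from collections import Counter

    SIGNATURES = {
        "webhook cert expired":      re.compile(r"certificate has expired"),
        "status patch rejected":     re.compile(r"Failed to update status for pod"),
        "NotReady: no CNI config":   re.compile(r"no CNI configuration file"),
        "multus awaiting indicator": re.compile(r"readinessindicatorfile"),
    }

    counts = Counter()
    with open("kubelet-journal.log") as fh:  # hypothetical export of this journal
        for line in fh:
            for name, pat in SIGNATURES.items():
                if pat.search(line):
                    counts[name] += 1

    for name, n in counts.most_common():
        print(f"{n:6d}  {name}")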
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.902837 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.913075 4930 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f56867a3-bcac-4478-b32f-0869b7e330a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39f8602e4dd7d746c23c7b303c631d40b2bc2c447acd1491b77be0895c1715a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f1df2fa5060e9f2922e06c412de9fe47f1dd0236b2bba1d81783dfd52a68d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebbbfef668586cda6fda820fbbba5432c59b621a94025e2dd6d7d0733a3dee09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://ecfafe319660c39bb6291f6fdb7c0dd529065b7db2e0dd52cbf8e42a56c58dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecfafe319660c39bb6291f6fdb7c0dd529065b7db2e0dd52cbf8e42a56c58dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.925677 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0d
dcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-
binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.929829 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.929860 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.929869 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.929887 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.929896 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:18Z","lastTransitionTime":"2025-10-12T05:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.946676 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3921189ea146b5643c2898b528d1e76b32effa405e5e61dc58db80200fb1223c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3921189ea146b5643c2898b528d1e76b32effa405e5e61dc58db80200fb1223c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:41:58Z\\\",\\\"message\\\":\\\"controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1012 05:41:58.201492 6569 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nI1012 05:41:58.201552 6569 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 16.497974ms\\\\nI1012 05:41:58.201573 6569 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1012 05:41:58.201644 6569 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1012 05:41:58.201741 6569 factory.go:1336] Added *v1.Node event handler 7\\\\nI1012 05:41:58.201827 6569 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1012 05:41:58.202337 6569 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1012 05:41:58.202433 6569 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1012 05:41:58.202487 6569 ovnkube.go:599] Stopped ovnkube\\\\nI1012 05:41:58.202539 6569 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1012 05:41:58.202635 6569 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed 
container=ovnkube-controller pod=ovnkube-node-mdhw6_openshift-ovn-kubernetes(0add0fa2-092f-4dcc-8c72-82881564bf63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.965515 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21ff03a-29e7-467c-93e2-45f3a6cef5af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df55e443aa226c07478c488ee54785f35fbb1d6285775e82abcbf924aaf304fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bae41878203d4a2974f46d284f4fc05e705fb9fc9b1a3e8159db00a4c9d70762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjsw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:18 crc kubenswrapper[4930]: I1012 05:42:18.980416 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cjzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dda08509-105f-4935-a8a9-ff852e73c3ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cjzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:18Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.003351 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943
ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:19Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.032601 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.032652 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.032664 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.032680 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.032689 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:19Z","lastTransitionTime":"2025-10-12T05:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.136014 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.136513 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.136640 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.136781 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.136872 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:19Z","lastTransitionTime":"2025-10-12T05:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.239531 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.239599 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.239617 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.239643 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.239661 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:19Z","lastTransitionTime":"2025-10-12T05:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.343518 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.343586 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.343603 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.343630 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.343649 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:19Z","lastTransitionTime":"2025-10-12T05:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.447425 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.447469 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.447486 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.447511 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.447526 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:19Z","lastTransitionTime":"2025-10-12T05:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.551969 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.552039 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.552051 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.552073 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.552084 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:19Z","lastTransitionTime":"2025-10-12T05:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.655821 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.655877 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.655895 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.655924 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.655946 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:19Z","lastTransitionTime":"2025-10-12T05:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.759230 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.759275 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.759292 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.759312 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.759329 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:19Z","lastTransitionTime":"2025-10-12T05:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.862616 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.862667 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.862686 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.862707 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.862724 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:19Z","lastTransitionTime":"2025-10-12T05:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.965769 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.965852 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.965880 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.965914 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:19 crc kubenswrapper[4930]: I1012 05:42:19.965936 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:19Z","lastTransitionTime":"2025-10-12T05:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:20 crc kubenswrapper[4930]: I1012 05:42:20.069497 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:20 crc kubenswrapper[4930]: I1012 05:42:20.069543 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:20 crc kubenswrapper[4930]: I1012 05:42:20.069555 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:20 crc kubenswrapper[4930]: I1012 05:42:20.069572 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:20 crc kubenswrapper[4930]: I1012 05:42:20.069584 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:20Z","lastTransitionTime":"2025-10-12T05:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:20 crc kubenswrapper[4930]: I1012 05:42:20.134859 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:20 crc kubenswrapper[4930]: I1012 05:42:20.134936 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:42:20 crc kubenswrapper[4930]: E1012 05:42:20.135007 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:42:20 crc kubenswrapper[4930]: I1012 05:42:20.135084 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:20 crc kubenswrapper[4930]: I1012 05:42:20.135170 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:20 crc kubenswrapper[4930]: E1012 05:42:20.135230 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:42:20 crc kubenswrapper[4930]: E1012 05:42:20.135381 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:42:20 crc kubenswrapper[4930]: E1012 05:42:20.135605 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:42:20 crc kubenswrapper[4930]: I1012 05:42:20.171532 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:20 crc kubenswrapper[4930]: I1012 05:42:20.171571 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:20 crc kubenswrapper[4930]: I1012 05:42:20.171584 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:20 crc kubenswrapper[4930]: I1012 05:42:20.171598 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:20 crc kubenswrapper[4930]: I1012 05:42:20.171608 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:20Z","lastTransitionTime":"2025-10-12T05:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:20 crc kubenswrapper[4930]: I1012 05:42:20.273888 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:20 crc kubenswrapper[4930]: I1012 05:42:20.273978 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:20 crc kubenswrapper[4930]: I1012 05:42:20.273988 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:20 crc kubenswrapper[4930]: I1012 05:42:20.274008 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:20 crc kubenswrapper[4930]: I1012 05:42:20.274020 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:20Z","lastTransitionTime":"2025-10-12T05:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 12 05:42:20 crc kubenswrapper[4930]: I1012 05:42:20.376475 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:42:20 crc kubenswrapper[4930]: I1012 05:42:20.376529 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:42:20 crc kubenswrapper[4930]: I1012 05:42:20.376583 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:42:20 crc kubenswrapper[4930]: I1012 05:42:20.376612 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:42:20 crc kubenswrapper[4930]: I1012 05:42:20.376630 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:20Z","lastTransitionTime":"2025-10-12T05:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[the same five-entry node-status block (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady, and the identical "Node became not ready" KubeletNotReady condition) repeats at roughly 100 ms intervals from 05:42:20.478 through 05:42:22.030]
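Every cycle of the block above has the same root cause, spelled out in the "Node became not ready" condition: the kubelet's network plugin finds no CNI configuration file in /etc/kubernetes/cni/net.d/, so it keeps reporting NetworkReady=false until the network provider writes one there. Below is a minimal standalone sketch of that readiness test; this is illustrative Python, not the kubelet's actual Go implementation, and the extension list (*.conf, *.conflist, *.json) is the conventional CNI one, assumed here rather than taken from this log.

```python
#!/usr/bin/env python3
"""Sketch: the test the kubelet keeps failing above -- is there any
CNI network config in /etc/kubernetes/cni/net.d/?  Illustrative only."""
import glob
import os

CNI_CONF_DIR = "/etc/kubernetes/cni/net.d"  # directory named in the log above

def find_cni_configs(conf_dir=CNI_CONF_DIR):
    """Return candidate CNI config files, lexicographically sorted
    (CNI loaders pick the first match)."""
    files = []
    for pattern in ("*.conf", "*.conflist", "*.json"):  # assumed conventional extensions
        files.extend(glob.glob(os.path.join(conf_dir, pattern)))
    return sorted(files)

if __name__ == "__main__":
    configs = find_cni_configs()
    if not configs:
        # Mirrors the condition driving the NodeNotReady loop above.
        print(f"NetworkReady=false: no CNI configuration file in {CNI_CONF_DIR}/")
    else:
        print(f"NetworkReady=true: would use {configs[0]}")
```

Run on the node while the loop above is active, this prints the same "no CNI configuration file" verdict; once the network provider (here OVN-Kubernetes, whose ovnkube-controller container starts later in this log) drops its config into the directory, the check is expected to flip to ready.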
Oct 12 05:42:22 crc kubenswrapper[4930]: I1012 05:42:22.134544 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:42:22 crc kubenswrapper[4930]: I1012 05:42:22.134584 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 12 05:42:22 crc kubenswrapper[4930]: I1012 05:42:22.134612 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 12 05:42:22 crc kubenswrapper[4930]: I1012 05:42:22.134593 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:42:22 crc kubenswrapper[4930]: I1012 05:42:22.134789 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 12 05:42:22 crc kubenswrapper[4930]: I1012 05:42:22.134818 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:42:22 crc kubenswrapper[4930]: I1012 05:42:22.134953 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:42:22 crc kubenswrapper[4930]: I1012 05:42:22.134972 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:22Z","lastTransitionTime":"2025-10-12T05:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 05:42:22 crc kubenswrapper[4930]: I1012 05:42:22.136017 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn"
Oct 12 05:42:22 crc kubenswrapper[4930]: E1012 05:42:22.141915 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 12 05:42:22 crc kubenswrapper[4930]: E1012 05:42:22.142452 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 12 05:42:22 crc kubenswrapper[4930]: E1012 05:42:22.143063 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce"
Oct 12 05:42:22 crc kubenswrapper[4930]: E1012 05:42:22.143495 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
[the node-status block repeats at roughly 100 ms intervals from 05:42:22.240 through 05:42:24.108]
Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.134634 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.134726 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 12 05:42:24 crc kubenswrapper[4930]: E1012 05:42:24.134841 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.134892 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.134987 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn"
Oct 12 05:42:24 crc kubenswrapper[4930]: E1012 05:42:24.135049 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 12 05:42:24 crc kubenswrapper[4930]: E1012 05:42:24.135386 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 12 05:42:24 crc kubenswrapper[4930]: E1012 05:42:24.135616 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce"
Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.136780 4930 scope.go:117] "RemoveContainer" containerID="3921189ea146b5643c2898b528d1e76b32effa405e5e61dc58db80200fb1223c"
[the node-status block repeats at roughly 100 ms intervals from 05:42:24.211 through 05:42:24.625]
Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.687591 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mdhw6_0add0fa2-092f-4dcc-8c72-82881564bf63/ovnkube-controller/2.log"
Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.690621 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerStarted","Data":"3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7"}
Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.691107 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6"
Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.703346 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:24Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.714918 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:24Z is after 2025-08-24T17:21:41Z"
[the five-entry node-status block repeats once more at 05:42:24.728]
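Note the actual failure behind each of these "Failed to update status for pod" entries: the kubelet reaches the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743, but the webhook's serving certificate expired long before this boot. A quick sketch, pure standard library, that pulls the two timestamps out of the error text above and quantifies the gap:

```python
#!/usr/bin/env python3
"""Sketch: measure how stale the webhook serving certificate is,
using the x509 error text from the status_manager.go entries above."""
import re
from datetime import datetime

ERR = ("x509: certificate has expired or is not yet valid: "
       "current time 2025-10-12T05:42:24Z is after 2025-08-24T17:21:41Z")

m = re.search(r"current time (\S+) is after (\S+)", ERR)
fmt = "%Y-%m-%dT%H:%M:%SZ"
now, not_after = (datetime.strptime(t, fmt) for t in m.groups())
delta = now - not_after
print(f"cert expired {delta.days} days, {delta.seconds // 3600} h before this attempt")
# -> cert expired 48 days, 12 h before this attempt
```

On a CRC VM this is the typical state when the bundled cluster image is older than its embedded certificates; the patch failures would normally be expected to clear once the cluster rotates its internal certificates after start-up.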
Has your network provider started?"} Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.734669 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:24Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.758326 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:24Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.777300 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6697bf685228a837d46450ea695e9579b3bda51686b6b5bcd8338e66757476cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:42:17Z\\\",\\\"message\\\":\\\"2025-10-12T05:41:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9dcab761-8478-4ed2-bae4-530da435f8cb\\\\n2025-10-12T05:41:31+00:00 [cnibincopy] 
Successfully moved files in /host/opt/cni/bin/upgrade_9dcab761-8478-4ed2-bae4-530da435f8cb to /host/opt/cni/bin/\\\\n2025-10-12T05:41:32Z [verbose] multus-daemon started\\\\n2025-10-12T05:41:32Z [verbose] Readiness Indicator file check\\\\n2025-10-12T05:42:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:24Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.791404 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:24Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.805381 4930 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f56867a3-bcac-4478-b32f-0869b7e330a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39f8602e4dd7d746c23c7b303c631d40b2bc2c447acd1491b77be0895c1715a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f1df2fa5060e9f2922e06c412de9fe47f1dd0236b2bba1d81783dfd52a68d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebbbfef668586cda6fda820fbbba5432c59b621a94025e2dd6d7d0733a3dee09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://ecfafe319660c39bb6291f6fdb7c0dd529065b7db2e0dd52cbf8e42a56c58dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecfafe319660c39bb6291f6fdb7c0dd529065b7db2e0dd52cbf8e42a56c58dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:24Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.830917 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0d
dcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-
binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:24Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.831509 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.831559 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.831576 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.831597 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.831612 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:24Z","lastTransitionTime":"2025-10-12T05:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.858812 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3921189ea146b5643c2898b528d1e76b32effa405e5e61dc58db80200fb1223c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:41:58Z\\\",\\\"message\\\":\\\"controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1012 05:41:58.201492 6569 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nI1012 05:41:58.201552 6569 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 16.497974ms\\\\nI1012 05:41:58.201573 6569 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1012 05:41:58.201644 6569 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1012 05:41:58.201741 6569 factory.go:1336] Added *v1.Node event handler 7\\\\nI1012 05:41:58.201827 6569 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1012 05:41:58.202337 6569 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1012 05:41:58.202433 6569 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1012 05:41:58.202487 6569 ovnkube.go:599] Stopped ovnkube\\\\nI1012 05:41:58.202539 6569 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1012 05:41:58.202635 6569 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:24Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.875469 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21ff03a-29e7-467c-93e2-45f3a6cef5af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df55e443aa226c07478c488ee54785f35fbb1d6285775e82abcbf924aaf304fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bae41878203d4a2974f46d284f4fc05e705fb9fc9b1a3e8159db00a4c9d70762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjsw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:24Z is after 2025-08-24T17:21:41Z" Oct 12 
05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.887784 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cjzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dda08509-105f-4935-a8a9-ff852e73c3ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cjzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:24Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.916190 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943
ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:24Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.932094 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:24Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.934048 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.934075 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.934085 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.934099 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.934109 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:24Z","lastTransitionTime":"2025-10-12T05:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.947293 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:24Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.961882 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:24Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.975305 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:24Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:24 crc kubenswrapper[4930]: I1012 05:42:24.989381 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:24Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.006639 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:25Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.037004 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.037068 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.037086 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.037111 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.037130 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:25Z","lastTransitionTime":"2025-10-12T05:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.140370 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.140423 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.140440 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.140462 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.140480 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:25Z","lastTransitionTime":"2025-10-12T05:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.243872 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.243951 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.243970 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.243995 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.244014 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:25Z","lastTransitionTime":"2025-10-12T05:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.347361 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.347425 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.347442 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.347466 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.347485 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:25Z","lastTransitionTime":"2025-10-12T05:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.450897 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.450965 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.450982 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.451006 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.451027 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:25Z","lastTransitionTime":"2025-10-12T05:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.554413 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.554494 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.554517 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.554549 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.554578 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:25Z","lastTransitionTime":"2025-10-12T05:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.658083 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.658383 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.658514 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.658666 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.658864 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:25Z","lastTransitionTime":"2025-10-12T05:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.698229 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mdhw6_0add0fa2-092f-4dcc-8c72-82881564bf63/ovnkube-controller/3.log" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.700115 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mdhw6_0add0fa2-092f-4dcc-8c72-82881564bf63/ovnkube-controller/2.log" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.705713 4930 generic.go:334] "Generic (PLEG): container finished" podID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerID="3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7" exitCode=1 Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.705803 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerDied","Data":"3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7"} Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.705891 4930 scope.go:117] "RemoveContainer" containerID="3921189ea146b5643c2898b528d1e76b32effa405e5e61dc58db80200fb1223c" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.714570 4930 scope.go:117] "RemoveContainer" containerID="3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7" Oct 12 05:42:25 crc kubenswrapper[4930]: E1012 05:42:25.714986 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mdhw6_openshift-ovn-kubernetes(0add0fa2-092f-4dcc-8c72-82881564bf63)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.748765 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943
ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:25Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.763405 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.763476 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.763493 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.763519 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.763538 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:25Z","lastTransitionTime":"2025-10-12T05:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.775208 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:25Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.807482 4930 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3921189ea146b5643c2898b528d1e76b32effa405e5e61dc58db80200fb1223c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:41:58Z\\\",\\\"message\\\":\\\"controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1012 05:41:58.201492 6569 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}\\\\nI1012 05:41:58.201552 6569 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 16.497974ms\\\\nI1012 05:41:58.201573 6569 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1012 05:41:58.201644 6569 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1012 05:41:58.201741 6569 factory.go:1336] Added *v1.Node event handler 7\\\\nI1012 05:41:58.201827 6569 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1012 05:41:58.202337 6569 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1012 05:41:58.202433 6569 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1012 05:41:58.202487 6569 ovnkube.go:599] Stopped ovnkube\\\\nI1012 05:41:58.202539 6569 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1012 05:41:58.202635 6569 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"message\\\":\\\"-source-55646444c4-trplf 
uuid:960d98b2-dc64-4e93-a4b6-9b19847af71e logicalSwitch:crc ips:[0xc008e1e3c0] mac:[10 88 10 217 0 59] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.59/23] and MAC: 0a:58:0a:d9:00:3b\\\\nI1012 05:42:25.153951 6920 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" but failed to find it\\\\nI1012 05:42:25.155495 6920 ovnkube_controller.go:804] Add Logical Switch Port event expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" in cache\\\\nI1012 05:42:25.155565 6920 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\\\\\" but failed to find it\\\\nI1012 05:42:25.155628 6920 ovnkube_controller.go:804] Add Logical Switch Port event expected pod with UID \\\\\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\\\\\" in cache\\\\nI1012 05:42:25.155704 6920 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] C\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:42:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0a
c812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:25Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.825463 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21ff03a-29e7-467c-93e2-45f3a6cef5af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df55e443aa226c07478c488ee54785f35fbb1d6285775e82abcbf924aaf304fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bae41878203d4a2974f46d284f4fc05e705fb9fc9b1a3e8159db00a4c9d70762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjsw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:25Z is after 2025-08-24T17:21:41Z" Oct 12 
05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.842732 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cjzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dda08509-105f-4935-a8a9-ff852e73c3ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cjzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:25Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.863029 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.863148 4930 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.863171 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.863230 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.863248 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:25Z","lastTransitionTime":"2025-10-12T05:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.863345 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-10-12T05:42:25Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:25 crc kubenswrapper[4930]: E1012 05:42:25.885468 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:25Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.889369 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:25Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.892197 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.892292 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.892311 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.892338 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.892356 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:25Z","lastTransitionTime":"2025-10-12T05:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.909195 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:25Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:25 crc kubenswrapper[4930]: E1012 05:42:25.913899 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"4
2c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:25Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.919012 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.919091 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.919112 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.919142 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.919161 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:25Z","lastTransitionTime":"2025-10-12T05:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.932723 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:25Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:25 crc kubenswrapper[4930]: E1012 05:42:25.939173 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:25Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.943499 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.943718 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.943894 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.944032 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.944153 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:25Z","lastTransitionTime":"2025-10-12T05:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.952644 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:25Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:25 crc kubenswrapper[4930]: E1012 05:42:25.963610 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:25Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.968974 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.969059 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.969086 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.969118 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.969141 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:25Z","lastTransitionTime":"2025-10-12T05:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.970149 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:25Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:25 crc kubenswrapper[4930]: E1012 05:42:25.990472 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:25Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:25 crc kubenswrapper[4930]: E1012 05:42:25.991092 4930 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.993987 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.994050 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.994069 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.994097 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.994117 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:25Z","lastTransitionTime":"2025-10-12T05:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:25 crc kubenswrapper[4930]: I1012 05:42:25.994727 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:25Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.014827 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:26Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.036606 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:26Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.057479 4930 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f56867a3-bcac-4478-b32f-0869b7e330a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39f8602e4dd7d746c23c7b303c631d40b2bc2c447acd1491b77be0895c1715a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f1df2fa5060e9f2922e06c412de9fe47f1dd0236b2bba1d81783dfd52a68d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebbbfef668586cda6fda820fbbba5432c59b621a94025e2dd6d7d0733a3dee09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://ecfafe319660c39bb6291f6fdb7c0dd529065b7db2e0dd52cbf8e42a56c58dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecfafe319660c39bb6291f6fdb7c0dd529065b7db2e0dd52cbf8e42a56c58dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:26Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.080613 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:26Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.097631 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.097859 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.098000 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.098165 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.098321 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:26Z","lastTransitionTime":"2025-10-12T05:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.104244 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:26Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.128806 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6697bf685228a837d46450ea695e9579b3bda51686b6b5bcd8338e66757476cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:42:17Z\\\",\\\"message\\\":\\\"2025-10-12T05:41:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9dcab761-8478-4ed2-bae4-530da435f8cb\\\\n2025-10-12T05:41:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9dcab761-8478-4ed2-bae4-530da435f8cb to /host/opt/cni/bin/\\\\n2025-10-12T05:41:32Z [verbose] multus-daemon started\\\\n2025-10-12T05:41:32Z [verbose] Readiness Indicator file check\\\\n2025-10-12T05:42:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:26Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.135275 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.135320 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.135282 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.135422 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:42:26 crc kubenswrapper[4930]: E1012 05:42:26.135578 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:42:26 crc kubenswrapper[4930]: E1012 05:42:26.135794 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:42:26 crc kubenswrapper[4930]: E1012 05:42:26.135904 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:42:26 crc kubenswrapper[4930]: E1012 05:42:26.136089 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.208599 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.208665 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.208684 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.208709 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.208728 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:26Z","lastTransitionTime":"2025-10-12T05:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.312584 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.312650 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.312671 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.312699 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.312720 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:26Z","lastTransitionTime":"2025-10-12T05:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.416179 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.416239 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.416255 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.416282 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.416303 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:26Z","lastTransitionTime":"2025-10-12T05:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.519627 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.519686 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.519705 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.519764 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.519786 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:26Z","lastTransitionTime":"2025-10-12T05:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.622572 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.622657 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.622687 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.622725 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.622793 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:26Z","lastTransitionTime":"2025-10-12T05:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.714533 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mdhw6_0add0fa2-092f-4dcc-8c72-82881564bf63/ovnkube-controller/3.log" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.720347 4930 scope.go:117] "RemoveContainer" containerID="3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7" Oct 12 05:42:26 crc kubenswrapper[4930]: E1012 05:42:26.720591 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mdhw6_openshift-ovn-kubernetes(0add0fa2-092f-4dcc-8c72-82881564bf63)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.725803 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.725878 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.725899 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.725928 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.725947 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:26Z","lastTransitionTime":"2025-10-12T05:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.743883 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:26Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.765014 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:26Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.784050 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f56867a3-bcac-4478-b32f-0869b7e330a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39f8602e4dd7d746c23c7b303c631d40b2bc2c447acd1491b77be0895c1715a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f1df2fa5060e9f2922e06c412de9fe47f1dd0236b2bba1d81783dfd52a68d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebbbfef668586cda6fda820fbbba5432c59b621a94025e2dd6d7d0733a3dee09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecfafe319660c39bb6291f6fdb7c0dd529065b7db2e0dd52cbf8e42a56c58dd4\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecfafe319660c39bb6291f6fdb7c0dd529065b7db2e0dd52cbf8e42a56c58dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:26Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.805285 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:26Z is after 
2025-08-24T17:21:41Z" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.826417 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:26Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.829361 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.829413 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.829493 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.829571 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.829595 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:26Z","lastTransitionTime":"2025-10-12T05:42:26Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.851573 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6697bf685228a837d46450ea695e9579b3bda51686b6b5bcd8338e66757476cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:42:17Z\\\",\\\"message\\\":\\\"2025-10-12T05:41:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9dcab761-8478-4ed2-bae4-530da435f8cb\\\\n2025-10-12T05:41:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9dcab761-8478-4ed2-bae4-530da435f8cb to /host/opt/cni/bin/\\\\n2025-10-12T05:41:32Z [verbose] multus-daemon started\\\\n2025-10-12T05:41:32Z [verbose] Readiness Indicator file check\\\\n2025-10-12T05:42:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:26Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.871466 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:26Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.906659 4930 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:26Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.932578 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:26Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.933775 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.933848 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:26 crc 
kubenswrapper[4930]: I1012 05:42:26.933868 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.933901 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.933925 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:26Z","lastTransitionTime":"2025-10-12T05:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.966872 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d13c51f393f32e647751366764fe45c6eb3dd58
6ba90fc528fce23cae8b16b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"message\\\":\\\"-source-55646444c4-trplf uuid:960d98b2-dc64-4e93-a4b6-9b19847af71e logicalSwitch:crc ips:[0xc008e1e3c0] mac:[10 88 10 217 0 59] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.59/23] and MAC: 0a:58:0a:d9:00:3b\\\\nI1012 05:42:25.153951 6920 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" but failed to find it\\\\nI1012 05:42:25.155495 6920 ovnkube_controller.go:804] Add Logical Switch Port event expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" in cache\\\\nI1012 05:42:25.155565 6920 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\\\\\" but failed to find it\\\\nI1012 05:42:25.155628 6920 ovnkube_controller.go:804] Add Logical Switch Port event expected pod with UID \\\\\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\\\\\" in cache\\\\nI1012 05:42:25.155704 6920 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] C\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:42:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mdhw6_openshift-ovn-kubernetes(0add0fa2-092f-4dcc-8c72-82881564bf63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:26Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:26 crc kubenswrapper[4930]: I1012 05:42:26.986273 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21ff03a-29e7-467c-93e2-45f3a6cef5af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df55e443aa226c07478c488ee54785f35fbb1d6285775e82abcbf924aaf304fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bae41878203d4a2974f46d284f4fc05e705fb9fc9b1a3e8159db00a4c9d70762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjsw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:26Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.005174 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cjzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dda08509-105f-4935-a8a9-ff852e73c3ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cjzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:27Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.028762 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:27Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.037246 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.037291 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.037308 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.037334 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.037351 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:27Z","lastTransitionTime":"2025-10-12T05:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.053388 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:27Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.075783 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:27Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.096697 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:27Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.115555 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:27Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.134097 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:27Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.140590 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.140651 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.140668 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.140700 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.140715 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:27Z","lastTransitionTime":"2025-10-12T05:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.244953 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.245294 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.245431 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.245589 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.245774 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:27Z","lastTransitionTime":"2025-10-12T05:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.349569 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.349628 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.349647 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.349672 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.349691 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:27Z","lastTransitionTime":"2025-10-12T05:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.452491 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.452553 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.452574 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.452602 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.452621 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:27Z","lastTransitionTime":"2025-10-12T05:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.556183 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.556237 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.556257 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.556280 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.556299 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:27Z","lastTransitionTime":"2025-10-12T05:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.658920 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.659296 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.659560 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.659805 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.659996 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:27Z","lastTransitionTime":"2025-10-12T05:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.763380 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.763458 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.763475 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.763500 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.763517 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:27Z","lastTransitionTime":"2025-10-12T05:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.866251 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.866299 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.866316 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.866340 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.866356 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:27Z","lastTransitionTime":"2025-10-12T05:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.969438 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.969491 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.969508 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.969531 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:27 crc kubenswrapper[4930]: I1012 05:42:27.969550 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:27Z","lastTransitionTime":"2025-10-12T05:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.073244 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.073304 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.073322 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.073350 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.073371 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:28Z","lastTransitionTime":"2025-10-12T05:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.134911 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.135011 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:42:28 crc kubenswrapper[4930]: E1012 05:42:28.135073 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.135171 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:28 crc kubenswrapper[4930]: E1012 05:42:28.135234 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.135465 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:28 crc kubenswrapper[4930]: E1012 05:42:28.135575 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:42:28 crc kubenswrapper[4930]: E1012 05:42:28.135769 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.164101 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:28Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.177119 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.177180 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.177199 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.177226 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.177245 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:28Z","lastTransitionTime":"2025-10-12T05:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.188986 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:28Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.211428 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:28Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.234046 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:28Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.267980 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6697bf685228a837d46450ea695e9579b3bda51686b6b5bcd8338e66757476cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:42:17Z\\\",\\\"message\\\":\\\"2025-10-12T05:41:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9dcab761-8478-4ed2-bae4-530da435f8cb\\\\n2025-10-12T05:41:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9dcab761-8478-4ed2-bae4-530da435f8cb to /host/opt/cni/bin/\\\\n2025-10-12T05:41:32Z [verbose] multus-daemon started\\\\n2025-10-12T05:41:32Z [verbose] Readiness Indicator file check\\\\n2025-10-12T05:42:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:28Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.281783 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.281855 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.281882 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.281914 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.281935 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:28Z","lastTransitionTime":"2025-10-12T05:42:28Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.291889 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:28Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.316461 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f56867a3-bcac-4478-b32f-0869b7e330a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39f8602e4dd7d746c23c7b303c631d40b2bc2c447acd1491b77be0895c1715a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f1df2fa5060e9f2922e06c412de9fe47f1dd0236b2bba1d81783dfd52a68d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebbbfef668586cda6fda820fbbba5432c59b621a94025e2dd6d7d0733a3dee09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-control
ler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecfafe319660c39bb6291f6fdb7c0dd529065b7db2e0dd52cbf8e42a56c58dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecfafe319660c39bb6291f6fdb7c0dd529065b7db2e0dd52cbf8e42a56c58dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:28Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.343304 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:28Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.376430 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"message\\\":\\\"-source-55646444c4-trplf uuid:960d98b2-dc64-4e93-a4b6-9b19847af71e logicalSwitch:crc ips:[0xc008e1e3c0] mac:[10 88 10 217 0 59] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.59/23] and MAC: 0a:58:0a:d9:00:3b\\\\nI1012 05:42:25.153951 6920 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" but failed to find it\\\\nI1012 05:42:25.155495 6920 ovnkube_controller.go:804] Add Logical Switch Port event expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" in cache\\\\nI1012 05:42:25.155565 6920 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\\\\\" but failed to find it\\\\nI1012 05:42:25.155628 6920 ovnkube_controller.go:804] Add Logical Switch Port event expected pod with UID \\\\\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\\\\\" in cache\\\\nI1012 05:42:25.155704 6920 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] C\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:42:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mdhw6_openshift-ovn-kubernetes(0add0fa2-092f-4dcc-8c72-82881564bf63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:28Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.386160 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.386229 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.386249 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.386283 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.386306 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:28Z","lastTransitionTime":"2025-10-12T05:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.399154 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21ff03a-29e7-467c-93e2-45f3a6cef5af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df55e443aa226c07478c488ee54785f35fbb1d6285775e82abcbf924aaf304fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bae41878203d4a2974f46d284f4fc05e705fb9fc9b1a3e8159db00a4c9d70762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjsw8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:28Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.418112 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cjzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dda08509-105f-4935-a8a9-ff852e73c3ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cjzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:28Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:28 crc 
kubenswrapper[4930]: I1012 05:42:28.454687 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:28Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.477238 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:28Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.490766 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.490852 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.490879 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.490919 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.490947 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:28Z","lastTransitionTime":"2025-10-12T05:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.500028 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:28Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.521490 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:28Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.549428 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:28Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.568663 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:28Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.595127 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.595205 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.595226 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.595258 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.595288 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:28Z","lastTransitionTime":"2025-10-12T05:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.597657 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:28Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.697968 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.698033 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.698051 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.698074 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.698092 4930 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:28Z","lastTransitionTime":"2025-10-12T05:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.801762 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.802049 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.802063 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.802084 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.802099 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:28Z","lastTransitionTime":"2025-10-12T05:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.905378 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.905454 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.905471 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.905499 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:28 crc kubenswrapper[4930]: I1012 05:42:28.905519 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:28Z","lastTransitionTime":"2025-10-12T05:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.008987 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.009066 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.009097 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.009130 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.009154 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:29Z","lastTransitionTime":"2025-10-12T05:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.114538 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.114614 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.114635 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.114661 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.114680 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:29Z","lastTransitionTime":"2025-10-12T05:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.217288 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.217365 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.217390 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.217419 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.217440 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:29Z","lastTransitionTime":"2025-10-12T05:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.321175 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.321245 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.321261 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.321289 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.321309 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:29Z","lastTransitionTime":"2025-10-12T05:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.426630 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.426685 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.426706 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.426780 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.426810 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:29Z","lastTransitionTime":"2025-10-12T05:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.529397 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.529459 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.529475 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.529497 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.529513 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:29Z","lastTransitionTime":"2025-10-12T05:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
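The five-record block repeating above (four "Recording event message" records plus one setters.go record) is the kubelet status loop re-asserting the same NotReady condition roughly every 100 ms while no CNI config exists. To measure how long a node sat in this state from a saved excerpt, a sketch like the following is enough; it is illustrative only, reads the journal text from stdin, and buckets the setters.go records by second:

    // notreadyscan.go: tally "Node became not ready" records per second
    // from a journal excerpt piped to stdin.
    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
        "sort"
    )

    func main() {
        // klog header as it appears above, e.g.
        // I1012 05:42:28.698092 4930 setters.go:603] "Node became not ready"
        re := regexp.MustCompile(`I(\d{4} \d{2}:\d{2}:\d{2})\.\d+ +\d+ setters\.go:\d+\] "Node became not ready"`)
        counts := map[string]int{}
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // joined journal lines can be very long
        for sc.Scan() {
            for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
                counts[m[1]]++ // bucket by whole second
            }
        }
        keys := make([]string, 0, len(counts))
        for k := range counts {
            keys = append(keys, k)
        }
        sort.Strings(keys)
        for _, k := range keys {
            fmt.Printf("%s  %d NotReady record(s)\n", k, counts[k])
        }
    }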
Has your network provider started?"} Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.632601 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.632655 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.632670 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.632687 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.632695 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:29Z","lastTransitionTime":"2025-10-12T05:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.735111 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.735167 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.735185 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.735207 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.735224 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:29Z","lastTransitionTime":"2025-10-12T05:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.838291 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.838335 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.838346 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.838361 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.838395 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:29Z","lastTransitionTime":"2025-10-12T05:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.941282 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.941330 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.941341 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.941361 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:29 crc kubenswrapper[4930]: I1012 05:42:29.941373 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:29Z","lastTransitionTime":"2025-10-12T05:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.045030 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.045119 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.045146 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.045182 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.045206 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:30Z","lastTransitionTime":"2025-10-12T05:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.136623 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.136785 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.136640 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.136932 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:30 crc kubenswrapper[4930]: E1012 05:42:30.136928 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
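Every NotReady record in this stretch carries the same root cause: the container runtime's CNI configuration directory is empty, so the kubelet cannot create pod sandboxes for regular (non-host-network) pods such as the four listed above. On an OVN-Kubernetes cluster that file is written only once the network operator's own pods come up, so the message is expected during early bring-up and is a concern only if it persists. A small sketch to check the directory named in the message (the extension list is what ocicni-style config loaders typically accept; treat it as an assumption):

    // cnicheck.go: report whether the CNI config dir from the kubelet
    // message above contains any network definition yet.
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        dir := "/etc/kubernetes/cni/net.d" // path taken from the log message
        entries, err := os.ReadDir(dir)
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        found := 0
        for _, e := range entries {
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json": // assumed loader extensions
                fmt.Println("CNI config:", filepath.Join(dir, e.Name()))
                found++
            }
        }
        if found == 0 {
            fmt.Println("no CNI configuration files yet; network provider has not written one")
        }
    }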
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:42:30 crc kubenswrapper[4930]: E1012 05:42:30.137118 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:42:30 crc kubenswrapper[4930]: E1012 05:42:30.137303 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:42:30 crc kubenswrapper[4930]: E1012 05:42:30.137683 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.148136 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.148246 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.148271 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.148336 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.148358 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:30Z","lastTransitionTime":"2025-10-12T05:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.251600 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.251725 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.251806 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.251932 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.252004 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:30Z","lastTransitionTime":"2025-10-12T05:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.355588 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.355659 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.355678 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.355707 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.355730 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:30Z","lastTransitionTime":"2025-10-12T05:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.460522 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.460599 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.460617 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.460645 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.460665 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:30Z","lastTransitionTime":"2025-10-12T05:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.564319 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.564397 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.564418 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.564445 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.564465 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:30Z","lastTransitionTime":"2025-10-12T05:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.667514 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.667571 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.667588 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.667612 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.667631 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:30Z","lastTransitionTime":"2025-10-12T05:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.770095 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.770151 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.770171 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.770196 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.770214 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:30Z","lastTransitionTime":"2025-10-12T05:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.874216 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.874264 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.874283 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.874315 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.874516 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:30Z","lastTransitionTime":"2025-10-12T05:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.979155 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.979286 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.979362 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.979397 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:30 crc kubenswrapper[4930]: I1012 05:42:30.979495 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:30Z","lastTransitionTime":"2025-10-12T05:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.083525 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.083614 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.083635 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.083666 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.083687 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:31Z","lastTransitionTime":"2025-10-12T05:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.187514 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.187595 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.187619 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.187655 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.187679 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:31Z","lastTransitionTime":"2025-10-12T05:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.291707 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.291806 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.291825 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.291850 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.291868 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:31Z","lastTransitionTime":"2025-10-12T05:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.396049 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.396120 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.396144 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.396207 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.396237 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:31Z","lastTransitionTime":"2025-10-12T05:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.499306 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.499368 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.499386 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.499412 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.499437 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:31Z","lastTransitionTime":"2025-10-12T05:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.602887 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.602948 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.602968 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.602994 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.603012 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:31Z","lastTransitionTime":"2025-10-12T05:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.706435 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.706494 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.706510 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.706536 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.706555 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:31Z","lastTransitionTime":"2025-10-12T05:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.808821 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.808894 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.808911 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.808937 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.808958 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:31Z","lastTransitionTime":"2025-10-12T05:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.913310 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.913369 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.913388 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.913412 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:31 crc kubenswrapper[4930]: I1012 05:42:31.913431 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:31Z","lastTransitionTime":"2025-10-12T05:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.016230 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.016315 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.016340 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.016371 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.016397 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:32Z","lastTransitionTime":"2025-10-12T05:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.104212 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 12 05:42:32 crc kubenswrapper[4930]: E1012 05:42:32.104405 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:36.104373729 +0000 UTC m=+148.646475524 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.104463 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.104513 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.104559 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.104592 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 12 05:42:32 crc kubenswrapper[4930]: E1012 05:42:32.104782 4930 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 12 05:42:32 crc kubenswrapper[4930]: E1012 05:42:32.104799 4930 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Oct 12 05:42:32 crc kubenswrapper[4930]: E1012 05:42:32.104850 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 05:43:36.10483556 +0000 UTC m=+148.646937365 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 12 05:42:32 crc kubenswrapper[4930]: E1012 05:42:32.104882 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 12 05:42:32 crc kubenswrapper[4930]: E1012 05:42:32.104923 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-12 05:43:36.104888812 +0000 UTC m=+148.646990607 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Oct 12 05:42:32 crc kubenswrapper[4930]: E1012 05:42:32.104953 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 12 05:42:32 crc kubenswrapper[4930]: E1012 05:42:32.104975 4930 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 12 05:42:32 crc kubenswrapper[4930]: E1012 05:42:32.104882 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 12 05:42:32 crc kubenswrapper[4930]: E1012 05:42:32.105036 4930 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 12 05:42:32 crc kubenswrapper[4930]: E1012 05:42:32.105059 4930 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 12 05:42:32 crc kubenswrapper[4930]: E1012 05:42:32.105079 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-12 05:43:36.105061116 +0000 UTC m=+148.647162911 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 12 05:42:32 crc kubenswrapper[4930]: E1012 05:42:32.105140 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-12 05:43:36.105127918 +0000 UTC m=+148.647229723 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.119374 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.119429 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.119449 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.119474 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.119497 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:32Z","lastTransitionTime":"2025-10-12T05:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.135054 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn"
Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.135130 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.135130 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 12 05:42:32 crc kubenswrapper[4930]: E1012 05:42:32.135226 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:42:32 crc kubenswrapper[4930]: E1012 05:42:32.135400 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.135473 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:32 crc kubenswrapper[4930]: E1012 05:42:32.135567 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:42:32 crc kubenswrapper[4930]: E1012 05:42:32.135627 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.222389 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.222457 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.222476 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.222502 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.222520 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:32Z","lastTransitionTime":"2025-10-12T05:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.325613 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.325680 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.325698 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.325725 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.325770 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:32Z","lastTransitionTime":"2025-10-12T05:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.428914 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.428977 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.428995 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.429023 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.429041 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:32Z","lastTransitionTime":"2025-10-12T05:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.532049 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.532118 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.532135 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.532160 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.532177 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:32Z","lastTransitionTime":"2025-10-12T05:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.634865 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.634934 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.634952 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.634977 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.634997 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:32Z","lastTransitionTime":"2025-10-12T05:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.737805 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.737869 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.737889 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.737914 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.737931 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:32Z","lastTransitionTime":"2025-10-12T05:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.840930 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.840994 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.841008 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.841038 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.841053 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:32Z","lastTransitionTime":"2025-10-12T05:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
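The repeated object ... not registered errors in the volume block above come from the kubelet's watch-based secret and configmap managers: after a kubelet restart, each pod's references must be registered with those managers before they can be read, so during this phase the message describes kubelet-internal cache state, not objects missing from the API server. A client-go sketch to confirm the two configmaps exist server-side (client-go assumed available; the kubeconfig path is a placeholder):

    // cmcheck.go: verify the configmaps the projected volumes need
    // actually exist in the API, independent of the kubelet's cache.
    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // placeholder path
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        ns := "openshift-network-diagnostics"
        for _, name := range []string{"kube-root-ca.crt", "openshift-service-ca.crt"} {
            _, err := cs.CoreV1().ConfigMaps(ns).Get(context.TODO(), name, metav1.GetOptions{})
            fmt.Printf("%s/%s: err=%v\n", ns, name, err)
        }
    }

If these Gets succeed while the kubelet keeps logging "not registered", the mounts should recover on their own once the pods are re-registered and the 1m4s retry windows above expire.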
Has your network provider started?"} Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.944881 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.944947 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.944959 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.944983 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:32 crc kubenswrapper[4930]: I1012 05:42:32.945000 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:32Z","lastTransitionTime":"2025-10-12T05:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.048622 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.048785 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.048806 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.048835 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.048853 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:33Z","lastTransitionTime":"2025-10-12T05:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.151811 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.151879 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.151899 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.151926 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.151948 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:33Z","lastTransitionTime":"2025-10-12T05:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.254349 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.254454 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.254479 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.254516 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.254626 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:33Z","lastTransitionTime":"2025-10-12T05:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.358287 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.358376 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.358406 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.358440 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.358464 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:33Z","lastTransitionTime":"2025-10-12T05:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.462272 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.462345 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.462364 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.462388 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.462407 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:33Z","lastTransitionTime":"2025-10-12T05:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.565449 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.565526 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.565553 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.565585 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.565606 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:33Z","lastTransitionTime":"2025-10-12T05:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.669395 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.669464 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.669483 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.669511 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.669530 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:33Z","lastTransitionTime":"2025-10-12T05:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.774264 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.774333 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.774349 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.774376 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.774395 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:33Z","lastTransitionTime":"2025-10-12T05:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.877655 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.877774 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.877802 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.877833 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.877856 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:33Z","lastTransitionTime":"2025-10-12T05:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.984148 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.984223 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.984249 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.984277 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:33 crc kubenswrapper[4930]: I1012 05:42:33.984298 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:33Z","lastTransitionTime":"2025-10-12T05:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:34 crc kubenswrapper[4930]: I1012 05:42:34.087005 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:34 crc kubenswrapper[4930]: I1012 05:42:34.087122 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:34 crc kubenswrapper[4930]: I1012 05:42:34.087142 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:34 crc kubenswrapper[4930]: I1012 05:42:34.087169 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:34 crc kubenswrapper[4930]: I1012 05:42:34.087191 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:34Z","lastTransitionTime":"2025-10-12T05:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
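Every cycle above traces back to the same condition: the kubelet finds no CNI configuration file under /etc/kubernetes/cni/net.d/. As a minimal standalone sketch (not the kubelet's actual check; the extension list follows common CNI convention and is an assumption), a Go program that probes the same directory:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// cniConfDir is the directory the kubelet complains about in the log above.
const cniConfDir = "/etc/kubernetes/cni/net.d"

func main() {
	entries, err := os.ReadDir(cniConfDir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", cniConfDir, err)
		os.Exit(1)
	}
	found := 0
	for _, e := range entries {
		// Extension list is an assumption based on common CNI convention,
		// not necessarily the exact set the kubelet's config loader accepts.
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("found CNI config:", e.Name())
			found++
		}
	}
	if found == 0 {
		fmt.Println("no CNI configuration file found; network plugin has not written its config yet")
	}
}
```

On a node in this state the directory would be expected to stay empty until the cluster's network plugin writes its configuration, at which point NetworkReady should flip back to true and the cycle above stops.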
Oct 12 05:42:34 crc kubenswrapper[4930]: I1012 05:42:34.135117 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 12 05:42:34 crc kubenswrapper[4930]: I1012 05:42:34.135216 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 12 05:42:34 crc kubenswrapper[4930]: I1012 05:42:34.135145 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn"
Oct 12 05:42:34 crc kubenswrapper[4930]: E1012 05:42:34.135330 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 12 05:42:34 crc kubenswrapper[4930]: I1012 05:42:34.135388 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 12 05:42:34 crc kubenswrapper[4930]: E1012 05:42:34.135605 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce"
Oct 12 05:42:34 crc kubenswrapper[4930]: E1012 05:42:34.135701 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 12 05:42:34 crc kubenswrapper[4930]: E1012 05:42:34.135884 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
[the five-entry node-status cycle then repeats at 05:42:34.190 and 05:42:34.294]
[the same five-entry cycle continues at 05:42:34.397, 05:42:34.501, 05:42:34.604, 05:42:34.708, 05:42:34.811, 05:42:34.914, 05:42:35.018, 05:42:35.121, 05:42:35.224, 05:42:35.327, 05:42:35.430, 05:42:35.534, 05:42:35.636, 05:42:35.739, 05:42:35.842, 05:42:35.945, 05:42:36.048 and 05:42:36.060, again differing only in timestamps]
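The condition={...} fragment printed by setters.go:603 in each cycle has the shape of Kubernetes' core/v1 NodeCondition. A small sketch decoding one such fragment with a simplified, hand-written mirror of that type (not imported from k8s.io/api; the sample message is abbreviated):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// nodeCondition is a simplified mirror of Kubernetes' core/v1 NodeCondition,
// matching the condition={...} JSON the kubelet prints above.
type nodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// One condition payload copied from the log above (message abbreviated).
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:36Z","lastTransitionTime":"2025-10-12T05:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready"}`
	var c nodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		panic(err)
	}
	fmt.Printf("%s=%s since %s (%s)\n", c.Type, c.Status, c.LastTransitionTime, c.Reason)
}
```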
Oct 12 05:42:36 crc kubenswrapper[4930]: E1012 05:42:36.075924 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:36Z is after 2025-08-24T17:21:41Z"
[elided patch payload {...}: $setElementOrder/conditions for MemoryPressure, DiskPressure, PIDPressure and Ready; allocatable cpu=11800m, ephemeral-storage=76396645454, memory=32404560Ki; capacity cpu=12, ephemeral-storage=83293888Ki, memory=32865360Ki; the four conditions stamped 2025-10-12T05:42:36Z (MemoryPressure=False/KubeletHasSufficientMemory, DiskPressure=False/KubeletHasNoDiskPressure, PIDPressure=False/KubeletHasSufficientPID, Ready=False/KubeletNotReady with the same CNI message as above); several dozen cached image entries, names plus sizeBytes from 2887430265 down to 448887027; nodeInfo bootID=208e324a-7c16-4d54-b585-7c9f58265cb2, systemUUID=42c4af26-8d34-49a0-8413-58384a3ecd2b]
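The patch fails not because of its payload but because the serving certificate behind the node.network-node-identity.openshift.io webhook expired on 2025-08-24, well before the node's current time of 2025-10-12. A hedged sketch of the same validity-window check the Go x509 verifier applies (the certificate path is a placeholder, not taken from the log):

```go
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

// Reads a PEM certificate and reports whether the current time falls outside
// its NotBefore/NotAfter window, i.e. the condition behind "x509: certificate
// has expired or is not yet valid" in the entry above.
func main() {
	data, err := os.ReadFile("/path/to/webhook-serving-cert.pem") // placeholder path
	if err != nil {
		panic(err)
	}
	block, _ := pem.Decode(data)
	if block == nil || block.Type != "CERTIFICATE" {
		panic("no PEM certificate found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		panic(err)
	}
	now := time.Now().UTC()
	fmt.Printf("valid %s to %s\n", cert.NotBefore.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	switch {
	case now.After(cert.NotAfter):
		fmt.Println("certificate has expired") // the case in the log: current time is after NotAfter
	case now.Before(cert.NotBefore):
		fmt.Println("certificate is not yet valid")
	default:
		fmt.Println("certificate is within its validity window")
	}
}
```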
event="NodeHasNoDiskPressure" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.081560 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.081585 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.081600 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:36Z","lastTransitionTime":"2025-10-12T05:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:36 crc kubenswrapper[4930]: E1012 05:42:36.101811 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.108253 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.108373 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.108429 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.108463 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.108485 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:36Z","lastTransitionTime":"2025-10-12T05:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:36 crc kubenswrapper[4930]: E1012 05:42:36.130697 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.134802 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.134895 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.134928 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:36 crc kubenswrapper[4930]: E1012 05:42:36.134977 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:42:36 crc kubenswrapper[4930]: E1012 05:42:36.135082 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.135162 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:36 crc kubenswrapper[4930]: E1012 05:42:36.135313 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:42:36 crc kubenswrapper[4930]: E1012 05:42:36.135371 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.136506 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.136566 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.136584 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.136612 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.136632 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:36Z","lastTransitionTime":"2025-10-12T05:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:36 crc kubenswrapper[4930]: E1012 05:42:36.159179 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.163666 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.163708 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.163721 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.163758 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.163773 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:36Z","lastTransitionTime":"2025-10-12T05:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:36 crc kubenswrapper[4930]: E1012 05:42:36.183681 4930 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"208e324a-7c16-4d54-b585-7c9f58265cb2\\\",\\\"systemUUID\\\":\\\"42c4af26-8d34-49a0-8413-58384a3ecd2b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:36Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:36 crc kubenswrapper[4930]: E1012 05:42:36.183820 4930 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.185812 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.185848 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.185857 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.185871 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.185882 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:36Z","lastTransitionTime":"2025-10-12T05:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.288895 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.288951 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.288973 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.289000 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.289021 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:36Z","lastTransitionTime":"2025-10-12T05:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.391891 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.391952 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.391970 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.391999 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.392019 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:36Z","lastTransitionTime":"2025-10-12T05:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.494984 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.495049 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.495067 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.495092 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.495114 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:36Z","lastTransitionTime":"2025-10-12T05:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.597799 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.597873 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.597892 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.597918 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.597938 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:36Z","lastTransitionTime":"2025-10-12T05:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.701131 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.701179 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.701197 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.701220 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.701237 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:36Z","lastTransitionTime":"2025-10-12T05:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.804359 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.804412 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.804431 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.804453 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.804471 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:36Z","lastTransitionTime":"2025-10-12T05:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.907864 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.907929 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.907949 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.907977 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:36 crc kubenswrapper[4930]: I1012 05:42:36.907997 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:36Z","lastTransitionTime":"2025-10-12T05:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.010870 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.010924 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.010941 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.010962 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.011008 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:37Z","lastTransitionTime":"2025-10-12T05:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.114552 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.114617 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.114635 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.114661 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.114680 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:37Z","lastTransitionTime":"2025-10-12T05:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.152258 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.218830 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.218904 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.218924 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.218948 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.218966 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:37Z","lastTransitionTime":"2025-10-12T05:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.321785 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.321850 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.321875 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.321903 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.321925 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:37Z","lastTransitionTime":"2025-10-12T05:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.424503 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.424556 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.424573 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.424595 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.424613 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:37Z","lastTransitionTime":"2025-10-12T05:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.527563 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.527638 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.527659 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.527688 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.527708 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:37Z","lastTransitionTime":"2025-10-12T05:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.630279 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.630349 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.630369 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.630400 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.630417 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:37Z","lastTransitionTime":"2025-10-12T05:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.733699 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.733847 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.733907 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.733935 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.733986 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:37Z","lastTransitionTime":"2025-10-12T05:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.837265 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.837350 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.837370 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.837401 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.837421 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:37Z","lastTransitionTime":"2025-10-12T05:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.940314 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.940385 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.940403 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.940433 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:37 crc kubenswrapper[4930]: I1012 05:42:37.940455 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:37Z","lastTransitionTime":"2025-10-12T05:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.043395 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.043461 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.043479 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.043506 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.043523 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:38Z","lastTransitionTime":"2025-10-12T05:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.134837 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.134948 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:42:38 crc kubenswrapper[4930]: E1012 05:42:38.135033 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.135097 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.135191 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:38 crc kubenswrapper[4930]: E1012 05:42:38.135287 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:42:38 crc kubenswrapper[4930]: E1012 05:42:38.135670 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:42:38 crc kubenswrapper[4930]: E1012 05:42:38.136056 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.146050 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.146101 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.146118 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.146140 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.146158 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:38Z","lastTransitionTime":"2025-10-12T05:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.157210 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.178348 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tq29s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1c3ae9e-26ae-418f-b261-eabc4302b332\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6697bf685228a837d46450ea695e9579b3bda51686b6b5bcd8338e66757476cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:42:17Z\\\",\\\"message\\\":\\\"2025-10-12T05:41:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9dcab761-8478-4ed2-bae4-530da435f8cb\\\\n2025-10-12T05:41:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9dcab761-8478-4ed2-bae4-530da435f8cb to /host/opt/cni/bin/\\\\n2025-10-12T05:41:32Z [verbose] multus-daemon started\\\\n2025-10-12T05:41:32Z [verbose] Readiness Indicator file check\\\\n2025-10-12T05:42:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:42:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s29kh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tq29s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.198728 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02f8684c-a3e4-44e8-9741-9f54488d8d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e74c17edeca2e2b56f4be36327914c7232a5eee3190d7cd9ce54fe81050942ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqrgm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mk4tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.218688 4930 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f56867a3-bcac-4478-b32f-0869b7e330a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:42:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39f8602e4dd7d746c23c7b303c631d40b2bc2c447acd1491b77be0895c1715a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f1df2fa5060e9f2922e06c412de9fe47f1dd0236b2bba1d81783dfd52a68d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebbbfef668586cda6fda820fbbba5432c59b621a94025e2dd6d7d0733a3dee09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://ecfafe319660c39bb6291f6fdb7c0dd529065b7db2e0dd52cbf8e42a56c58dd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecfafe319660c39bb6291f6fdb7c0dd529065b7db2e0dd52cbf8e42a56c58dd4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.236545 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aafbe09e-bf2b-4205-ba9f-3cd300cd3135\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46b72209a6e5e34e3ba9a5d17cc19a2817d8e455eea8a0026674589089df2376\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2895c24173ca66c5aa199d3049c9f6b8dcda56ae7f0a91d92dbd7408d46be2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef31
8bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c2895c24173ca66c5aa199d3049c9f6b8dcda56ae7f0a91d92dbd7408d46be2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.250391 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.250504 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.250522 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.250546 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.250562 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:38Z","lastTransitionTime":"2025-10-12T05:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.261595 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85c87fc1896b611cf87d3f6e8dce960e0aea4e0a5e161d12e046bec5a4d46d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.280440 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21ff03a-29e7-467c-93e2-45f3a6cef5af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df55e443aa226c07478c488ee54785f35fbb1d6285775e82abcbf924aaf304fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bae41878203d4a2974f46d284f4fc05e705fb9fc9b1a3e8159db00a4c9d70762\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dk8z2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fjsw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:38Z is after 2025-08-24T17:21:41Z" Oct 12 
05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.301148 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7cjzn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dda08509-105f-4935-a8a9-ff852e73c3ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ntq5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7cjzn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.331961 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd0aa975-5017-4585-b9d9-66563c734f99\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2ee3c6242b1dd92c05cad132f941c6463862d38dd3f32db106c4458b19ca281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce0b341a84353da33af118a985d78e1926365e672cd9d9a94a8c9ccdeeeef68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://772b3ac684f8d59be06d3921919db10572b68d9acfb6ae64e85b92c637963057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04341c7af2fbffb89ce55a0ed2594bdb4fd943
ac9088994f81f9245c31c62a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c844939574ab6101aa35b883fd6b7dfabb8338c2de1bc238a60b92f7e0fc99a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c74c32e6d90259f9f86f70554faf163aeedf344fe51ad49a169f2b12f3db7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdd4dd246b7445e25e4328faa31fac42aaa6a6c6504c218c7d13f3870eaffe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48f2d2d726b7df395399b0134aca81dd54ad3f206c1276d1222b30dd3d4b7210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.355549 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.355617 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.355636 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.355666 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.355684 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:38Z","lastTransitionTime":"2025-10-12T05:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.356803 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vwttt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d928520f-ca1d-4cca-b966-c1e6c9168db0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://083fd9c4a4a835276945402005503980476de4e834b8c77b4347cbfd74347baf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://271a5e8f9d11a30e34aaa4965e6d9d036cfc47d469d87a2ff0ddcae5ba4e8fe8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a7387bc9fd2f9b98dd174adbd738309e22b90e1c82756e0e4c114cd5bc1279a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6b73defe23757469590b3d3d92e7b79bf3b6dcbcaec44d750770cd2b94caaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c379d6c40705bccb05ab8835e3b2410fc0a98db9e372b4dbd62cc3d4c1c5470\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5e3c8e16c925694d42868813bc57edd9c50c88f2153189a2b9f8451655aff1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d01f2b61f7fe3605f4ad0626d2cf28756a8c7e3de343b9a20acafe7f35c19bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4m7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vwttt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.388233 4930 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0add0fa2-092f-4dcc-8c72-82881564bf63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-12T05:42:25Z\\\",\\\"message\\\":\\\"-source-55646444c4-trplf uuid:960d98b2-dc64-4e93-a4b6-9b19847af71e logicalSwitch:crc ips:[0xc008e1e3c0] mac:[10 88 10 217 0 59] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.59/23] and MAC: 0a:58:0a:d9:00:3b\\\\nI1012 05:42:25.153951 6920 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" but failed to find it\\\\nI1012 05:42:25.155495 6920 ovnkube_controller.go:804] Add Logical Switch Port event expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" in cache\\\\nI1012 05:42:25.155565 6920 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\\\\\" but failed to find it\\\\nI1012 05:42:25.155628 6920 ovnkube_controller.go:804] Add Logical Switch Port event expected pod with UID \\\\\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\\\\\" in cache\\\\nI1012 05:42:25.155704 6920 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] C\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:42:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mdhw6_openshift-ovn-kubernetes(0add0fa2-092f-4dcc-8c72-82881564bf63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7hpj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mdhw6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.407713 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489b43d7659655253d7055c2725d6b59d594b9eea5d1e6dc78b015f8d44824ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.424460 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-br2vl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"545bf51c-0b04-4166-a984-ec9c1276470a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6148cdf3aed29894038db482881c0cf3d0fd2d99c241e87d864099a3eb4248f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5tpks\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-br2vl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.440491 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jd2dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"835a1f98-4ae1-499b-b08c-a87dbcf8eaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7816344abd767a7cdc2aa06418215233815285a683bb5a7fceeeb9a383b00bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvkzz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jd2dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.460229 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.460320 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.460379 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.460471 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.460493 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:38Z","lastTransitionTime":"2025-10-12T05:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.464351 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2acab1-2f7d-4497-ab0c-d2088825de18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://380b8da0bc8ba00ec7fe7a9dd1a4f49f0f0828fa469e21bd947a1b976ae5e584\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14de1c8677ea9a164e1713e68861b55fa4b8fd0bff40211afa3ee9e883c65092\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93380cd1156afc3c34f019f5026f01b5242b9009c4c5de83edf5301e4cdb7b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b74b7cf89e0ba8d3417d38fdfc78b271cb23bc6e737358303956214c8f9a0f9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fb5b0196fd005013eba91e6ea563d14230abf0a8cbc1d2a4bf9cd4491f7c391\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-12T05:41:22Z\\\",\\\"message\\\":\\\"W1012 05:41:11.395066 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1012 05:41:11.395683 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760247671 cert, and key in /tmp/serving-cert-1865248722/serving-signer.crt, /tmp/serving-cert-1865248722/serving-signer.key\\\\nI1012 05:41:11.668419 1 observer_polling.go:159] Starting file observer\\\\nW1012 05:41:11.671952 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1012 05:41:11.672482 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1012 05:41:11.675332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1865248722/tls.crt::/tmp/serving-cert-1865248722/tls.key\\\\\\\"\\\\nF1012 05:41:22.294380 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9e13b7dbf76d9dcdaa4a8edc40eb5ebac2c044e9f171f78f18b33a08784ed28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ce2a112d18c1b8182e432a7aa567901720a4fb9e6be25905b4be59571940a87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-12T05:41:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.486724 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.508101 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.524989 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e930f9f-8120-43b6-95ba-39319c069f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://428593f6ee3749d62d3cb8553ec6707f0f13f70f7750b8ea90fb0b9f0aa8f337\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05e5821bdc37103cb95d9f2a3dcacbff75c87335a6c6ed400846ea49fe370c00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44390f068c28c4a9917648c747a322ddfa44915b128a61f145d6e2e23fc167a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://586395a6006e26fe969c2ef9af6ec6299523c0868c26989709389f5af24b7e39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-12T05:41:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.540625 4930 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-12T05:41:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d92840f87187d6a4e0a40972f9b1601b70c300104b6023b980e0939e8b83eba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d6f5f971b8708420f74c2b9602179815dd69145979d42bbaeb899290c01c286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-12T05:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-12T05:42:38Z is after 2025-08-24T17:21:41Z" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.563563 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.563625 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.563642 4930 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.563667 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.563685 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:38Z","lastTransitionTime":"2025-10-12T05:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.666368 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.666437 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.666454 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.666480 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.666499 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:38Z","lastTransitionTime":"2025-10-12T05:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.768721 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.768836 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.768856 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.768886 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.768911 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:38Z","lastTransitionTime":"2025-10-12T05:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.871670 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.871781 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.871801 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.871829 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.871849 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:38Z","lastTransitionTime":"2025-10-12T05:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.975498 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.975564 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.975585 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.975616 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:38 crc kubenswrapper[4930]: I1012 05:42:38.975638 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:38Z","lastTransitionTime":"2025-10-12T05:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.079766 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.079842 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.079860 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.079891 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.079912 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:39Z","lastTransitionTime":"2025-10-12T05:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.136472 4930 scope.go:117] "RemoveContainer" containerID="3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7"
Oct 12 05:42:39 crc kubenswrapper[4930]: E1012 05:42:39.136833 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mdhw6_openshift-ovn-kubernetes(0add0fa2-092f-4dcc-8c72-82881564bf63)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63"
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.183010 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.183096 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.183117 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.183152 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.183177 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:39Z","lastTransitionTime":"2025-10-12T05:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.286280 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.286344 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.286363 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.286388 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.286408 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:39Z","lastTransitionTime":"2025-10-12T05:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
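
The scope.go:117 / pod_workers.go:1301 pair above is the kubelet declining to restart the crashed ovnkube-controller container until its CrashLoopBackOff window ("back-off 40s") expires. Upstream kubelet backs container restarts off exponentially, starting at 10s, doubling per crash, and capping at five minutes, so a 40s window suggests roughly the third consecutive crash. An illustrative sketch of that schedule (the helper name is mine, and the constants are the upstream defaults as I understand them, not values read from this log):

    # Illustrative only: kubelet's default container restart backoff
    # (10s initial delay, doubled after each crash, capped at 300s).
    # backoff_schedule is my own helper, not a kubelet API.
    def backoff_schedule(initial=10, cap=300, crashes=6):
        delay = initial
        for n in range(1, crashes + 1):
            yield n, min(delay, cap)
            delay *= 2

    for crash, delay in backoff_schedule():
        print(f"crash #{crash}: wait {delay}s before restarting")  # 10, 20, 40, 80, 160, 300
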
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.390208 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.390274 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.390291 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.390315 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.390332 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:39Z","lastTransitionTime":"2025-10-12T05:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.495024 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.495120 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.495139 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.495171 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.495191 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:39Z","lastTransitionTime":"2025-10-12T05:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.598349 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.598415 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.598432 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.598459 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.598481 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:39Z","lastTransitionTime":"2025-10-12T05:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.701514 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.701600 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.701618 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.701650 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.701729 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:39Z","lastTransitionTime":"2025-10-12T05:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.804678 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.804782 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.804805 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.804841 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.804867 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:39Z","lastTransitionTime":"2025-10-12T05:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.907999 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.908092 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.908111 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.908137 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:39 crc kubenswrapper[4930]: I1012 05:42:39.908156 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:39Z","lastTransitionTime":"2025-10-12T05:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.010113 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.010152 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.010161 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.010178 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.010190 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:40Z","lastTransitionTime":"2025-10-12T05:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.113150 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.113213 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.113229 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.113249 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.113261 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:40Z","lastTransitionTime":"2025-10-12T05:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.135093 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.135102 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:40 crc kubenswrapper[4930]: E1012 05:42:40.135199 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.135094 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:40 crc kubenswrapper[4930]: E1012 05:42:40.135387 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.135456 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:40 crc kubenswrapper[4930]: E1012 05:42:40.135517 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:42:40 crc kubenswrapper[4930]: E1012 05:42:40.135692 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.216846 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.216909 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.216926 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.216951 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.216971 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:40Z","lastTransitionTime":"2025-10-12T05:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.319589 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.319663 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.319685 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.319716 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.319769 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:40Z","lastTransitionTime":"2025-10-12T05:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.422299 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.422373 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.422400 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.422429 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.422451 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:40Z","lastTransitionTime":"2025-10-12T05:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.526192 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.526273 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.526295 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.526335 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.526354 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:40Z","lastTransitionTime":"2025-10-12T05:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.629314 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.629379 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.629398 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.629422 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.629440 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:40Z","lastTransitionTime":"2025-10-12T05:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.732299 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.732366 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.732385 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.732406 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.732424 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:40Z","lastTransitionTime":"2025-10-12T05:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.834872 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.834965 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.834986 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.835012 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.835029 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:40Z","lastTransitionTime":"2025-10-12T05:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.937562 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.937618 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.937631 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.937654 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:40 crc kubenswrapper[4930]: I1012 05:42:40.937667 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:40Z","lastTransitionTime":"2025-10-12T05:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.039966 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.039996 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.040004 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.040017 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.040026 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:41Z","lastTransitionTime":"2025-10-12T05:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.143399 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.143469 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.143486 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.143507 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.143527 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:41Z","lastTransitionTime":"2025-10-12T05:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.247099 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.247158 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.247176 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.247200 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.247216 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:41Z","lastTransitionTime":"2025-10-12T05:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.350445 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.350523 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.350540 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.350565 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.350583 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:41Z","lastTransitionTime":"2025-10-12T05:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.454113 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.454206 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.454231 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.454262 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.454287 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:41Z","lastTransitionTime":"2025-10-12T05:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.557496 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.557571 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.557587 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.557613 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.557631 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:41Z","lastTransitionTime":"2025-10-12T05:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.661023 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.661084 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.661100 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.661123 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.661139 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:41Z","lastTransitionTime":"2025-10-12T05:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.764238 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.764338 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.764358 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.764381 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.764403 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:41Z","lastTransitionTime":"2025-10-12T05:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.867244 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.867296 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.867312 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.867337 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.867357 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:41Z","lastTransitionTime":"2025-10-12T05:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.972210 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.972275 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.972295 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.972320 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:41 crc kubenswrapper[4930]: I1012 05:42:41.972339 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:41Z","lastTransitionTime":"2025-10-12T05:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.075762 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.075838 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.075862 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.075886 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.075903 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:42Z","lastTransitionTime":"2025-10-12T05:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.134794 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.134793 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.134929 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.134932 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:42:42 crc kubenswrapper[4930]: E1012 05:42:42.135051 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:42:42 crc kubenswrapper[4930]: E1012 05:42:42.135210 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:42:42 crc kubenswrapper[4930]: E1012 05:42:42.135406 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:42:42 crc kubenswrapper[4930]: E1012 05:42:42.135526 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.178842 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.178900 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.178918 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.178942 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.178960 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:42Z","lastTransitionTime":"2025-10-12T05:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.281633 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.281711 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.281728 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.281780 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.281798 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:42Z","lastTransitionTime":"2025-10-12T05:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.384808 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.384947 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.384964 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.384987 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.385003 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:42Z","lastTransitionTime":"2025-10-12T05:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.488511 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.488591 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.488614 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.488644 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.488663 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:42Z","lastTransitionTime":"2025-10-12T05:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.591664 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.591763 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.591785 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.591814 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.591834 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:42Z","lastTransitionTime":"2025-10-12T05:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.695137 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.695199 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.695221 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.695249 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.695270 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:42Z","lastTransitionTime":"2025-10-12T05:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.798340 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.798402 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.798420 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.798449 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.798468 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:42Z","lastTransitionTime":"2025-10-12T05:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.901796 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.901875 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.901898 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.901930 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:42 crc kubenswrapper[4930]: I1012 05:42:42.901954 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:42Z","lastTransitionTime":"2025-10-12T05:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.005410 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.005473 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.005493 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.005521 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.005540 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:43Z","lastTransitionTime":"2025-10-12T05:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.109784 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.109852 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.109869 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.109894 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.109913 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:43Z","lastTransitionTime":"2025-10-12T05:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.213895 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.213961 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.213980 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.214006 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.214026 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:43Z","lastTransitionTime":"2025-10-12T05:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.316912 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.316969 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.316987 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.317013 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.317031 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:43Z","lastTransitionTime":"2025-10-12T05:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.419072 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.419103 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.419112 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.419125 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.419135 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:43Z","lastTransitionTime":"2025-10-12T05:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.522227 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.522296 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.522318 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.522349 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.522373 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:43Z","lastTransitionTime":"2025-10-12T05:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.625601 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.625668 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.625688 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.625713 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.625732 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:43Z","lastTransitionTime":"2025-10-12T05:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.729502 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.729562 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.729580 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.729604 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.729622 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:43Z","lastTransitionTime":"2025-10-12T05:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.833045 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.833114 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.833132 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.833160 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.833179 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:43Z","lastTransitionTime":"2025-10-12T05:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.936520 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.936595 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.936612 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.936639 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:43 crc kubenswrapper[4930]: I1012 05:42:43.936680 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:43Z","lastTransitionTime":"2025-10-12T05:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.040446 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.040527 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.040550 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.040580 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.040602 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:44Z","lastTransitionTime":"2025-10-12T05:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.135282 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.135307 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.135433 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:44 crc kubenswrapper[4930]: E1012 05:42:44.135625 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.135652 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:44 crc kubenswrapper[4930]: E1012 05:42:44.135777 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:42:44 crc kubenswrapper[4930]: E1012 05:42:44.135990 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:42:44 crc kubenswrapper[4930]: E1012 05:42:44.136125 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.142554 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.142656 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.142686 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.142713 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.142777 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:44Z","lastTransitionTime":"2025-10-12T05:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.246082 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.246152 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.246170 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.246191 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.246207 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:44Z","lastTransitionTime":"2025-10-12T05:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.349895 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.349956 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.349973 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.350021 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.350042 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:44Z","lastTransitionTime":"2025-10-12T05:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.453526 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.453658 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.453709 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.453779 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.453805 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:44Z","lastTransitionTime":"2025-10-12T05:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.557218 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.557286 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.557305 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.557333 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.557350 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:44Z","lastTransitionTime":"2025-10-12T05:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.660178 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.660253 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.660279 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.660308 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.660329 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:44Z","lastTransitionTime":"2025-10-12T05:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.762964 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.763017 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.763035 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.763058 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.763075 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:44Z","lastTransitionTime":"2025-10-12T05:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.865846 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.865907 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.865923 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.865945 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.865963 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:44Z","lastTransitionTime":"2025-10-12T05:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.969537 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.969597 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.969618 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.969639 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:44 crc kubenswrapper[4930]: I1012 05:42:44.969658 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:44Z","lastTransitionTime":"2025-10-12T05:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.073618 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.073674 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.073695 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.073717 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.073759 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:45Z","lastTransitionTime":"2025-10-12T05:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.176251 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.176304 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.176328 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.176355 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.176378 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:45Z","lastTransitionTime":"2025-10-12T05:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.278941 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.279002 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.279020 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.279045 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.279064 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:45Z","lastTransitionTime":"2025-10-12T05:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.381871 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.381933 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.381953 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.381979 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.382004 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:45Z","lastTransitionTime":"2025-10-12T05:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.486413 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.486483 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.486506 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.486534 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.486555 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:45Z","lastTransitionTime":"2025-10-12T05:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.590099 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.590173 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.590197 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.590228 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.590250 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:45Z","lastTransitionTime":"2025-10-12T05:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.692835 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.692899 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.692923 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.692950 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.692976 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:45Z","lastTransitionTime":"2025-10-12T05:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.795777 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.795825 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.795850 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.795879 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.795901 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:45Z","lastTransitionTime":"2025-10-12T05:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.899280 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.899338 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.899357 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.899384 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:45 crc kubenswrapper[4930]: I1012 05:42:45.899405 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:45Z","lastTransitionTime":"2025-10-12T05:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.002654 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.002709 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.002725 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.002792 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.002820 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:46Z","lastTransitionTime":"2025-10-12T05:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.106518 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.106574 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.106592 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.106619 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.106640 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:46Z","lastTransitionTime":"2025-10-12T05:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.134312 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:46 crc kubenswrapper[4930]: E1012 05:42:46.134493 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.134532 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.134619 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:46 crc kubenswrapper[4930]: E1012 05:42:46.134829 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.134888 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:42:46 crc kubenswrapper[4930]: E1012 05:42:46.135267 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:42:46 crc kubenswrapper[4930]: E1012 05:42:46.135480 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.209252 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.209318 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.209342 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.209372 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.209395 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:46Z","lastTransitionTime":"2025-10-12T05:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.312473 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.312556 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.312574 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.312598 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.312614 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:46Z","lastTransitionTime":"2025-10-12T05:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.415952 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.415987 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.416004 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.416025 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.416042 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:46Z","lastTransitionTime":"2025-10-12T05:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.509024 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.509075 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.509091 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.509113 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.509129 4930 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-12T05:42:46Z","lastTransitionTime":"2025-10-12T05:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.578571 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-mbkw5"] Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.579891 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mbkw5" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.585359 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.585724 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.585935 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.590668 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.634635 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=72.634598634 podStartE2EDuration="1m12.634598634s" podCreationTimestamp="2025-10-12 05:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:42:46.614011239 +0000 UTC m=+99.156113044" watchObservedRunningTime="2025-10-12 05:42:46.634598634 +0000 UTC m=+99.176700439" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.665173 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-tq29s" podStartSLOduration=77.665145403 podStartE2EDuration="1m17.665145403s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:42:46.661915924 +0000 UTC m=+99.204017719" watchObservedRunningTime="2025-10-12 05:42:46.665145403 +0000 UTC m=+99.207247198" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.676564 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podStartSLOduration=77.676535052 podStartE2EDuration="1m17.676535052s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:42:46.676131143 +0000 UTC m=+99.218232948" watchObservedRunningTime="2025-10-12 05:42:46.676535052 +0000 UTC m=+99.218636857" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.688120 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1cc3e888-aaa6-46ec-9a2b-381a6103e95d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-mbkw5\" (UID: \"1cc3e888-aaa6-46ec-9a2b-381a6103e95d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mbkw5" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.688523 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cc3e888-aaa6-46ec-9a2b-381a6103e95d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-mbkw5\" (UID: \"1cc3e888-aaa6-46ec-9a2b-381a6103e95d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mbkw5" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 
05:42:46.688826 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1cc3e888-aaa6-46ec-9a2b-381a6103e95d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-mbkw5\" (UID: \"1cc3e888-aaa6-46ec-9a2b-381a6103e95d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mbkw5" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.689081 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1cc3e888-aaa6-46ec-9a2b-381a6103e95d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-mbkw5\" (UID: \"1cc3e888-aaa6-46ec-9a2b-381a6103e95d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mbkw5" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.689385 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cc3e888-aaa6-46ec-9a2b-381a6103e95d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-mbkw5\" (UID: \"1cc3e888-aaa6-46ec-9a2b-381a6103e95d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mbkw5" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.716913 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=44.716891142 podStartE2EDuration="44.716891142s" podCreationTimestamp="2025-10-12 05:42:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:42:46.69966501 +0000 UTC m=+99.241766815" watchObservedRunningTime="2025-10-12 05:42:46.716891142 +0000 UTC m=+99.258992947" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.717723 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=9.717690372 podStartE2EDuration="9.717690372s" podCreationTimestamp="2025-10-12 05:42:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:42:46.717261261 +0000 UTC m=+99.259363056" watchObservedRunningTime="2025-10-12 05:42:46.717690372 +0000 UTC m=+99.259792167" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.791015 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1cc3e888-aaa6-46ec-9a2b-381a6103e95d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-mbkw5\" (UID: \"1cc3e888-aaa6-46ec-9a2b-381a6103e95d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mbkw5" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.791082 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1cc3e888-aaa6-46ec-9a2b-381a6103e95d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-mbkw5\" (UID: \"1cc3e888-aaa6-46ec-9a2b-381a6103e95d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mbkw5" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.791152 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1cc3e888-aaa6-46ec-9a2b-381a6103e95d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-mbkw5\" (UID: \"1cc3e888-aaa6-46ec-9a2b-381a6103e95d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mbkw5" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.791232 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cc3e888-aaa6-46ec-9a2b-381a6103e95d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-mbkw5\" (UID: \"1cc3e888-aaa6-46ec-9a2b-381a6103e95d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mbkw5" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.791267 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1cc3e888-aaa6-46ec-9a2b-381a6103e95d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-mbkw5\" (UID: \"1cc3e888-aaa6-46ec-9a2b-381a6103e95d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mbkw5" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.791399 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1cc3e888-aaa6-46ec-9a2b-381a6103e95d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-mbkw5\" (UID: \"1cc3e888-aaa6-46ec-9a2b-381a6103e95d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mbkw5" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.791830 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1cc3e888-aaa6-46ec-9a2b-381a6103e95d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-mbkw5\" (UID: \"1cc3e888-aaa6-46ec-9a2b-381a6103e95d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mbkw5" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.792592 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1cc3e888-aaa6-46ec-9a2b-381a6103e95d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-mbkw5\" (UID: \"1cc3e888-aaa6-46ec-9a2b-381a6103e95d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mbkw5" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.799963 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cc3e888-aaa6-46ec-9a2b-381a6103e95d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-mbkw5\" (UID: \"1cc3e888-aaa6-46ec-9a2b-381a6103e95d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mbkw5" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.827899 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cc3e888-aaa6-46ec-9a2b-381a6103e95d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-mbkw5\" (UID: \"1cc3e888-aaa6-46ec-9a2b-381a6103e95d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mbkw5" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.856576 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=78.856545767 podStartE2EDuration="1m18.856545767s" podCreationTimestamp="2025-10-12 05:41:28 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:42:46.855626494 +0000 UTC m=+99.397728299" watchObservedRunningTime="2025-10-12 05:42:46.856545767 +0000 UTC m=+99.398647572" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.894441 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-vwttt" podStartSLOduration=77.894401075 podStartE2EDuration="1m17.894401075s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:42:46.893468142 +0000 UTC m=+99.435569957" watchObservedRunningTime="2025-10-12 05:42:46.894401075 +0000 UTC m=+99.436502880" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.901041 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mbkw5" Oct 12 05:42:46 crc kubenswrapper[4930]: W1012 05:42:46.925431 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cc3e888_aaa6_46ec_9a2b_381a6103e95d.slice/crio-95944e6882a897bad72f976b97f911fbccb382d6d7f4bb66f2e0b1626daefa08 WatchSource:0}: Error finding container 95944e6882a897bad72f976b97f911fbccb382d6d7f4bb66f2e0b1626daefa08: Status 404 returned error can't find the container with id 95944e6882a897bad72f976b97f911fbccb382d6d7f4bb66f2e0b1626daefa08 Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.973289 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-br2vl" podStartSLOduration=78.973262129 podStartE2EDuration="1m18.973262129s" podCreationTimestamp="2025-10-12 05:41:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:42:46.973118235 +0000 UTC m=+99.515220010" watchObservedRunningTime="2025-10-12 05:42:46.973262129 +0000 UTC m=+99.515363934" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.973814 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fjsw8" podStartSLOduration=77.973803632 podStartE2EDuration="1m17.973803632s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:42:46.95171476 +0000 UTC m=+99.493816555" watchObservedRunningTime="2025-10-12 05:42:46.973803632 +0000 UTC m=+99.515905437" Oct 12 05:42:46 crc kubenswrapper[4930]: I1012 05:42:46.987945 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jd2dw" podStartSLOduration=78.987927538 podStartE2EDuration="1m18.987927538s" podCreationTimestamp="2025-10-12 05:41:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:42:46.987504688 +0000 UTC m=+99.529606463" watchObservedRunningTime="2025-10-12 05:42:46.987927538 +0000 UTC m=+99.530029343" Oct 12 05:42:47 crc kubenswrapper[4930]: I1012 05:42:47.016050 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=79.016030348 podStartE2EDuration="1m19.016030348s" 
podCreationTimestamp="2025-10-12 05:41:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:42:47.014969152 +0000 UTC m=+99.557070987" watchObservedRunningTime="2025-10-12 05:42:47.016030348 +0000 UTC m=+99.558132113" Oct 12 05:42:47 crc kubenswrapper[4930]: I1012 05:42:47.796801 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mbkw5" event={"ID":"1cc3e888-aaa6-46ec-9a2b-381a6103e95d","Type":"ContainerStarted","Data":"dc18c3200b20b91631281b2cd36aa95784729e91269d4b3cfe6ab150664a81b8"} Oct 12 05:42:47 crc kubenswrapper[4930]: I1012 05:42:47.796869 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mbkw5" event={"ID":"1cc3e888-aaa6-46ec-9a2b-381a6103e95d","Type":"ContainerStarted","Data":"95944e6882a897bad72f976b97f911fbccb382d6d7f4bb66f2e0b1626daefa08"} Oct 12 05:42:47 crc kubenswrapper[4930]: I1012 05:42:47.820377 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mbkw5" podStartSLOduration=78.820347541 podStartE2EDuration="1m18.820347541s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:42:47.8182776 +0000 UTC m=+100.360379395" watchObservedRunningTime="2025-10-12 05:42:47.820347541 +0000 UTC m=+100.362449336" Oct 12 05:42:48 crc kubenswrapper[4930]: I1012 05:42:48.134897 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:48 crc kubenswrapper[4930]: I1012 05:42:48.135034 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:48 crc kubenswrapper[4930]: E1012 05:42:48.144005 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:42:48 crc kubenswrapper[4930]: I1012 05:42:48.144073 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:48 crc kubenswrapper[4930]: E1012 05:42:48.144174 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:42:48 crc kubenswrapper[4930]: E1012 05:42:48.144254 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:42:48 crc kubenswrapper[4930]: I1012 05:42:48.145084 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:42:48 crc kubenswrapper[4930]: E1012 05:42:48.145426 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:42:48 crc kubenswrapper[4930]: I1012 05:42:48.207414 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs\") pod \"network-metrics-daemon-7cjzn\" (UID: \"dda08509-105f-4935-a8a9-ff852e73c3ce\") " pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:48 crc kubenswrapper[4930]: E1012 05:42:48.207639 4930 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 05:42:48 crc kubenswrapper[4930]: E1012 05:42:48.207707 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs podName:dda08509-105f-4935-a8a9-ff852e73c3ce nodeName:}" failed. No retries permitted until 2025-10-12 05:43:52.207684509 +0000 UTC m=+164.749786314 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs") pod "network-metrics-daemon-7cjzn" (UID: "dda08509-105f-4935-a8a9-ff852e73c3ce") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 12 05:42:50 crc kubenswrapper[4930]: I1012 05:42:50.135123 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:50 crc kubenswrapper[4930]: E1012 05:42:50.135315 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:42:50 crc kubenswrapper[4930]: I1012 05:42:50.136394 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:42:50 crc kubenswrapper[4930]: I1012 05:42:50.136501 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:50 crc kubenswrapper[4930]: I1012 05:42:50.136527 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:50 crc kubenswrapper[4930]: E1012 05:42:50.137204 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:42:50 crc kubenswrapper[4930]: E1012 05:42:50.137296 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:42:50 crc kubenswrapper[4930]: E1012 05:42:50.137383 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:42:52 crc kubenswrapper[4930]: I1012 05:42:52.134965 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:52 crc kubenswrapper[4930]: I1012 05:42:52.135030 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:42:52 crc kubenswrapper[4930]: I1012 05:42:52.135045 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:52 crc kubenswrapper[4930]: I1012 05:42:52.135144 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:52 crc kubenswrapper[4930]: E1012 05:42:52.135139 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:42:52 crc kubenswrapper[4930]: E1012 05:42:52.135326 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:42:52 crc kubenswrapper[4930]: E1012 05:42:52.135434 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:42:52 crc kubenswrapper[4930]: E1012 05:42:52.135603 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:42:54 crc kubenswrapper[4930]: I1012 05:42:54.135112 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:54 crc kubenswrapper[4930]: I1012 05:42:54.135185 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:42:54 crc kubenswrapper[4930]: I1012 05:42:54.135203 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:54 crc kubenswrapper[4930]: I1012 05:42:54.135123 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:54 crc kubenswrapper[4930]: E1012 05:42:54.135351 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:42:54 crc kubenswrapper[4930]: E1012 05:42:54.135475 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:42:54 crc kubenswrapper[4930]: E1012 05:42:54.135647 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:42:54 crc kubenswrapper[4930]: E1012 05:42:54.136606 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:42:54 crc kubenswrapper[4930]: I1012 05:42:54.137112 4930 scope.go:117] "RemoveContainer" containerID="3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7" Oct 12 05:42:54 crc kubenswrapper[4930]: E1012 05:42:54.137346 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mdhw6_openshift-ovn-kubernetes(0add0fa2-092f-4dcc-8c72-82881564bf63)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" Oct 12 05:42:56 crc kubenswrapper[4930]: I1012 05:42:56.134982 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:56 crc kubenswrapper[4930]: I1012 05:42:56.135070 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:42:56 crc kubenswrapper[4930]: E1012 05:42:56.135162 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:42:56 crc kubenswrapper[4930]: I1012 05:42:56.135345 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:56 crc kubenswrapper[4930]: I1012 05:42:56.135705 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:56 crc kubenswrapper[4930]: E1012 05:42:56.135835 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:42:56 crc kubenswrapper[4930]: E1012 05:42:56.135933 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:42:56 crc kubenswrapper[4930]: E1012 05:42:56.136106 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:42:58 crc kubenswrapper[4930]: I1012 05:42:58.135470 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:42:58 crc kubenswrapper[4930]: I1012 05:42:58.135565 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:42:58 crc kubenswrapper[4930]: I1012 05:42:58.137818 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:42:58 crc kubenswrapper[4930]: I1012 05:42:58.137861 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:42:58 crc kubenswrapper[4930]: E1012 05:42:58.138046 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:42:58 crc kubenswrapper[4930]: E1012 05:42:58.138193 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:42:58 crc kubenswrapper[4930]: E1012 05:42:58.138483 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:42:58 crc kubenswrapper[4930]: E1012 05:42:58.138640 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:43:00 crc kubenswrapper[4930]: I1012 05:43:00.135273 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:43:00 crc kubenswrapper[4930]: I1012 05:43:00.135318 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:43:00 crc kubenswrapper[4930]: I1012 05:43:00.135326 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:43:00 crc kubenswrapper[4930]: E1012 05:43:00.135483 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:43:00 crc kubenswrapper[4930]: E1012 05:43:00.135660 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:43:00 crc kubenswrapper[4930]: I1012 05:43:00.135441 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:43:00 crc kubenswrapper[4930]: E1012 05:43:00.135902 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:43:00 crc kubenswrapper[4930]: E1012 05:43:00.136189 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:43:02 crc kubenswrapper[4930]: I1012 05:43:02.134997 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:43:02 crc kubenswrapper[4930]: I1012 05:43:02.135131 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:43:02 crc kubenswrapper[4930]: I1012 05:43:02.135298 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:43:02 crc kubenswrapper[4930]: I1012 05:43:02.135365 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:43:02 crc kubenswrapper[4930]: E1012 05:43:02.135525 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:43:02 crc kubenswrapper[4930]: E1012 05:43:02.135604 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:43:02 crc kubenswrapper[4930]: E1012 05:43:02.135704 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:43:02 crc kubenswrapper[4930]: E1012 05:43:02.135784 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:43:03 crc kubenswrapper[4930]: I1012 05:43:03.862268 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tq29s_c1c3ae9e-26ae-418f-b261-eabc4302b332/kube-multus/1.log" Oct 12 05:43:03 crc kubenswrapper[4930]: I1012 05:43:03.862880 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tq29s_c1c3ae9e-26ae-418f-b261-eabc4302b332/kube-multus/0.log" Oct 12 05:43:03 crc kubenswrapper[4930]: I1012 05:43:03.862929 4930 generic.go:334] "Generic (PLEG): container finished" podID="c1c3ae9e-26ae-418f-b261-eabc4302b332" containerID="6697bf685228a837d46450ea695e9579b3bda51686b6b5bcd8338e66757476cd" exitCode=1 Oct 12 05:43:03 crc kubenswrapper[4930]: I1012 05:43:03.862972 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tq29s" event={"ID":"c1c3ae9e-26ae-418f-b261-eabc4302b332","Type":"ContainerDied","Data":"6697bf685228a837d46450ea695e9579b3bda51686b6b5bcd8338e66757476cd"} Oct 12 05:43:03 crc kubenswrapper[4930]: I1012 05:43:03.863019 4930 scope.go:117] "RemoveContainer" containerID="cbe4d76b294d8c174b77355aae9ec55f4d38c04446fc048603a294d171332584" Oct 12 05:43:03 crc kubenswrapper[4930]: I1012 05:43:03.863632 4930 scope.go:117] "RemoveContainer" containerID="6697bf685228a837d46450ea695e9579b3bda51686b6b5bcd8338e66757476cd" Oct 12 05:43:03 crc kubenswrapper[4930]: E1012 05:43:03.863914 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-tq29s_openshift-multus(c1c3ae9e-26ae-418f-b261-eabc4302b332)\"" pod="openshift-multus/multus-tq29s" podUID="c1c3ae9e-26ae-418f-b261-eabc4302b332" Oct 12 05:43:04 crc kubenswrapper[4930]: I1012 05:43:04.134934 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:43:04 crc kubenswrapper[4930]: I1012 05:43:04.135025 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:43:04 crc kubenswrapper[4930]: I1012 05:43:04.135141 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:43:04 crc kubenswrapper[4930]: E1012 05:43:04.135139 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:43:04 crc kubenswrapper[4930]: E1012 05:43:04.135285 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:43:04 crc kubenswrapper[4930]: I1012 05:43:04.135365 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:43:04 crc kubenswrapper[4930]: E1012 05:43:04.135439 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:43:04 crc kubenswrapper[4930]: E1012 05:43:04.135600 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:43:04 crc kubenswrapper[4930]: I1012 05:43:04.869014 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tq29s_c1c3ae9e-26ae-418f-b261-eabc4302b332/kube-multus/1.log" Oct 12 05:43:05 crc kubenswrapper[4930]: I1012 05:43:05.136160 4930 scope.go:117] "RemoveContainer" containerID="3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7" Oct 12 05:43:05 crc kubenswrapper[4930]: E1012 05:43:05.136454 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mdhw6_openshift-ovn-kubernetes(0add0fa2-092f-4dcc-8c72-82881564bf63)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" Oct 12 05:43:06 crc kubenswrapper[4930]: I1012 05:43:06.134329 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:43:06 crc kubenswrapper[4930]: I1012 05:43:06.134361 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:43:06 crc kubenswrapper[4930]: I1012 05:43:06.134361 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:43:06 crc kubenswrapper[4930]: I1012 05:43:06.134476 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:43:06 crc kubenswrapper[4930]: E1012 05:43:06.134473 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:43:06 crc kubenswrapper[4930]: E1012 05:43:06.134608 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:43:06 crc kubenswrapper[4930]: E1012 05:43:06.134801 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:43:06 crc kubenswrapper[4930]: E1012 05:43:06.134695 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:43:08 crc kubenswrapper[4930]: I1012 05:43:08.149644 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:43:08 crc kubenswrapper[4930]: I1012 05:43:08.149766 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:43:08 crc kubenswrapper[4930]: I1012 05:43:08.149867 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:43:08 crc kubenswrapper[4930]: I1012 05:43:08.149893 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:43:08 crc kubenswrapper[4930]: E1012 05:43:08.150879 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:43:08 crc kubenswrapper[4930]: E1012 05:43:08.151130 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:43:08 crc kubenswrapper[4930]: E1012 05:43:08.151295 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:43:08 crc kubenswrapper[4930]: E1012 05:43:08.151342 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:43:08 crc kubenswrapper[4930]: E1012 05:43:08.164424 4930 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 12 05:43:08 crc kubenswrapper[4930]: E1012 05:43:08.253470 4930 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 12 05:43:10 crc kubenswrapper[4930]: I1012 05:43:10.135055 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:43:10 crc kubenswrapper[4930]: I1012 05:43:10.135070 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:43:10 crc kubenswrapper[4930]: I1012 05:43:10.135114 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:43:10 crc kubenswrapper[4930]: I1012 05:43:10.135253 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:43:10 crc kubenswrapper[4930]: E1012 05:43:10.135437 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:43:10 crc kubenswrapper[4930]: E1012 05:43:10.135842 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:43:10 crc kubenswrapper[4930]: E1012 05:43:10.136043 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:43:10 crc kubenswrapper[4930]: E1012 05:43:10.136184 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:43:12 crc kubenswrapper[4930]: I1012 05:43:12.134803 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:43:12 crc kubenswrapper[4930]: I1012 05:43:12.134891 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:43:12 crc kubenswrapper[4930]: I1012 05:43:12.134948 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:43:12 crc kubenswrapper[4930]: E1012 05:43:12.135023 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:43:12 crc kubenswrapper[4930]: I1012 05:43:12.135042 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:43:12 crc kubenswrapper[4930]: E1012 05:43:12.135156 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:43:12 crc kubenswrapper[4930]: E1012 05:43:12.135291 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:43:12 crc kubenswrapper[4930]: E1012 05:43:12.135442 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:43:13 crc kubenswrapper[4930]: E1012 05:43:13.255458 4930 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 12 05:43:14 crc kubenswrapper[4930]: I1012 05:43:14.134854 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:43:14 crc kubenswrapper[4930]: I1012 05:43:14.134866 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:43:14 crc kubenswrapper[4930]: I1012 05:43:14.134979 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:43:14 crc kubenswrapper[4930]: I1012 05:43:14.135030 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:43:14 crc kubenswrapper[4930]: E1012 05:43:14.135205 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:43:14 crc kubenswrapper[4930]: E1012 05:43:14.135311 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:43:14 crc kubenswrapper[4930]: E1012 05:43:14.135622 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:43:14 crc kubenswrapper[4930]: E1012 05:43:14.135821 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:43:16 crc kubenswrapper[4930]: I1012 05:43:16.134215 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:43:16 crc kubenswrapper[4930]: I1012 05:43:16.134263 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:43:16 crc kubenswrapper[4930]: I1012 05:43:16.134318 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:43:16 crc kubenswrapper[4930]: I1012 05:43:16.134218 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:43:16 crc kubenswrapper[4930]: E1012 05:43:16.134394 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:43:16 crc kubenswrapper[4930]: E1012 05:43:16.134456 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:43:16 crc kubenswrapper[4930]: E1012 05:43:16.134687 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:43:16 crc kubenswrapper[4930]: E1012 05:43:16.134838 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:43:17 crc kubenswrapper[4930]: I1012 05:43:17.135863 4930 scope.go:117] "RemoveContainer" containerID="6697bf685228a837d46450ea695e9579b3bda51686b6b5bcd8338e66757476cd" Oct 12 05:43:17 crc kubenswrapper[4930]: I1012 05:43:17.136629 4930 scope.go:117] "RemoveContainer" containerID="3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7" Oct 12 05:43:17 crc kubenswrapper[4930]: I1012 05:43:17.918575 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mdhw6_0add0fa2-092f-4dcc-8c72-82881564bf63/ovnkube-controller/3.log" Oct 12 05:43:17 crc kubenswrapper[4930]: I1012 05:43:17.922931 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerStarted","Data":"63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba"} Oct 12 05:43:17 crc kubenswrapper[4930]: I1012 05:43:17.924074 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:43:17 crc kubenswrapper[4930]: I1012 05:43:17.926556 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tq29s_c1c3ae9e-26ae-418f-b261-eabc4302b332/kube-multus/1.log" Oct 12 05:43:17 crc kubenswrapper[4930]: I1012 05:43:17.926592 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tq29s" event={"ID":"c1c3ae9e-26ae-418f-b261-eabc4302b332","Type":"ContainerStarted","Data":"b8fe6fb418f70bbc6c0032da29f42a0654018e861808de8b636b2a9170c51464"} Oct 12 05:43:17 crc kubenswrapper[4930]: I1012 05:43:17.973094 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" podStartSLOduration=108.973059818 podStartE2EDuration="1m48.973059818s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:17.972683399 +0000 UTC m=+130.514785174" watchObservedRunningTime="2025-10-12 05:43:17.973059818 +0000 UTC m=+130.515161603" Oct 12 05:43:18 crc kubenswrapper[4930]: I1012 05:43:18.135034 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:43:18 crc kubenswrapper[4930]: I1012 05:43:18.135039 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:43:18 crc kubenswrapper[4930]: I1012 05:43:18.135091 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:43:18 crc kubenswrapper[4930]: E1012 05:43:18.135884 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:43:18 crc kubenswrapper[4930]: I1012 05:43:18.135946 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:43:18 crc kubenswrapper[4930]: E1012 05:43:18.136114 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:43:18 crc kubenswrapper[4930]: E1012 05:43:18.136162 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:43:18 crc kubenswrapper[4930]: E1012 05:43:18.136235 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:43:18 crc kubenswrapper[4930]: I1012 05:43:18.149815 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7cjzn"] Oct 12 05:43:18 crc kubenswrapper[4930]: E1012 05:43:18.256526 4930 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 12 05:43:18 crc kubenswrapper[4930]: I1012 05:43:18.930759 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:43:18 crc kubenswrapper[4930]: E1012 05:43:18.930949 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:43:20 crc kubenswrapper[4930]: I1012 05:43:20.135116 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:43:20 crc kubenswrapper[4930]: E1012 05:43:20.135306 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:43:20 crc kubenswrapper[4930]: I1012 05:43:20.135592 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:43:20 crc kubenswrapper[4930]: E1012 05:43:20.135678 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:43:20 crc kubenswrapper[4930]: I1012 05:43:20.135943 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:43:20 crc kubenswrapper[4930]: E1012 05:43:20.136039 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:43:20 crc kubenswrapper[4930]: I1012 05:43:20.136229 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:43:20 crc kubenswrapper[4930]: E1012 05:43:20.136324 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:43:22 crc kubenswrapper[4930]: I1012 05:43:22.134806 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:43:22 crc kubenswrapper[4930]: I1012 05:43:22.134936 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:43:22 crc kubenswrapper[4930]: E1012 05:43:22.135004 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7cjzn" podUID="dda08509-105f-4935-a8a9-ff852e73c3ce" Oct 12 05:43:22 crc kubenswrapper[4930]: I1012 05:43:22.135127 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:43:22 crc kubenswrapper[4930]: E1012 05:43:22.135339 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 12 05:43:22 crc kubenswrapper[4930]: E1012 05:43:22.135480 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 12 05:43:22 crc kubenswrapper[4930]: I1012 05:43:22.135533 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:43:22 crc kubenswrapper[4930]: E1012 05:43:22.135624 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 12 05:43:24 crc kubenswrapper[4930]: I1012 05:43:24.134879 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:43:24 crc kubenswrapper[4930]: I1012 05:43:24.134944 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:43:24 crc kubenswrapper[4930]: I1012 05:43:24.134898 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:43:24 crc kubenswrapper[4930]: I1012 05:43:24.136227 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:43:24 crc kubenswrapper[4930]: I1012 05:43:24.139914 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 12 05:43:24 crc kubenswrapper[4930]: I1012 05:43:24.140589 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 12 05:43:24 crc kubenswrapper[4930]: I1012 05:43:24.141438 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 12 05:43:24 crc kubenswrapper[4930]: I1012 05:43:24.142102 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 12 05:43:24 crc kubenswrapper[4930]: I1012 05:43:24.142406 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 12 05:43:24 crc kubenswrapper[4930]: I1012 05:43:24.145288 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.679598 4930 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.734468 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-x89kc"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.735195 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x89kc" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.738685 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.739915 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-t48rv"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.740712 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.741513 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.742353 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6gz2l"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.742378 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.743483 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.745088 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-zkzbp"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.745693 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-zkzbp" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.756697 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.757334 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.759312 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.759820 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.760621 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4qrq"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.761147 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.761599 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.761690 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.762060 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.762078 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4qrq" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.763005 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.763048 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.763096 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.763016 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.763024 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.763688 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.764136 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.764588 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.764998 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.765386 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.765537 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.766409 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.767262 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.766889 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.766947 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.768117 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.768311 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.768499 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.770473 4930 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lmr9n"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.771372 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lmr9n" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.772445 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.774804 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-ztn69"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.780376 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xctd8"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.774959 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.780775 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rg6mp"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.775448 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.781039 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-ztn69" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.781606 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.781785 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xctd8" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.777223 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.777416 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.777485 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.778725 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.778815 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.779279 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.779402 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.781614 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-rg6mp" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.779444 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.813437 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.816215 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.816664 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.799719 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-b6q56"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.818527 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9m6tj"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.818850 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7xzpm"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.818937 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.818975 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.819114 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.819230 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7xzpm" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.819272 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.819380 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.819534 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9m6tj" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.820376 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.822223 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gbqmf"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.822762 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.824452 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.827646 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.828149 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.828388 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.828583 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.829156 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.829282 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.829395 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.830111 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.831482 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.831485 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.831530 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.831543 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.831564 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.832446 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.834790 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-zxqcl"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.835409 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-zxqcl" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.835712 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.836425 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.837060 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rl2fl"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.837349 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rl2fl" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.837457 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.837982 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.838907 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.839276 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.839337 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.839398 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.839506 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-t48rv"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.839527 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.839833 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.839874 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.839950 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.839878 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.839966 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.839399 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.840430 4930 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.840449 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.840799 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.840956 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.841080 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.841227 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.842481 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.842790 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.843062 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.844668 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.845524 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.846641 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-hbgd8"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.847302 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.847327 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9h9hz"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.847362 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.847553 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.847617 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lmr9n"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.847800 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-hbgd8" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.847836 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-mdwcw"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.847919 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9h9hz" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.848633 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-mdwcw" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.855127 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fmt5q"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.875903 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.886472 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.889191 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.889462 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.889610 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.889693 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.889810 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.890910 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62f81c40-00a5-41d6-978a-a12cd3878495-serving-cert\") pod \"authentication-operator-69f744f599-zkzbp\" (UID: \"62f81c40-00a5-41d6-978a-a12cd3878495\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zkzbp" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.891014 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/14ec6d5d-4697-48f0-be34-aa3c028fc8a4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lmr9n\" (UID: \"14ec6d5d-4697-48f0-be34-aa3c028fc8a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lmr9n" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.891093 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79650663-cd46-4fef-a731-dfc45f8fa945-client-ca\") pod \"controller-manager-879f6c89f-t48rv\" (UID: \"79650663-cd46-4fef-a731-dfc45f8fa945\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.891169 4930 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th5xb\" (UniqueName: \"kubernetes.io/projected/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-kube-api-access-th5xb\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.891251 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk2rs\" (UniqueName: \"kubernetes.io/projected/6c179d5f-d7dd-42b9-b248-cd3c34237961-kube-api-access-gk2rs\") pod \"downloads-7954f5f757-ztn69\" (UID: \"6c179d5f-d7dd-42b9-b248-cd3c34237961\") " pod="openshift-console/downloads-7954f5f757-ztn69" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.891318 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-encryption-config\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.891380 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q99cr\" (UniqueName: \"kubernetes.io/projected/79650663-cd46-4fef-a731-dfc45f8fa945-kube-api-access-q99cr\") pod \"controller-manager-879f6c89f-t48rv\" (UID: \"79650663-cd46-4fef-a731-dfc45f8fa945\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.891442 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd39a673-35f1-435a-bd8d-02b253c12f1c-serving-cert\") pod \"route-controller-manager-6576b87f9c-cqctf\" (UID: \"cd39a673-35f1-435a-bd8d-02b253c12f1c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.891510 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62f81c40-00a5-41d6-978a-a12cd3878495-service-ca-bundle\") pod \"authentication-operator-69f744f599-zkzbp\" (UID: \"62f81c40-00a5-41d6-978a-a12cd3878495\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zkzbp" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.891580 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ce83b79-8e6f-4188-a019-30399c8367f7-trusted-ca\") pod \"console-operator-58897d9998-rg6mp\" (UID: \"9ce83b79-8e6f-4188-a019-30399c8367f7\") " pod="openshift-console-operator/console-operator-58897d9998-rg6mp" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.891649 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ce83b79-8e6f-4188-a019-30399c8367f7-serving-cert\") pod \"console-operator-58897d9998-rg6mp\" (UID: \"9ce83b79-8e6f-4188-a019-30399c8367f7\") " pod="openshift-console-operator/console-operator-58897d9998-rg6mp" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.891715 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-node-pullsecrets\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.891808 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f0c1ca3-f083-49b3-936d-84311a33c5a3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xctd8\" (UID: \"8f0c1ca3-f083-49b3-936d-84311a33c5a3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xctd8" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.891901 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-serving-cert\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.891981 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-audit\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.892044 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8f0c1ca3-f083-49b3-936d-84311a33c5a3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xctd8\" (UID: \"8f0c1ca3-f083-49b3-936d-84311a33c5a3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xctd8" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.892113 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9ed0c781-82fd-4490-adcb-0c03fff3a7c8-machine-approver-tls\") pod \"machine-approver-56656f9798-x89kc\" (UID: \"9ed0c781-82fd-4490-adcb-0c03fff3a7c8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x89kc" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.892213 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62f81c40-00a5-41d6-978a-a12cd3878495-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-zkzbp\" (UID: \"62f81c40-00a5-41d6-978a-a12cd3878495\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zkzbp" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.892286 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f0c1ca3-f083-49b3-936d-84311a33c5a3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xctd8\" (UID: \"8f0c1ca3-f083-49b3-936d-84311a33c5a3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xctd8" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.892354 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tlx2z\" (UniqueName: \"kubernetes.io/projected/9ce83b79-8e6f-4188-a019-30399c8367f7-kube-api-access-tlx2z\") pod \"console-operator-58897d9998-rg6mp\" (UID: \"9ce83b79-8e6f-4188-a019-30399c8367f7\") " pod="openshift-console-operator/console-operator-58897d9998-rg6mp" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.892420 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/882d387b-cc67-4a31-b25a-8b4197bd42f2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-m4qrq\" (UID: \"882d387b-cc67-4a31-b25a-8b4197bd42f2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4qrq" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.892487 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hvbg\" (UniqueName: \"kubernetes.io/projected/14ec6d5d-4697-48f0-be34-aa3c028fc8a4-kube-api-access-5hvbg\") pod \"openshift-config-operator-7777fb866f-lmr9n\" (UID: \"14ec6d5d-4697-48f0-be34-aa3c028fc8a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lmr9n" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.892554 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-etcd-serving-ca\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.892617 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79650663-cd46-4fef-a731-dfc45f8fa945-serving-cert\") pod \"controller-manager-879f6c89f-t48rv\" (UID: \"79650663-cd46-4fef-a731-dfc45f8fa945\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.892686 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/882d387b-cc67-4a31-b25a-8b4197bd42f2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-m4qrq\" (UID: \"882d387b-cc67-4a31-b25a-8b4197bd42f2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4qrq" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.892766 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4srb\" (UniqueName: \"kubernetes.io/projected/9ed0c781-82fd-4490-adcb-0c03fff3a7c8-kube-api-access-t4srb\") pod \"machine-approver-56656f9798-x89kc\" (UID: \"9ed0c781-82fd-4490-adcb-0c03fff3a7c8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x89kc" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.892840 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdxhp\" (UniqueName: \"kubernetes.io/projected/62f81c40-00a5-41d6-978a-a12cd3878495-kube-api-access-tdxhp\") pod \"authentication-operator-69f744f599-zkzbp\" (UID: \"62f81c40-00a5-41d6-978a-a12cd3878495\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zkzbp" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.892909 4930 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mhct\" (UniqueName: \"kubernetes.io/projected/cd39a673-35f1-435a-bd8d-02b253c12f1c-kube-api-access-8mhct\") pod \"route-controller-manager-6576b87f9c-cqctf\" (UID: \"cd39a673-35f1-435a-bd8d-02b253c12f1c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.892976 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.893044 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd39a673-35f1-435a-bd8d-02b253c12f1c-client-ca\") pod \"route-controller-manager-6576b87f9c-cqctf\" (UID: \"cd39a673-35f1-435a-bd8d-02b253c12f1c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.893111 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t49ts\" (UniqueName: \"kubernetes.io/projected/882d387b-cc67-4a31-b25a-8b4197bd42f2-kube-api-access-t49ts\") pod \"openshift-apiserver-operator-796bbdcf4f-m4qrq\" (UID: \"882d387b-cc67-4a31-b25a-8b4197bd42f2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4qrq" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.893177 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-config\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.893283 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed0c781-82fd-4490-adcb-0c03fff3a7c8-config\") pod \"machine-approver-56656f9798-x89kc\" (UID: \"9ed0c781-82fd-4490-adcb-0c03fff3a7c8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x89kc" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.893354 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9ed0c781-82fd-4490-adcb-0c03fff3a7c8-auth-proxy-config\") pod \"machine-approver-56656f9798-x89kc\" (UID: \"9ed0c781-82fd-4490-adcb-0c03fff3a7c8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x89kc" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.893422 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd39a673-35f1-435a-bd8d-02b253c12f1c-config\") pod \"route-controller-manager-6576b87f9c-cqctf\" (UID: \"cd39a673-35f1-435a-bd8d-02b253c12f1c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.893486 4930 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gb6w\" (UniqueName: \"kubernetes.io/projected/8f0c1ca3-f083-49b3-936d-84311a33c5a3-kube-api-access-9gb6w\") pod \"cluster-image-registry-operator-dc59b4c8b-xctd8\" (UID: \"8f0c1ca3-f083-49b3-936d-84311a33c5a3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xctd8" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.893554 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ce83b79-8e6f-4188-a019-30399c8367f7-config\") pod \"console-operator-58897d9998-rg6mp\" (UID: \"9ce83b79-8e6f-4188-a019-30399c8367f7\") " pod="openshift-console-operator/console-operator-58897d9998-rg6mp" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.893633 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-image-import-ca\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.893702 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/79650663-cd46-4fef-a731-dfc45f8fa945-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-t48rv\" (UID: \"79650663-cd46-4fef-a731-dfc45f8fa945\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.893796 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14ec6d5d-4697-48f0-be34-aa3c028fc8a4-serving-cert\") pod \"openshift-config-operator-7777fb866f-lmr9n\" (UID: \"14ec6d5d-4697-48f0-be34-aa3c028fc8a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lmr9n" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.893860 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79650663-cd46-4fef-a731-dfc45f8fa945-config\") pod \"controller-manager-879f6c89f-t48rv\" (UID: \"79650663-cd46-4fef-a731-dfc45f8fa945\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.893925 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-audit-dir\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.893998 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-etcd-client\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.894063 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/62f81c40-00a5-41d6-978a-a12cd3878495-config\") pod \"authentication-operator-69f744f599-zkzbp\" (UID: \"62f81c40-00a5-41d6-978a-a12cd3878495\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zkzbp" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.891088 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.891625 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-w9zwj"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.895079 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dprns"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.891695 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fmt5q" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.895548 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bzqf2"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.895660 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-w9zwj" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.895750 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dprns" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.896193 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6r5ch"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.896457 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-p5m42"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.896932 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-p5m42" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.897095 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6r5ch" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.897206 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.898921 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.899406 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.899799 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.899980 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.900075 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.902904 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ncmk8"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.903507 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-zxqcl"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.903660 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ncmk8" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.904456 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.905100 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.905370 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rl2fl"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.909490 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.913500 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-47d6m"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.914043 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-47d6m" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.916024 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-hbgd8"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.916551 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gbqmf"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.917714 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4qrq"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.920361 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.924623 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xctd8"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.926359 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7xzpm"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.926453 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6gz2l"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.928365 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rg6mp"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.929053 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ztn69"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.930058 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9m6tj"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.930963 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2lc4l"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.932579 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-zkzbp"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.932639 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2lc4l" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.932864 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-psfdl"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.933284 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-psfdl" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.933728 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-d6mxh"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.934171 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d6mxh" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.934701 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337450-9g4rr"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.935569 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337450-9g4rr" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.936302 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lk89q"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.936782 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.937207 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lk89q" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.937469 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wdpcs"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.937999 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wdpcs" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.938783 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vd458"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.940177 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-968tr"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.940676 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-968tr" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.940822 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-92ngq"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.940968 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vd458" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.941459 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-92ngq" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.942918 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5v7hl"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.943529 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5v7hl" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.943903 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bs29k"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.944569 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bs29k" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.947241 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-phltz"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.949771 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-phltz" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.950266 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fmt5q"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.951800 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-p5m42"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.954426 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.957308 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.957785 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dprns"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.958846 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-d6mxh"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.960400 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9h9hz"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.962575 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-w9zwj"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.964043 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-psfdl"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.965415 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6r5ch"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.970504 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vd458"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.970858 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-25pm7"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.972030 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-25pm7" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.972455 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-g8lk2"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.973506 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-phltz"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.973598 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-g8lk2" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.975150 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-b6q56"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.975653 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bzqf2"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.976785 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.976949 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337450-9g4rr"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.977642 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2lc4l"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.978859 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ncmk8"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.981653 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-25pm7"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.981699 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5v7hl"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.983146 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-47d6m"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.984984 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-968tr"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.986561 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-g8lk2"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.988129 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bs29k"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.989754 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lk89q"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.991268 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wdpcs"] Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.994999 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t49ts\" (UniqueName: \"kubernetes.io/projected/882d387b-cc67-4a31-b25a-8b4197bd42f2-kube-api-access-t49ts\") pod \"openshift-apiserver-operator-796bbdcf4f-m4qrq\" (UID: \"882d387b-cc67-4a31-b25a-8b4197bd42f2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4qrq" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.995074 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-config\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:27 
crc kubenswrapper[4930]: I1012 05:43:27.995103 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.995152 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjhcj\" (UniqueName: \"kubernetes.io/projected/805f5605-a80f-40be-aaa2-5cedbc94960f-kube-api-access-kjhcj\") pod \"cluster-samples-operator-665b6dd947-9m6tj\" (UID: \"805f5605-a80f-40be-aaa2-5cedbc94960f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9m6tj" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.995703 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-config\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.996189 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.996262 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f086907d-7c4a-488b-a783-bc24e827e5e6-etcd-client\") pod \"etcd-operator-b45778765-w9zwj\" (UID: \"f086907d-7c4a-488b-a783-bc24e827e5e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w9zwj" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.996335 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd39a673-35f1-435a-bd8d-02b253c12f1c-client-ca\") pod \"route-controller-manager-6576b87f9c-cqctf\" (UID: \"cd39a673-35f1-435a-bd8d-02b253c12f1c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.996397 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.997110 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed0c781-82fd-4490-adcb-0c03fff3a7c8-config\") pod \"machine-approver-56656f9798-x89kc\" (UID: \"9ed0c781-82fd-4490-adcb-0c03fff3a7c8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x89kc" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.997289 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd39a673-35f1-435a-bd8d-02b253c12f1c-client-ca\") pod \"route-controller-manager-6576b87f9c-cqctf\" (UID: \"cd39a673-35f1-435a-bd8d-02b253c12f1c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.997337 4930 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed0c781-82fd-4490-adcb-0c03fff3a7c8-config\") pod \"machine-approver-56656f9798-x89kc\" (UID: \"9ed0c781-82fd-4490-adcb-0c03fff3a7c8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x89kc" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.997391 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcac2427-8311-481a-85f6-4c9b96d3bbe2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7xzpm\" (UID: \"bcac2427-8311-481a-85f6-4c9b96d3bbe2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7xzpm" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.997768 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69e0e716-3ae8-482b-bb9e-57f6f5d859cd-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fmt5q\" (UID: \"69e0e716-3ae8-482b-bb9e-57f6f5d859cd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fmt5q" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.997801 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/93aad091-5fd5-4eb5-a123-b7932dc268fe-stats-auth\") pod \"router-default-5444994796-mdwcw\" (UID: \"93aad091-5fd5-4eb5-a123-b7932dc268fe\") " pod="openshift-ingress/router-default-5444994796-mdwcw" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.997827 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9ed0c781-82fd-4490-adcb-0c03fff3a7c8-auth-proxy-config\") pod \"machine-approver-56656f9798-x89kc\" (UID: \"9ed0c781-82fd-4490-adcb-0c03fff3a7c8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x89kc" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.997850 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd39a673-35f1-435a-bd8d-02b253c12f1c-config\") pod \"route-controller-manager-6576b87f9c-cqctf\" (UID: \"cd39a673-35f1-435a-bd8d-02b253c12f1c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.997903 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gb6w\" (UniqueName: \"kubernetes.io/projected/8f0c1ca3-f083-49b3-936d-84311a33c5a3-kube-api-access-9gb6w\") pod \"cluster-image-registry-operator-dc59b4c8b-xctd8\" (UID: \"8f0c1ca3-f083-49b3-936d-84311a33c5a3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xctd8" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.997936 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ce83b79-8e6f-4188-a019-30399c8367f7-config\") pod \"console-operator-58897d9998-rg6mp\" (UID: \"9ce83b79-8e6f-4188-a019-30399c8367f7\") " pod="openshift-console-operator/console-operator-58897d9998-rg6mp" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.997967 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" 
(UniqueName: \"kubernetes.io/secret/93aad091-5fd5-4eb5-a123-b7932dc268fe-default-certificate\") pod \"router-default-5444994796-mdwcw\" (UID: \"93aad091-5fd5-4eb5-a123-b7932dc268fe\") " pod="openshift-ingress/router-default-5444994796-mdwcw" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998012 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74488d89-3c1c-4e78-9c26-35be09ac8cde-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-dprns\" (UID: \"74488d89-3c1c-4e78-9c26-35be09ac8cde\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dprns" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998039 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98fa14b7-a351-47ab-bcd2-83e2a81f9859-serving-cert\") pod \"apiserver-7bbb656c7d-wsqp8\" (UID: \"98fa14b7-a351-47ab-bcd2-83e2a81f9859\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998063 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98fa14b7-a351-47ab-bcd2-83e2a81f9859-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wsqp8\" (UID: \"98fa14b7-a351-47ab-bcd2-83e2a81f9859\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998087 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93aad091-5fd5-4eb5-a123-b7932dc268fe-metrics-certs\") pod \"router-default-5444994796-mdwcw\" (UID: \"93aad091-5fd5-4eb5-a123-b7932dc268fe\") " pod="openshift-ingress/router-default-5444994796-mdwcw" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998116 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-image-import-ca\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998153 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/805f5605-a80f-40be-aaa2-5cedbc94960f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9m6tj\" (UID: \"805f5605-a80f-40be-aaa2-5cedbc94960f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9m6tj" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998180 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/98fa14b7-a351-47ab-bcd2-83e2a81f9859-audit-dir\") pod \"apiserver-7bbb656c7d-wsqp8\" (UID: \"98fa14b7-a351-47ab-bcd2-83e2a81f9859\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998237 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14ec6d5d-4697-48f0-be34-aa3c028fc8a4-serving-cert\") pod 
\"openshift-config-operator-7777fb866f-lmr9n\" (UID: \"14ec6d5d-4697-48f0-be34-aa3c028fc8a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lmr9n" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998268 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79650663-cd46-4fef-a731-dfc45f8fa945-config\") pod \"controller-manager-879f6c89f-t48rv\" (UID: \"79650663-cd46-4fef-a731-dfc45f8fa945\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998295 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/79650663-cd46-4fef-a731-dfc45f8fa945-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-t48rv\" (UID: \"79650663-cd46-4fef-a731-dfc45f8fa945\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998320 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/54c34701-7d60-4396-9a64-81b91379fbe9-images\") pod \"machine-api-operator-5694c8668f-zxqcl\" (UID: \"54c34701-7d60-4396-9a64-81b91379fbe9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zxqcl" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998346 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr4mn\" (UniqueName: \"kubernetes.io/projected/02cd3769-c022-4645-b33e-eb1303133aea-kube-api-access-gr4mn\") pod \"kube-storage-version-migrator-operator-b67b599dd-ncmk8\" (UID: \"02cd3769-c022-4645-b33e-eb1303133aea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ncmk8" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998386 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsxnw\" (UniqueName: \"kubernetes.io/projected/e9baf15f-27e7-442f-98d8-fdb29719ac71-kube-api-access-wsxnw\") pod \"service-ca-operator-777779d784-47d6m\" (UID: \"e9baf15f-27e7-442f-98d8-fdb29719ac71\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-47d6m" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998425 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-etcd-client\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998454 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-audit-dir\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998479 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hnkb\" (UniqueName: \"kubernetes.io/projected/54c34701-7d60-4396-9a64-81b91379fbe9-kube-api-access-6hnkb\") pod \"machine-api-operator-5694c8668f-zxqcl\" (UID: \"54c34701-7d60-4396-9a64-81b91379fbe9\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-zxqcl" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998503 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/98fa14b7-a351-47ab-bcd2-83e2a81f9859-etcd-client\") pod \"apiserver-7bbb656c7d-wsqp8\" (UID: \"98fa14b7-a351-47ab-bcd2-83e2a81f9859\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998540 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62f81c40-00a5-41d6-978a-a12cd3878495-config\") pod \"authentication-operator-69f744f599-zkzbp\" (UID: \"62f81c40-00a5-41d6-978a-a12cd3878495\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zkzbp" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998586 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62f81c40-00a5-41d6-978a-a12cd3878495-serving-cert\") pod \"authentication-operator-69f744f599-zkzbp\" (UID: \"62f81c40-00a5-41d6-978a-a12cd3878495\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zkzbp" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998613 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/98fa14b7-a351-47ab-bcd2-83e2a81f9859-encryption-config\") pod \"apiserver-7bbb656c7d-wsqp8\" (UID: \"98fa14b7-a351-47ab-bcd2-83e2a81f9859\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998636 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjnd4\" (UniqueName: \"kubernetes.io/projected/98fa14b7-a351-47ab-bcd2-83e2a81f9859-kube-api-access-kjnd4\") pod \"apiserver-7bbb656c7d-wsqp8\" (UID: \"98fa14b7-a351-47ab-bcd2-83e2a81f9859\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998665 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-969tm\" (UniqueName: \"kubernetes.io/projected/f086907d-7c4a-488b-a783-bc24e827e5e6-kube-api-access-969tm\") pod \"etcd-operator-b45778765-w9zwj\" (UID: \"f086907d-7c4a-488b-a783-bc24e827e5e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w9zwj" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998687 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/14ec6d5d-4697-48f0-be34-aa3c028fc8a4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lmr9n\" (UID: \"14ec6d5d-4697-48f0-be34-aa3c028fc8a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lmr9n" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998710 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bfadb73-bbc1-41db-8374-9c3aaf00682d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9h9hz\" (UID: \"0bfadb73-bbc1-41db-8374-9c3aaf00682d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9h9hz" Oct 12 
05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998731 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8ns4\" (UniqueName: \"kubernetes.io/projected/eb977a38-ef3b-4820-a364-ad16d6c857d5-kube-api-access-l8ns4\") pod \"console-f9d7485db-b6q56\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998770 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/69e0e716-3ae8-482b-bb9e-57f6f5d859cd-metrics-tls\") pod \"ingress-operator-5b745b69d9-fmt5q\" (UID: \"69e0e716-3ae8-482b-bb9e-57f6f5d859cd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fmt5q" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998792 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9baf15f-27e7-442f-98d8-fdb29719ac71-config\") pod \"service-ca-operator-777779d784-47d6m\" (UID: \"e9baf15f-27e7-442f-98d8-fdb29719ac71\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-47d6m" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998812 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcrg2\" (UniqueName: \"kubernetes.io/projected/93aad091-5fd5-4eb5-a123-b7932dc268fe-kube-api-access-qcrg2\") pod \"router-default-5444994796-mdwcw\" (UID: \"93aad091-5fd5-4eb5-a123-b7932dc268fe\") " pod="openshift-ingress/router-default-5444994796-mdwcw" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998839 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb977a38-ef3b-4820-a364-ad16d6c857d5-console-config\") pod \"console-f9d7485db-b6q56\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998864 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb977a38-ef3b-4820-a364-ad16d6c857d5-oauth-serving-cert\") pod \"console-f9d7485db-b6q56\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998887 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79650663-cd46-4fef-a731-dfc45f8fa945-client-ca\") pod \"controller-manager-879f6c89f-t48rv\" (UID: \"79650663-cd46-4fef-a731-dfc45f8fa945\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998911 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th5xb\" (UniqueName: \"kubernetes.io/projected/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-kube-api-access-th5xb\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998933 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/98fa14b7-a351-47ab-bcd2-83e2a81f9859-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wsqp8\" (UID: \"98fa14b7-a351-47ab-bcd2-83e2a81f9859\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998960 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-encryption-config\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.998983 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb977a38-ef3b-4820-a364-ad16d6c857d5-console-serving-cert\") pod \"console-f9d7485db-b6q56\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.999005 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb977a38-ef3b-4820-a364-ad16d6c857d5-trusted-ca-bundle\") pod \"console-f9d7485db-b6q56\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.999029 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e4e4971a-01da-4555-b3a5-2ab4a9118163-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-rl2fl\" (UID: \"e4e4971a-01da-4555-b3a5-2ab4a9118163\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rl2fl" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.999055 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk2rs\" (UniqueName: \"kubernetes.io/projected/6c179d5f-d7dd-42b9-b248-cd3c34237961-kube-api-access-gk2rs\") pod \"downloads-7954f5f757-ztn69\" (UID: \"6c179d5f-d7dd-42b9-b248-cd3c34237961\") " pod="openshift-console/downloads-7954f5f757-ztn69" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.999077 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q99cr\" (UniqueName: \"kubernetes.io/projected/79650663-cd46-4fef-a731-dfc45f8fa945-kube-api-access-q99cr\") pod \"controller-manager-879f6c89f-t48rv\" (UID: \"79650663-cd46-4fef-a731-dfc45f8fa945\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.999100 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd39a673-35f1-435a-bd8d-02b253c12f1c-serving-cert\") pod \"route-controller-manager-6576b87f9c-cqctf\" (UID: \"cd39a673-35f1-435a-bd8d-02b253c12f1c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.999122 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54c34701-7d60-4396-9a64-81b91379fbe9-config\") pod \"machine-api-operator-5694c8668f-zxqcl\" (UID: \"54c34701-7d60-4396-9a64-81b91379fbe9\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-zxqcl" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.999151 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62f81c40-00a5-41d6-978a-a12cd3878495-service-ca-bundle\") pod \"authentication-operator-69f744f599-zkzbp\" (UID: \"62f81c40-00a5-41d6-978a-a12cd3878495\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zkzbp" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.999173 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bfadb73-bbc1-41db-8374-9c3aaf00682d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9h9hz\" (UID: \"0bfadb73-bbc1-41db-8374-9c3aaf00682d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9h9hz" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.999201 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ce83b79-8e6f-4188-a019-30399c8367f7-serving-cert\") pod \"console-operator-58897d9998-rg6mp\" (UID: \"9ce83b79-8e6f-4188-a019-30399c8367f7\") " pod="openshift-console-operator/console-operator-58897d9998-rg6mp" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.999224 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ce83b79-8e6f-4188-a019-30399c8367f7-trusted-ca\") pod \"console-operator-58897d9998-rg6mp\" (UID: \"9ce83b79-8e6f-4188-a019-30399c8367f7\") " pod="openshift-console-operator/console-operator-58897d9998-rg6mp" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.999246 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02cd3769-c022-4645-b33e-eb1303133aea-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ncmk8\" (UID: \"02cd3769-c022-4645-b33e-eb1303133aea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ncmk8" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.999269 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69e0e716-3ae8-482b-bb9e-57f6f5d859cd-trusted-ca\") pod \"ingress-operator-5b745b69d9-fmt5q\" (UID: \"69e0e716-3ae8-482b-bb9e-57f6f5d859cd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fmt5q" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.999294 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02cd3769-c022-4645-b33e-eb1303133aea-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ncmk8\" (UID: \"02cd3769-c022-4645-b33e-eb1303133aea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ncmk8" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.999316 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxlks\" (UniqueName: \"kubernetes.io/projected/69e0e716-3ae8-482b-bb9e-57f6f5d859cd-kube-api-access-wxlks\") pod \"ingress-operator-5b745b69d9-fmt5q\" (UID: 
\"69e0e716-3ae8-482b-bb9e-57f6f5d859cd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fmt5q" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.999342 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74488d89-3c1c-4e78-9c26-35be09ac8cde-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-dprns\" (UID: \"74488d89-3c1c-4e78-9c26-35be09ac8cde\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dprns" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.999365 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74488d89-3c1c-4e78-9c26-35be09ac8cde-config\") pod \"kube-controller-manager-operator-78b949d7b-dprns\" (UID: \"74488d89-3c1c-4e78-9c26-35be09ac8cde\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dprns" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.999391 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-node-pullsecrets\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:27 crc kubenswrapper[4930]: I1012 05:43:27.999412 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/54c34701-7d60-4396-9a64-81b91379fbe9-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-zxqcl\" (UID: \"54c34701-7d60-4396-9a64-81b91379fbe9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zxqcl" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:27.999438 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f0c1ca3-f083-49b3-936d-84311a33c5a3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xctd8\" (UID: \"8f0c1ca3-f083-49b3-936d-84311a33c5a3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xctd8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:27.999449 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9ed0c781-82fd-4490-adcb-0c03fff3a7c8-auth-proxy-config\") pod \"machine-approver-56656f9798-x89kc\" (UID: \"9ed0c781-82fd-4490-adcb-0c03fff3a7c8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x89kc" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:27.999461 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-serving-cert\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:27.999499 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcac2427-8311-481a-85f6-4c9b96d3bbe2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7xzpm\" (UID: \"bcac2427-8311-481a-85f6-4c9b96d3bbe2\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7xzpm" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:27.999526 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8f0c1ca3-f083-49b3-936d-84311a33c5a3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xctd8\" (UID: \"8f0c1ca3-f083-49b3-936d-84311a33c5a3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xctd8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:27.999551 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9ed0c781-82fd-4490-adcb-0c03fff3a7c8-machine-approver-tls\") pod \"machine-approver-56656f9798-x89kc\" (UID: \"9ed0c781-82fd-4490-adcb-0c03fff3a7c8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x89kc" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:27.999570 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-audit\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:27.999595 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb977a38-ef3b-4820-a364-ad16d6c857d5-console-oauth-config\") pod \"console-f9d7485db-b6q56\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:27.999614 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/98fa14b7-a351-47ab-bcd2-83e2a81f9859-audit-policies\") pod \"apiserver-7bbb656c7d-wsqp8\" (UID: \"98fa14b7-a351-47ab-bcd2-83e2a81f9859\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:27.999635 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93aad091-5fd5-4eb5-a123-b7932dc268fe-service-ca-bundle\") pod \"router-default-5444994796-mdwcw\" (UID: \"93aad091-5fd5-4eb5-a123-b7932dc268fe\") " pod="openshift-ingress/router-default-5444994796-mdwcw" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:27.999667 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bfadb73-bbc1-41db-8374-9c3aaf00682d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9h9hz\" (UID: \"0bfadb73-bbc1-41db-8374-9c3aaf00682d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9h9hz" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:27.999686 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f086907d-7c4a-488b-a783-bc24e827e5e6-config\") pod \"etcd-operator-b45778765-w9zwj\" (UID: \"f086907d-7c4a-488b-a783-bc24e827e5e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w9zwj" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 
05:43:27.999717 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlx2z\" (UniqueName: \"kubernetes.io/projected/9ce83b79-8e6f-4188-a019-30399c8367f7-kube-api-access-tlx2z\") pod \"console-operator-58897d9998-rg6mp\" (UID: \"9ce83b79-8e6f-4188-a019-30399c8367f7\") " pod="openshift-console-operator/console-operator-58897d9998-rg6mp" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:27.999791 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f086907d-7c4a-488b-a783-bc24e827e5e6-etcd-ca\") pod \"etcd-operator-b45778765-w9zwj\" (UID: \"f086907d-7c4a-488b-a783-bc24e827e5e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w9zwj" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:27.999814 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9baf15f-27e7-442f-98d8-fdb29719ac71-serving-cert\") pod \"service-ca-operator-777779d784-47d6m\" (UID: \"e9baf15f-27e7-442f-98d8-fdb29719ac71\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-47d6m" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:27.999837 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62f81c40-00a5-41d6-978a-a12cd3878495-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-zkzbp\" (UID: \"62f81c40-00a5-41d6-978a-a12cd3878495\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zkzbp" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:27.999859 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f0c1ca3-f083-49b3-936d-84311a33c5a3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xctd8\" (UID: \"8f0c1ca3-f083-49b3-936d-84311a33c5a3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xctd8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:27.999880 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hvbg\" (UniqueName: \"kubernetes.io/projected/14ec6d5d-4697-48f0-be34-aa3c028fc8a4-kube-api-access-5hvbg\") pod \"openshift-config-operator-7777fb866f-lmr9n\" (UID: \"14ec6d5d-4697-48f0-be34-aa3c028fc8a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lmr9n" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:27.999896 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/882d387b-cc67-4a31-b25a-8b4197bd42f2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-m4qrq\" (UID: \"882d387b-cc67-4a31-b25a-8b4197bd42f2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4qrq" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:27.999915 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj5pb\" (UniqueName: \"kubernetes.io/projected/de12eb0a-0d0a-455c-a275-a3bfdb6f9d72-kube-api-access-gj5pb\") pod \"migrator-59844c95c7-p5m42\" (UID: \"de12eb0a-0d0a-455c-a275-a3bfdb6f9d72\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-p5m42" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:27.999935 4930 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-etcd-serving-ca\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:27.999951 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb977a38-ef3b-4820-a364-ad16d6c857d5-service-ca\") pod \"console-f9d7485db-b6q56\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:27.999966 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f086907d-7c4a-488b-a783-bc24e827e5e6-serving-cert\") pod \"etcd-operator-b45778765-w9zwj\" (UID: \"f086907d-7c4a-488b-a783-bc24e827e5e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w9zwj" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:27.999987 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79650663-cd46-4fef-a731-dfc45f8fa945-serving-cert\") pod \"controller-manager-879f6c89f-t48rv\" (UID: \"79650663-cd46-4fef-a731-dfc45f8fa945\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.000001 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4e4971a-01da-4555-b3a5-2ab4a9118163-config\") pod \"kube-apiserver-operator-766d6c64bb-rl2fl\" (UID: \"e4e4971a-01da-4555-b3a5-2ab4a9118163\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rl2fl" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.000021 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4srb\" (UniqueName: \"kubernetes.io/projected/9ed0c781-82fd-4490-adcb-0c03fff3a7c8-kube-api-access-t4srb\") pod \"machine-approver-56656f9798-x89kc\" (UID: \"9ed0c781-82fd-4490-adcb-0c03fff3a7c8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x89kc" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.000039 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/882d387b-cc67-4a31-b25a-8b4197bd42f2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-m4qrq\" (UID: \"882d387b-cc67-4a31-b25a-8b4197bd42f2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4qrq" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.000054 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4e4971a-01da-4555-b3a5-2ab4a9118163-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-rl2fl\" (UID: \"e4e4971a-01da-4555-b3a5-2ab4a9118163\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rl2fl" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.000073 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mhct\" (UniqueName: \"kubernetes.io/projected/cd39a673-35f1-435a-bd8d-02b253c12f1c-kube-api-access-8mhct\") 
pod \"route-controller-manager-6576b87f9c-cqctf\" (UID: \"cd39a673-35f1-435a-bd8d-02b253c12f1c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.000090 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj8fr\" (UniqueName: \"kubernetes.io/projected/bcac2427-8311-481a-85f6-4c9b96d3bbe2-kube-api-access-fj8fr\") pod \"openshift-controller-manager-operator-756b6f6bc6-7xzpm\" (UID: \"bcac2427-8311-481a-85f6-4c9b96d3bbe2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7xzpm" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.000106 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f086907d-7c4a-488b-a783-bc24e827e5e6-etcd-service-ca\") pod \"etcd-operator-b45778765-w9zwj\" (UID: \"f086907d-7c4a-488b-a783-bc24e827e5e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w9zwj" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.000126 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdxhp\" (UniqueName: \"kubernetes.io/projected/62f81c40-00a5-41d6-978a-a12cd3878495-kube-api-access-tdxhp\") pod \"authentication-operator-69f744f599-zkzbp\" (UID: \"62f81c40-00a5-41d6-978a-a12cd3878495\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zkzbp" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.000987 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ce83b79-8e6f-4188-a019-30399c8367f7-config\") pod \"console-operator-58897d9998-rg6mp\" (UID: \"9ce83b79-8e6f-4188-a019-30399c8367f7\") " pod="openshift-console-operator/console-operator-58897d9998-rg6mp" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.001570 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-image-import-ca\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.001770 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-audit-dir\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.002455 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62f81c40-00a5-41d6-978a-a12cd3878495-config\") pod \"authentication-operator-69f744f599-zkzbp\" (UID: \"62f81c40-00a5-41d6-978a-a12cd3878495\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zkzbp" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.002479 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/79650663-cd46-4fef-a731-dfc45f8fa945-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-t48rv\" (UID: \"79650663-cd46-4fef-a731-dfc45f8fa945\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" Oct 
12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.002768 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/14ec6d5d-4697-48f0-be34-aa3c028fc8a4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lmr9n\" (UID: \"14ec6d5d-4697-48f0-be34-aa3c028fc8a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lmr9n" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.003125 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79650663-cd46-4fef-a731-dfc45f8fa945-config\") pod \"controller-manager-879f6c89f-t48rv\" (UID: \"79650663-cd46-4fef-a731-dfc45f8fa945\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.003224 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd39a673-35f1-435a-bd8d-02b253c12f1c-config\") pod \"route-controller-manager-6576b87f9c-cqctf\" (UID: \"cd39a673-35f1-435a-bd8d-02b253c12f1c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.003708 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79650663-cd46-4fef-a731-dfc45f8fa945-client-ca\") pod \"controller-manager-879f6c89f-t48rv\" (UID: \"79650663-cd46-4fef-a731-dfc45f8fa945\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.004077 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/882d387b-cc67-4a31-b25a-8b4197bd42f2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-m4qrq\" (UID: \"882d387b-cc67-4a31-b25a-8b4197bd42f2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4qrq" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.004640 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ce83b79-8e6f-4188-a019-30399c8367f7-trusted-ca\") pod \"console-operator-58897d9998-rg6mp\" (UID: \"9ce83b79-8e6f-4188-a019-30399c8367f7\") " pod="openshift-console-operator/console-operator-58897d9998-rg6mp" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.004687 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62f81c40-00a5-41d6-978a-a12cd3878495-service-ca-bundle\") pod \"authentication-operator-69f744f599-zkzbp\" (UID: \"62f81c40-00a5-41d6-978a-a12cd3878495\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zkzbp" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.005140 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-audit\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.005714 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-etcd-serving-ca\") pod 
\"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.005824 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62f81c40-00a5-41d6-978a-a12cd3878495-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-zkzbp\" (UID: \"62f81c40-00a5-41d6-978a-a12cd3878495\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zkzbp" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.006387 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-node-pullsecrets\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.007212 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-encryption-config\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.007480 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9ed0c781-82fd-4490-adcb-0c03fff3a7c8-machine-approver-tls\") pod \"machine-approver-56656f9798-x89kc\" (UID: \"9ed0c781-82fd-4490-adcb-0c03fff3a7c8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x89kc" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.007589 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62f81c40-00a5-41d6-978a-a12cd3878495-serving-cert\") pod \"authentication-operator-69f744f599-zkzbp\" (UID: \"62f81c40-00a5-41d6-978a-a12cd3878495\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zkzbp" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.008045 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-etcd-client\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.008091 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f0c1ca3-f083-49b3-936d-84311a33c5a3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xctd8\" (UID: \"8f0c1ca3-f083-49b3-936d-84311a33c5a3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xctd8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.008445 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-serving-cert\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.008712 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cd39a673-35f1-435a-bd8d-02b253c12f1c-serving-cert\") pod \"route-controller-manager-6576b87f9c-cqctf\" (UID: \"cd39a673-35f1-435a-bd8d-02b253c12f1c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.009042 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8f0c1ca3-f083-49b3-936d-84311a33c5a3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xctd8\" (UID: \"8f0c1ca3-f083-49b3-936d-84311a33c5a3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xctd8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.009135 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/882d387b-cc67-4a31-b25a-8b4197bd42f2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-m4qrq\" (UID: \"882d387b-cc67-4a31-b25a-8b4197bd42f2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4qrq" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.009442 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79650663-cd46-4fef-a731-dfc45f8fa945-serving-cert\") pod \"controller-manager-879f6c89f-t48rv\" (UID: \"79650663-cd46-4fef-a731-dfc45f8fa945\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.009567 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ce83b79-8e6f-4188-a019-30399c8367f7-serving-cert\") pod \"console-operator-58897d9998-rg6mp\" (UID: \"9ce83b79-8e6f-4188-a019-30399c8367f7\") " pod="openshift-console-operator/console-operator-58897d9998-rg6mp" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.010934 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14ec6d5d-4697-48f0-be34-aa3c028fc8a4-serving-cert\") pod \"openshift-config-operator-7777fb866f-lmr9n\" (UID: \"14ec6d5d-4697-48f0-be34-aa3c028fc8a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lmr9n" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.016745 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.037103 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.057245 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.076876 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.098001 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101077 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcrg2\" (UniqueName: 
\"kubernetes.io/projected/93aad091-5fd5-4eb5-a123-b7932dc268fe-kube-api-access-qcrg2\") pod \"router-default-5444994796-mdwcw\" (UID: \"93aad091-5fd5-4eb5-a123-b7932dc268fe\") " pod="openshift-ingress/router-default-5444994796-mdwcw" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101118 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb977a38-ef3b-4820-a364-ad16d6c857d5-console-config\") pod \"console-f9d7485db-b6q56\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101145 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb977a38-ef3b-4820-a364-ad16d6c857d5-oauth-serving-cert\") pod \"console-f9d7485db-b6q56\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101168 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8ns4\" (UniqueName: \"kubernetes.io/projected/eb977a38-ef3b-4820-a364-ad16d6c857d5-kube-api-access-l8ns4\") pod \"console-f9d7485db-b6q56\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101188 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/69e0e716-3ae8-482b-bb9e-57f6f5d859cd-metrics-tls\") pod \"ingress-operator-5b745b69d9-fmt5q\" (UID: \"69e0e716-3ae8-482b-bb9e-57f6f5d859cd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fmt5q" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101223 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9baf15f-27e7-442f-98d8-fdb29719ac71-config\") pod \"service-ca-operator-777779d784-47d6m\" (UID: \"e9baf15f-27e7-442f-98d8-fdb29719ac71\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-47d6m" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101261 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/98fa14b7-a351-47ab-bcd2-83e2a81f9859-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wsqp8\" (UID: \"98fa14b7-a351-47ab-bcd2-83e2a81f9859\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101285 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e4e4971a-01da-4555-b3a5-2ab4a9118163-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-rl2fl\" (UID: \"e4e4971a-01da-4555-b3a5-2ab4a9118163\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rl2fl" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101314 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb977a38-ef3b-4820-a364-ad16d6c857d5-console-serving-cert\") pod \"console-f9d7485db-b6q56\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101338 4930 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb977a38-ef3b-4820-a364-ad16d6c857d5-trusted-ca-bundle\") pod \"console-f9d7485db-b6q56\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101368 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54c34701-7d60-4396-9a64-81b91379fbe9-config\") pod \"machine-api-operator-5694c8668f-zxqcl\" (UID: \"54c34701-7d60-4396-9a64-81b91379fbe9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zxqcl" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101390 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bfadb73-bbc1-41db-8374-9c3aaf00682d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9h9hz\" (UID: \"0bfadb73-bbc1-41db-8374-9c3aaf00682d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9h9hz" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101419 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02cd3769-c022-4645-b33e-eb1303133aea-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ncmk8\" (UID: \"02cd3769-c022-4645-b33e-eb1303133aea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ncmk8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101441 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69e0e716-3ae8-482b-bb9e-57f6f5d859cd-trusted-ca\") pod \"ingress-operator-5b745b69d9-fmt5q\" (UID: \"69e0e716-3ae8-482b-bb9e-57f6f5d859cd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fmt5q" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101464 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74488d89-3c1c-4e78-9c26-35be09ac8cde-config\") pod \"kube-controller-manager-operator-78b949d7b-dprns\" (UID: \"74488d89-3c1c-4e78-9c26-35be09ac8cde\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dprns" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101491 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/54c34701-7d60-4396-9a64-81b91379fbe9-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-zxqcl\" (UID: \"54c34701-7d60-4396-9a64-81b91379fbe9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zxqcl" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101513 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02cd3769-c022-4645-b33e-eb1303133aea-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ncmk8\" (UID: \"02cd3769-c022-4645-b33e-eb1303133aea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ncmk8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101536 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-wxlks\" (UniqueName: \"kubernetes.io/projected/69e0e716-3ae8-482b-bb9e-57f6f5d859cd-kube-api-access-wxlks\") pod \"ingress-operator-5b745b69d9-fmt5q\" (UID: \"69e0e716-3ae8-482b-bb9e-57f6f5d859cd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fmt5q" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101552 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74488d89-3c1c-4e78-9c26-35be09ac8cde-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-dprns\" (UID: \"74488d89-3c1c-4e78-9c26-35be09ac8cde\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dprns" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101578 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcac2427-8311-481a-85f6-4c9b96d3bbe2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7xzpm\" (UID: \"bcac2427-8311-481a-85f6-4c9b96d3bbe2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7xzpm" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101595 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93aad091-5fd5-4eb5-a123-b7932dc268fe-service-ca-bundle\") pod \"router-default-5444994796-mdwcw\" (UID: \"93aad091-5fd5-4eb5-a123-b7932dc268fe\") " pod="openshift-ingress/router-default-5444994796-mdwcw" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101617 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb977a38-ef3b-4820-a364-ad16d6c857d5-console-oauth-config\") pod \"console-f9d7485db-b6q56\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101633 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/98fa14b7-a351-47ab-bcd2-83e2a81f9859-audit-policies\") pod \"apiserver-7bbb656c7d-wsqp8\" (UID: \"98fa14b7-a351-47ab-bcd2-83e2a81f9859\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101649 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bfadb73-bbc1-41db-8374-9c3aaf00682d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9h9hz\" (UID: \"0bfadb73-bbc1-41db-8374-9c3aaf00682d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9h9hz" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101667 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f086907d-7c4a-488b-a783-bc24e827e5e6-config\") pod \"etcd-operator-b45778765-w9zwj\" (UID: \"f086907d-7c4a-488b-a783-bc24e827e5e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w9zwj" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101700 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f086907d-7c4a-488b-a783-bc24e827e5e6-etcd-ca\") pod \"etcd-operator-b45778765-w9zwj\" (UID: 
\"f086907d-7c4a-488b-a783-bc24e827e5e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w9zwj" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101714 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9baf15f-27e7-442f-98d8-fdb29719ac71-serving-cert\") pod \"service-ca-operator-777779d784-47d6m\" (UID: \"e9baf15f-27e7-442f-98d8-fdb29719ac71\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-47d6m" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101748 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj5pb\" (UniqueName: \"kubernetes.io/projected/de12eb0a-0d0a-455c-a275-a3bfdb6f9d72-kube-api-access-gj5pb\") pod \"migrator-59844c95c7-p5m42\" (UID: \"de12eb0a-0d0a-455c-a275-a3bfdb6f9d72\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-p5m42" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101780 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb977a38-ef3b-4820-a364-ad16d6c857d5-service-ca\") pod \"console-f9d7485db-b6q56\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101802 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f086907d-7c4a-488b-a783-bc24e827e5e6-serving-cert\") pod \"etcd-operator-b45778765-w9zwj\" (UID: \"f086907d-7c4a-488b-a783-bc24e827e5e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w9zwj" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101825 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4e4971a-01da-4555-b3a5-2ab4a9118163-config\") pod \"kube-apiserver-operator-766d6c64bb-rl2fl\" (UID: \"e4e4971a-01da-4555-b3a5-2ab4a9118163\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rl2fl" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101851 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4e4971a-01da-4555-b3a5-2ab4a9118163-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-rl2fl\" (UID: \"e4e4971a-01da-4555-b3a5-2ab4a9118163\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rl2fl" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101906 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj8fr\" (UniqueName: \"kubernetes.io/projected/bcac2427-8311-481a-85f6-4c9b96d3bbe2-kube-api-access-fj8fr\") pod \"openshift-controller-manager-operator-756b6f6bc6-7xzpm\" (UID: \"bcac2427-8311-481a-85f6-4c9b96d3bbe2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7xzpm" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101932 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f086907d-7c4a-488b-a783-bc24e827e5e6-etcd-service-ca\") pod \"etcd-operator-b45778765-w9zwj\" (UID: \"f086907d-7c4a-488b-a783-bc24e827e5e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w9zwj" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101956 4930 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjhcj\" (UniqueName: \"kubernetes.io/projected/805f5605-a80f-40be-aaa2-5cedbc94960f-kube-api-access-kjhcj\") pod \"cluster-samples-operator-665b6dd947-9m6tj\" (UID: \"805f5605-a80f-40be-aaa2-5cedbc94960f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9m6tj" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.101978 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f086907d-7c4a-488b-a783-bc24e827e5e6-etcd-client\") pod \"etcd-operator-b45778765-w9zwj\" (UID: \"f086907d-7c4a-488b-a783-bc24e827e5e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w9zwj" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.102011 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcac2427-8311-481a-85f6-4c9b96d3bbe2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7xzpm\" (UID: \"bcac2427-8311-481a-85f6-4c9b96d3bbe2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7xzpm" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.102031 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69e0e716-3ae8-482b-bb9e-57f6f5d859cd-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fmt5q\" (UID: \"69e0e716-3ae8-482b-bb9e-57f6f5d859cd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fmt5q" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.102053 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/93aad091-5fd5-4eb5-a123-b7932dc268fe-stats-auth\") pod \"router-default-5444994796-mdwcw\" (UID: \"93aad091-5fd5-4eb5-a123-b7932dc268fe\") " pod="openshift-ingress/router-default-5444994796-mdwcw" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.102084 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/93aad091-5fd5-4eb5-a123-b7932dc268fe-default-certificate\") pod \"router-default-5444994796-mdwcw\" (UID: \"93aad091-5fd5-4eb5-a123-b7932dc268fe\") " pod="openshift-ingress/router-default-5444994796-mdwcw" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.102129 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74488d89-3c1c-4e78-9c26-35be09ac8cde-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-dprns\" (UID: \"74488d89-3c1c-4e78-9c26-35be09ac8cde\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dprns" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.102155 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/805f5605-a80f-40be-aaa2-5cedbc94960f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9m6tj\" (UID: \"805f5605-a80f-40be-aaa2-5cedbc94960f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9m6tj" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.102176 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/98fa14b7-a351-47ab-bcd2-83e2a81f9859-serving-cert\") pod \"apiserver-7bbb656c7d-wsqp8\" (UID: \"98fa14b7-a351-47ab-bcd2-83e2a81f9859\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.102199 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98fa14b7-a351-47ab-bcd2-83e2a81f9859-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wsqp8\" (UID: \"98fa14b7-a351-47ab-bcd2-83e2a81f9859\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.102222 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93aad091-5fd5-4eb5-a123-b7932dc268fe-metrics-certs\") pod \"router-default-5444994796-mdwcw\" (UID: \"93aad091-5fd5-4eb5-a123-b7932dc268fe\") " pod="openshift-ingress/router-default-5444994796-mdwcw" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.102244 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/98fa14b7-a351-47ab-bcd2-83e2a81f9859-audit-dir\") pod \"apiserver-7bbb656c7d-wsqp8\" (UID: \"98fa14b7-a351-47ab-bcd2-83e2a81f9859\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.102266 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/54c34701-7d60-4396-9a64-81b91379fbe9-images\") pod \"machine-api-operator-5694c8668f-zxqcl\" (UID: \"54c34701-7d60-4396-9a64-81b91379fbe9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zxqcl" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.102288 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr4mn\" (UniqueName: \"kubernetes.io/projected/02cd3769-c022-4645-b33e-eb1303133aea-kube-api-access-gr4mn\") pod \"kube-storage-version-migrator-operator-b67b599dd-ncmk8\" (UID: \"02cd3769-c022-4645-b33e-eb1303133aea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ncmk8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.102352 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsxnw\" (UniqueName: \"kubernetes.io/projected/e9baf15f-27e7-442f-98d8-fdb29719ac71-kube-api-access-wsxnw\") pod \"service-ca-operator-777779d784-47d6m\" (UID: \"e9baf15f-27e7-442f-98d8-fdb29719ac71\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-47d6m" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.102379 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hnkb\" (UniqueName: \"kubernetes.io/projected/54c34701-7d60-4396-9a64-81b91379fbe9-kube-api-access-6hnkb\") pod \"machine-api-operator-5694c8668f-zxqcl\" (UID: \"54c34701-7d60-4396-9a64-81b91379fbe9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zxqcl" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.102405 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/98fa14b7-a351-47ab-bcd2-83e2a81f9859-etcd-client\") pod \"apiserver-7bbb656c7d-wsqp8\" (UID: \"98fa14b7-a351-47ab-bcd2-83e2a81f9859\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.102444 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/98fa14b7-a351-47ab-bcd2-83e2a81f9859-encryption-config\") pod \"apiserver-7bbb656c7d-wsqp8\" (UID: \"98fa14b7-a351-47ab-bcd2-83e2a81f9859\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.102468 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjnd4\" (UniqueName: \"kubernetes.io/projected/98fa14b7-a351-47ab-bcd2-83e2a81f9859-kube-api-access-kjnd4\") pod \"apiserver-7bbb656c7d-wsqp8\" (UID: \"98fa14b7-a351-47ab-bcd2-83e2a81f9859\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.102496 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bfadb73-bbc1-41db-8374-9c3aaf00682d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9h9hz\" (UID: \"0bfadb73-bbc1-41db-8374-9c3aaf00682d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9h9hz" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.102520 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-969tm\" (UniqueName: \"kubernetes.io/projected/f086907d-7c4a-488b-a783-bc24e827e5e6-kube-api-access-969tm\") pod \"etcd-operator-b45778765-w9zwj\" (UID: \"f086907d-7c4a-488b-a783-bc24e827e5e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w9zwj" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.103251 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/98fa14b7-a351-47ab-bcd2-83e2a81f9859-audit-dir\") pod \"apiserver-7bbb656c7d-wsqp8\" (UID: \"98fa14b7-a351-47ab-bcd2-83e2a81f9859\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.104119 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54c34701-7d60-4396-9a64-81b91379fbe9-config\") pod \"machine-api-operator-5694c8668f-zxqcl\" (UID: \"54c34701-7d60-4396-9a64-81b91379fbe9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zxqcl" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.104313 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/98fa14b7-a351-47ab-bcd2-83e2a81f9859-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wsqp8\" (UID: \"98fa14b7-a351-47ab-bcd2-83e2a81f9859\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.104573 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb977a38-ef3b-4820-a364-ad16d6c857d5-oauth-serving-cert\") pod \"console-f9d7485db-b6q56\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.104623 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/54c34701-7d60-4396-9a64-81b91379fbe9-images\") pod 
\"machine-api-operator-5694c8668f-zxqcl\" (UID: \"54c34701-7d60-4396-9a64-81b91379fbe9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zxqcl" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.104786 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb977a38-ef3b-4820-a364-ad16d6c857d5-trusted-ca-bundle\") pod \"console-f9d7485db-b6q56\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.104868 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb977a38-ef3b-4820-a364-ad16d6c857d5-console-config\") pod \"console-f9d7485db-b6q56\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.105341 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcac2427-8311-481a-85f6-4c9b96d3bbe2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7xzpm\" (UID: \"bcac2427-8311-481a-85f6-4c9b96d3bbe2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7xzpm" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.105405 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98fa14b7-a351-47ab-bcd2-83e2a81f9859-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wsqp8\" (UID: \"98fa14b7-a351-47ab-bcd2-83e2a81f9859\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.105505 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/98fa14b7-a351-47ab-bcd2-83e2a81f9859-audit-policies\") pod \"apiserver-7bbb656c7d-wsqp8\" (UID: \"98fa14b7-a351-47ab-bcd2-83e2a81f9859\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.106894 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcac2427-8311-481a-85f6-4c9b96d3bbe2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7xzpm\" (UID: \"bcac2427-8311-481a-85f6-4c9b96d3bbe2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7xzpm" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.107973 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/98fa14b7-a351-47ab-bcd2-83e2a81f9859-etcd-client\") pod \"apiserver-7bbb656c7d-wsqp8\" (UID: \"98fa14b7-a351-47ab-bcd2-83e2a81f9859\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.108186 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4e4971a-01da-4555-b3a5-2ab4a9118163-config\") pod \"kube-apiserver-operator-766d6c64bb-rl2fl\" (UID: \"e4e4971a-01da-4555-b3a5-2ab4a9118163\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rl2fl" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.108234 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb977a38-ef3b-4820-a364-ad16d6c857d5-service-ca\") pod \"console-f9d7485db-b6q56\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.108400 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4e4971a-01da-4555-b3a5-2ab4a9118163-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-rl2fl\" (UID: \"e4e4971a-01da-4555-b3a5-2ab4a9118163\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rl2fl" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.109008 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb977a38-ef3b-4820-a364-ad16d6c857d5-console-serving-cert\") pod \"console-f9d7485db-b6q56\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.113949 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98fa14b7-a351-47ab-bcd2-83e2a81f9859-serving-cert\") pod \"apiserver-7bbb656c7d-wsqp8\" (UID: \"98fa14b7-a351-47ab-bcd2-83e2a81f9859\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.114139 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bfadb73-bbc1-41db-8374-9c3aaf00682d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9h9hz\" (UID: \"0bfadb73-bbc1-41db-8374-9c3aaf00682d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9h9hz" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.114198 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/805f5605-a80f-40be-aaa2-5cedbc94960f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9m6tj\" (UID: \"805f5605-a80f-40be-aaa2-5cedbc94960f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9m6tj" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.114946 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/54c34701-7d60-4396-9a64-81b91379fbe9-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-zxqcl\" (UID: \"54c34701-7d60-4396-9a64-81b91379fbe9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zxqcl" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.117395 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/98fa14b7-a351-47ab-bcd2-83e2a81f9859-encryption-config\") pod \"apiserver-7bbb656c7d-wsqp8\" (UID: \"98fa14b7-a351-47ab-bcd2-83e2a81f9859\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.118161 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.123714 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/eb977a38-ef3b-4820-a364-ad16d6c857d5-console-oauth-config\") pod \"console-f9d7485db-b6q56\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.126768 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bfadb73-bbc1-41db-8374-9c3aaf00682d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9h9hz\" (UID: \"0bfadb73-bbc1-41db-8374-9c3aaf00682d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9h9hz" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.137245 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.157048 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.177870 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.237155 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.237156 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.237474 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.246326 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93aad091-5fd5-4eb5-a123-b7932dc268fe-metrics-certs\") pod \"router-default-5444994796-mdwcw\" (UID: \"93aad091-5fd5-4eb5-a123-b7932dc268fe\") " pod="openshift-ingress/router-default-5444994796-mdwcw" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.249009 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/93aad091-5fd5-4eb5-a123-b7932dc268fe-default-certificate\") pod \"router-default-5444994796-mdwcw\" (UID: \"93aad091-5fd5-4eb5-a123-b7932dc268fe\") " pod="openshift-ingress/router-default-5444994796-mdwcw" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.251228 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/93aad091-5fd5-4eb5-a123-b7932dc268fe-stats-auth\") pod \"router-default-5444994796-mdwcw\" (UID: \"93aad091-5fd5-4eb5-a123-b7932dc268fe\") " pod="openshift-ingress/router-default-5444994796-mdwcw" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.257791 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.264101 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93aad091-5fd5-4eb5-a123-b7932dc268fe-service-ca-bundle\") pod \"router-default-5444994796-mdwcw\" (UID: \"93aad091-5fd5-4eb5-a123-b7932dc268fe\") " pod="openshift-ingress/router-default-5444994796-mdwcw" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.277402 4930 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.317284 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.337272 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.356705 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.361063 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/69e0e716-3ae8-482b-bb9e-57f6f5d859cd-metrics-tls\") pod \"ingress-operator-5b745b69d9-fmt5q\" (UID: \"69e0e716-3ae8-482b-bb9e-57f6f5d859cd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fmt5q" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.377655 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.408337 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.413712 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69e0e716-3ae8-482b-bb9e-57f6f5d859cd-trusted-ca\") pod \"ingress-operator-5b745b69d9-fmt5q\" (UID: \"69e0e716-3ae8-482b-bb9e-57f6f5d859cd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fmt5q" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.417172 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.438031 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.459238 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.476652 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.486425 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74488d89-3c1c-4e78-9c26-35be09ac8cde-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-dprns\" (UID: \"74488d89-3c1c-4e78-9c26-35be09ac8cde\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dprns" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.498109 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.503459 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74488d89-3c1c-4e78-9c26-35be09ac8cde-config\") pod 
\"kube-controller-manager-operator-78b949d7b-dprns\" (UID: \"74488d89-3c1c-4e78-9c26-35be09ac8cde\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dprns" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.521821 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.527809 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f086907d-7c4a-488b-a783-bc24e827e5e6-config\") pod \"etcd-operator-b45778765-w9zwj\" (UID: \"f086907d-7c4a-488b-a783-bc24e827e5e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w9zwj" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.537270 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.557727 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.572348 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f086907d-7c4a-488b-a783-bc24e827e5e6-serving-cert\") pod \"etcd-operator-b45778765-w9zwj\" (UID: \"f086907d-7c4a-488b-a783-bc24e827e5e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w9zwj" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.577848 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.597053 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f086907d-7c4a-488b-a783-bc24e827e5e6-etcd-client\") pod \"etcd-operator-b45778765-w9zwj\" (UID: \"f086907d-7c4a-488b-a783-bc24e827e5e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w9zwj" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.598262 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.607602 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f086907d-7c4a-488b-a783-bc24e827e5e6-etcd-ca\") pod \"etcd-operator-b45778765-w9zwj\" (UID: \"f086907d-7c4a-488b-a783-bc24e827e5e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w9zwj" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.621404 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.637585 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.657820 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.677322 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.697633 4930 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.717547 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.737806 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.756926 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.777167 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.797836 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.803817 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f086907d-7c4a-488b-a783-bc24e827e5e6-etcd-service-ca\") pod \"etcd-operator-b45778765-w9zwj\" (UID: \"f086907d-7c4a-488b-a783-bc24e827e5e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w9zwj" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.818101 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.837395 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.857640 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.867413 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02cd3769-c022-4645-b33e-eb1303133aea-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ncmk8\" (UID: \"02cd3769-c022-4645-b33e-eb1303133aea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ncmk8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.877505 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.882934 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02cd3769-c022-4645-b33e-eb1303133aea-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ncmk8\" (UID: \"02cd3769-c022-4645-b33e-eb1303133aea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ncmk8" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.898112 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.916085 4930 request.go:700] Waited for 1.001777549s due to client-side throttling, not priority and fairness, 
request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca-operator/secrets?fieldSelector=metadata.name%3Dserving-cert&limit=500&resourceVersion=0 Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.918198 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.931114 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9baf15f-27e7-442f-98d8-fdb29719ac71-serving-cert\") pod \"service-ca-operator-777779d784-47d6m\" (UID: \"e9baf15f-27e7-442f-98d8-fdb29719ac71\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-47d6m" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.937203 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.956978 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.977092 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.983092 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9baf15f-27e7-442f-98d8-fdb29719ac71-config\") pod \"service-ca-operator-777779d784-47d6m\" (UID: \"e9baf15f-27e7-442f-98d8-fdb29719ac71\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-47d6m" Oct 12 05:43:28 crc kubenswrapper[4930]: I1012 05:43:28.998785 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.039152 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.058575 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.078642 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.098347 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.117585 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.138232 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.157954 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.179707 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 12 05:43:29 crc 
kubenswrapper[4930]: I1012 05:43:29.196922 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.217371 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.237230 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.256692 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.277021 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.297432 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.319045 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.337781 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.357704 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.377630 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.397654 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.418475 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.437941 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.458173 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.478923 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.499101 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.518319 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.550041 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.557857 4930 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.578350 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.598319 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.617645 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.638321 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.657992 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.678635 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.697062 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.718410 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.738792 4930 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.757980 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.777868 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.797604 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.817469 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.880801 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t49ts\" (UniqueName: \"kubernetes.io/projected/882d387b-cc67-4a31-b25a-8b4197bd42f2-kube-api-access-t49ts\") pod \"openshift-apiserver-operator-796bbdcf4f-m4qrq\" (UID: \"882d387b-cc67-4a31-b25a-8b4197bd42f2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4qrq" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.892189 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gb6w\" (UniqueName: \"kubernetes.io/projected/8f0c1ca3-f083-49b3-936d-84311a33c5a3-kube-api-access-9gb6w\") pod \"cluster-image-registry-operator-dc59b4c8b-xctd8\" (UID: \"8f0c1ca3-f083-49b3-936d-84311a33c5a3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xctd8" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.909966 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdxhp\" (UniqueName: \"kubernetes.io/projected/62f81c40-00a5-41d6-978a-a12cd3878495-kube-api-access-tdxhp\") pod 
\"authentication-operator-69f744f599-zkzbp\" (UID: \"62f81c40-00a5-41d6-978a-a12cd3878495\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zkzbp" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.927049 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hvbg\" (UniqueName: \"kubernetes.io/projected/14ec6d5d-4697-48f0-be34-aa3c028fc8a4-kube-api-access-5hvbg\") pod \"openshift-config-operator-7777fb866f-lmr9n\" (UID: \"14ec6d5d-4697-48f0-be34-aa3c028fc8a4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lmr9n" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.936324 4930 request.go:700] Waited for 1.932585436s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver/serviceaccounts/openshift-apiserver-sa/token Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.946147 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4srb\" (UniqueName: \"kubernetes.io/projected/9ed0c781-82fd-4490-adcb-0c03fff3a7c8-kube-api-access-t4srb\") pod \"machine-approver-56656f9798-x89kc\" (UID: \"9ed0c781-82fd-4490-adcb-0c03fff3a7c8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x89kc" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.959286 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th5xb\" (UniqueName: \"kubernetes.io/projected/cf1f38ec-f38b-47e0-9b80-e66740ebdaba-kube-api-access-th5xb\") pod \"apiserver-76f77b778f-6gz2l\" (UID: \"cf1f38ec-f38b-47e0-9b80-e66740ebdaba\") " pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.983904 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mhct\" (UniqueName: \"kubernetes.io/projected/cd39a673-35f1-435a-bd8d-02b253c12f1c-kube-api-access-8mhct\") pod \"route-controller-manager-6576b87f9c-cqctf\" (UID: \"cd39a673-35f1-435a-bd8d-02b253c12f1c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf" Oct 12 05:43:29 crc kubenswrapper[4930]: I1012 05:43:29.995592 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.006152 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk2rs\" (UniqueName: \"kubernetes.io/projected/6c179d5f-d7dd-42b9-b248-cd3c34237961-kube-api-access-gk2rs\") pod \"downloads-7954f5f757-ztn69\" (UID: \"6c179d5f-d7dd-42b9-b248-cd3c34237961\") " pod="openshift-console/downloads-7954f5f757-ztn69" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.012455 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-zkzbp" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.016026 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q99cr\" (UniqueName: \"kubernetes.io/projected/79650663-cd46-4fef-a731-dfc45f8fa945-kube-api-access-q99cr\") pod \"controller-manager-879f6c89f-t48rv\" (UID: \"79650663-cd46-4fef-a731-dfc45f8fa945\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.021831 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.031838 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4qrq" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.048274 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lmr9n" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.050143 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlx2z\" (UniqueName: \"kubernetes.io/projected/9ce83b79-8e6f-4188-a019-30399c8367f7-kube-api-access-tlx2z\") pod \"console-operator-58897d9998-rg6mp\" (UID: \"9ce83b79-8e6f-4188-a019-30399c8367f7\") " pod="openshift-console-operator/console-operator-58897d9998-rg6mp" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.056554 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f0c1ca3-f083-49b3-936d-84311a33c5a3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xctd8\" (UID: \"8f0c1ca3-f083-49b3-936d-84311a33c5a3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xctd8" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.057330 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-ztn69" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.067333 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xctd8" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.072507 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-rg6mp" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.073977 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxlks\" (UniqueName: \"kubernetes.io/projected/69e0e716-3ae8-482b-bb9e-57f6f5d859cd-kube-api-access-wxlks\") pod \"ingress-operator-5b745b69d9-fmt5q\" (UID: \"69e0e716-3ae8-482b-bb9e-57f6f5d859cd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fmt5q" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.103968 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e4e4971a-01da-4555-b3a5-2ab4a9118163-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-rl2fl\" (UID: \"e4e4971a-01da-4555-b3a5-2ab4a9118163\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rl2fl" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.122944 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcrg2\" (UniqueName: \"kubernetes.io/projected/93aad091-5fd5-4eb5-a123-b7932dc268fe-kube-api-access-qcrg2\") pod \"router-default-5444994796-mdwcw\" (UID: \"93aad091-5fd5-4eb5-a123-b7932dc268fe\") " pod="openshift-ingress/router-default-5444994796-mdwcw" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.149262 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rl2fl" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.154492 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69e0e716-3ae8-482b-bb9e-57f6f5d859cd-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fmt5q\" (UID: \"69e0e716-3ae8-482b-bb9e-57f6f5d859cd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fmt5q" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.158845 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj8fr\" (UniqueName: \"kubernetes.io/projected/bcac2427-8311-481a-85f6-4c9b96d3bbe2-kube-api-access-fj8fr\") pod \"openshift-controller-manager-operator-756b6f6bc6-7xzpm\" (UID: \"bcac2427-8311-481a-85f6-4c9b96d3bbe2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7xzpm" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.169924 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-mdwcw" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.174198 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fmt5q" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.175441 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bfadb73-bbc1-41db-8374-9c3aaf00682d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9h9hz\" (UID: \"0bfadb73-bbc1-41db-8374-9c3aaf00682d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9h9hz" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.180135 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x89kc" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.196527 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjhcj\" (UniqueName: \"kubernetes.io/projected/805f5605-a80f-40be-aaa2-5cedbc94960f-kube-api-access-kjhcj\") pod \"cluster-samples-operator-665b6dd947-9m6tj\" (UID: \"805f5605-a80f-40be-aaa2-5cedbc94960f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9m6tj" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.217271 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsxnw\" (UniqueName: \"kubernetes.io/projected/e9baf15f-27e7-442f-98d8-fdb29719ac71-kube-api-access-wsxnw\") pod \"service-ca-operator-777779d784-47d6m\" (UID: \"e9baf15f-27e7-442f-98d8-fdb29719ac71\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-47d6m" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.231279 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr4mn\" (UniqueName: \"kubernetes.io/projected/02cd3769-c022-4645-b33e-eb1303133aea-kube-api-access-gr4mn\") pod \"kube-storage-version-migrator-operator-b67b599dd-ncmk8\" (UID: \"02cd3769-c022-4645-b33e-eb1303133aea\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ncmk8" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.239241 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.243013 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ncmk8" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.250429 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-47d6m" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.265351 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-969tm\" (UniqueName: \"kubernetes.io/projected/f086907d-7c4a-488b-a783-bc24e827e5e6-kube-api-access-969tm\") pod \"etcd-operator-b45778765-w9zwj\" (UID: \"f086907d-7c4a-488b-a783-bc24e827e5e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w9zwj" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.279983 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hnkb\" (UniqueName: \"kubernetes.io/projected/54c34701-7d60-4396-9a64-81b91379fbe9-kube-api-access-6hnkb\") pod \"machine-api-operator-5694c8668f-zxqcl\" (UID: \"54c34701-7d60-4396-9a64-81b91379fbe9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zxqcl" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.295317 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjnd4\" (UniqueName: \"kubernetes.io/projected/98fa14b7-a351-47ab-bcd2-83e2a81f9859-kube-api-access-kjnd4\") pod \"apiserver-7bbb656c7d-wsqp8\" (UID: \"98fa14b7-a351-47ab-bcd2-83e2a81f9859\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.322759 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8ns4\" (UniqueName: \"kubernetes.io/projected/eb977a38-ef3b-4820-a364-ad16d6c857d5-kube-api-access-l8ns4\") pod \"console-f9d7485db-b6q56\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.351067 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74488d89-3c1c-4e78-9c26-35be09ac8cde-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-dprns\" (UID: \"74488d89-3c1c-4e78-9c26-35be09ac8cde\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dprns" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.357421 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj5pb\" (UniqueName: \"kubernetes.io/projected/de12eb0a-0d0a-455c-a275-a3bfdb6f9d72-kube-api-access-gj5pb\") pod \"migrator-59844c95c7-p5m42\" (UID: \"de12eb0a-0d0a-455c-a275-a3bfdb6f9d72\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-p5m42" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.384078 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7xzpm" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.390652 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7sw2\" (UniqueName: \"kubernetes.io/projected/51d2d15f-f2ac-4939-b11d-f41e5891323d-kube-api-access-s7sw2\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.390705 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.390754 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51d2d15f-f2ac-4939-b11d-f41e5891323d-registry-tls\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.390776 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51d2d15f-f2ac-4939-b11d-f41e5891323d-registry-certificates\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.390825 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51d2d15f-f2ac-4939-b11d-f41e5891323d-trusted-ca\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.390845 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51d2d15f-f2ac-4939-b11d-f41e5891323d-bound-sa-token\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.390876 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.390899 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef886351-d776-447e-ac43-1c16568cac4f-audit-policies\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: 
\"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.390929 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51d2d15f-f2ac-4939-b11d-f41e5891323d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.390965 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51d2d15f-f2ac-4939-b11d-f41e5891323d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.390992 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef886351-d776-447e-ac43-1c16568cac4f-audit-dir\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.391015 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.391054 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.391082 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.391163 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: E1012 05:43:30.391961 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-12 05:43:30.891941161 +0000 UTC m=+143.434042926 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.400574 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.400971 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9m6tj" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.436775 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-zxqcl" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.440611 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.460374 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9h9hz" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.480892 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-w9zwj" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.487605 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dprns" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492301 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492421 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492458 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51d2d15f-f2ac-4939-b11d-f41e5891323d-trusted-ca\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492478 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef886351-d776-447e-ac43-1c16568cac4f-audit-dir\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492494 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492515 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492538 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492567 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492592 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7sw2\" (UniqueName: \"kubernetes.io/projected/51d2d15f-f2ac-4939-b11d-f41e5891323d-kube-api-access-s7sw2\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492610 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4af4dbea-ada1-465b-a6d9-24843bf3808d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-6r5ch\" (UID: \"4af4dbea-ada1-465b-a6d9-24843bf3808d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6r5ch" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492630 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492646 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51d2d15f-f2ac-4939-b11d-f41e5891323d-registry-tls\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492660 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51d2d15f-f2ac-4939-b11d-f41e5891323d-registry-certificates\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492677 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29898\" (UniqueName: \"kubernetes.io/projected/42753d84-7f68-4c88-85e1-a06eacd8052d-kube-api-access-29898\") pod \"dns-operator-744455d44c-hbgd8\" (UID: \"42753d84-7f68-4c88-85e1-a06eacd8052d\") " pod="openshift-dns-operator/dns-operator-744455d44c-hbgd8" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492697 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51d2d15f-f2ac-4939-b11d-f41e5891323d-bound-sa-token\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492714 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mds2t\" (UniqueName: \"kubernetes.io/projected/4af4dbea-ada1-465b-a6d9-24843bf3808d-kube-api-access-mds2t\") pod \"control-plane-machine-set-operator-78cbb6b69f-6r5ch\" (UID: \"4af4dbea-ada1-465b-a6d9-24843bf3808d\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6r5ch" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492748 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef886351-d776-447e-ac43-1c16568cac4f-audit-policies\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492776 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51d2d15f-f2ac-4939-b11d-f41e5891323d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492800 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51d2d15f-f2ac-4939-b11d-f41e5891323d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492822 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492847 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492869 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/42753d84-7f68-4c88-85e1-a06eacd8052d-metrics-tls\") pod \"dns-operator-744455d44c-hbgd8\" (UID: \"42753d84-7f68-4c88-85e1-a06eacd8052d\") " pod="openshift-dns-operator/dns-operator-744455d44c-hbgd8" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492891 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492911 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492962 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htwmq\" (UniqueName: \"kubernetes.io/projected/ef886351-d776-447e-ac43-1c16568cac4f-kube-api-access-htwmq\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.492980 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.493217 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef886351-d776-447e-ac43-1c16568cac4f-audit-dir\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.493721 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: E1012 05:43:30.494196 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:30.994177775 +0000 UTC m=+143.536279530 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.494561 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51d2d15f-f2ac-4939-b11d-f41e5891323d-trusted-ca\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.494683 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51d2d15f-f2ac-4939-b11d-f41e5891323d-registry-certificates\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.495323 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-p5m42" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.496537 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef886351-d776-447e-ac43-1c16568cac4f-audit-policies\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.496720 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.497038 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51d2d15f-f2ac-4939-b11d-f41e5891323d-registry-tls\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.497271 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51d2d15f-f2ac-4939-b11d-f41e5891323d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.498206 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51d2d15f-f2ac-4939-b11d-f41e5891323d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:30 crc 
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.501445 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.504539 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.538964 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51d2d15f-f2ac-4939-b11d-f41e5891323d-bound-sa-token\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.554121 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7sw2\" (UniqueName: \"kubernetes.io/projected/51d2d15f-f2ac-4939-b11d-f41e5891323d-kube-api-access-s7sw2\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.566483 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-zkzbp"]
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.578296 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ztn69"]
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.588320 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6gz2l"]
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.595012 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8c9e0c4-c47b-4bbc-ab98-ce7fa42d8e0c-proxy-tls\") pod \"machine-config-controller-84d6567774-d6mxh\" (UID: \"a8c9e0c4-c47b-4bbc-ab98-ce7fa42d8e0c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d6mxh"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.595127 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nbf2\" (UniqueName: \"kubernetes.io/projected/a51e2785-1c3d-4354-a071-dadb05075c68-kube-api-access-6nbf2\") pod \"marketplace-operator-79b997595-bs29k\" (UID: \"a51e2785-1c3d-4354-a071-dadb05075c68\") " pod="openshift-marketplace/marketplace-operator-79b997595-bs29k"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.595265 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596067 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65ldr\" (UniqueName: \"kubernetes.io/projected/fc56c928-9a28-4b6a-9652-49bf164d7104-kube-api-access-65ldr\") pod \"machine-config-server-92ngq\" (UID: \"fc56c928-9a28-4b6a-9652-49bf164d7104\") " pod="openshift-machine-config-operator/machine-config-server-92ngq"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596105 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/db2e99ac-f9a3-41cf-8a72-b05db4880b78-apiservice-cert\") pod \"packageserver-d55dfcdfc-psfdl\" (UID: \"db2e99ac-f9a3-41cf-8a72-b05db4880b78\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-psfdl"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596122 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/131bb70c-fac8-4696-ad60-1acc99df28c2-signing-key\") pod \"service-ca-9c57cc56f-phltz\" (UID: \"131bb70c-fac8-4696-ad60-1acc99df28c2\") " pod="openshift-service-ca/service-ca-9c57cc56f-phltz"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596140 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c393fc2e-599b-4842-9c6c-b46cef38f7e6-registration-dir\") pod \"csi-hostpathplugin-25pm7\" (UID: \"c393fc2e-599b-4842-9c6c-b46cef38f7e6\") " pod="hostpath-provisioner/csi-hostpathplugin-25pm7"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596172 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/42753d84-7f68-4c88-85e1-a06eacd8052d-metrics-tls\") pod \"dns-operator-744455d44c-hbgd8\" (UID: \"42753d84-7f68-4c88-85e1-a06eacd8052d\") " pod="openshift-dns-operator/dns-operator-744455d44c-hbgd8"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596187 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c393fc2e-599b-4842-9c6c-b46cef38f7e6-csi-data-dir\") pod \"csi-hostpathplugin-25pm7\" (UID: \"c393fc2e-599b-4842-9c6c-b46cef38f7e6\") " pod="hostpath-provisioner/csi-hostpathplugin-25pm7"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596203 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a51e2785-1c3d-4354-a071-dadb05075c68-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bs29k\" (UID: \"a51e2785-1c3d-4354-a071-dadb05075c68\") " pod="openshift-marketplace/marketplace-operator-79b997595-bs29k"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596221 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/17d6a32b-5d07-4158-853a-eefab6440720-proxy-tls\") pod \"machine-config-operator-74547568cd-lk89q\" (UID: \"17d6a32b-5d07-4158-853a-eefab6440720\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lk89q"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596269 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/17d6a32b-5d07-4158-853a-eefab6440720-images\") pod \"machine-config-operator-74547568cd-lk89q\" (UID: \"17d6a32b-5d07-4158-853a-eefab6440720\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lk89q"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596285 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cmps\" (UniqueName: \"kubernetes.io/projected/bb64ba69-2433-4d68-b7bc-c5c4f62d5f2f-kube-api-access-9cmps\") pod \"ingress-canary-5v7hl\" (UID: \"bb64ba69-2433-4d68-b7bc-c5c4f62d5f2f\") " pod="openshift-ingress-canary/ingress-canary-5v7hl"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596303 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596374 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fc56c928-9a28-4b6a-9652-49bf164d7104-node-bootstrap-token\") pod \"machine-config-server-92ngq\" (UID: \"fc56c928-9a28-4b6a-9652-49bf164d7104\") " pod="openshift-machine-config-operator/machine-config-server-92ngq"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596392 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c393fc2e-599b-4842-9c6c-b46cef38f7e6-socket-dir\") pod \"csi-hostpathplugin-25pm7\" (UID: \"c393fc2e-599b-4842-9c6c-b46cef38f7e6\") " pod="hostpath-provisioner/csi-hostpathplugin-25pm7"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596409 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596476 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a8c9e0c4-c47b-4bbc-ab98-ce7fa42d8e0c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-d6mxh\" (UID: \"a8c9e0c4-c47b-4bbc-ab98-ce7fa42d8e0c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d6mxh"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596500 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/220d40c1-2181-40b6-9d6b-6704fe99ec24-profile-collector-cert\") pod \"olm-operator-6b444d44fb-968tr\" (UID: \"220d40c1-2181-40b6-9d6b-6704fe99ec24\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-968tr"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596528 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74be97dd-4d16-40ed-87e4-b707eccf422e-secret-volume\") pod \"collect-profiles-29337450-9g4rr\" (UID: \"74be97dd-4d16-40ed-87e4-b707eccf422e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337450-9g4rr"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596557 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/db2e99ac-f9a3-41cf-8a72-b05db4880b78-tmpfs\") pod \"packageserver-d55dfcdfc-psfdl\" (UID: \"db2e99ac-f9a3-41cf-8a72-b05db4880b78\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-psfdl"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596594 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5333e14-992e-411c-bd42-9848ff42b021-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2lc4l\" (UID: \"c5333e14-992e-411c-bd42-9848ff42b021\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2lc4l"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596662 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htwmq\" (UniqueName: \"kubernetes.io/projected/ef886351-d776-447e-ac43-1c16568cac4f-kube-api-access-htwmq\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596709 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596775 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36224327-6c6b-4ff0-86e2-97c02039b8c6-config-volume\") pod \"dns-default-g8lk2\" (UID: \"36224327-6c6b-4ff0-86e2-97c02039b8c6\") " pod="openshift-dns/dns-default-g8lk2"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596852 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf"
Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596910 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqchn\" (UniqueName: \"kubernetes.io/projected/17d6a32b-5d07-4158-853a-eefab6440720-kube-api-access-xqchn\") pod \"machine-config-operator-74547568cd-lk89q\" (UID: \"17d6a32b-5d07-4158-853a-eefab6440720\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lk89q"
\"kubernetes.io/projected/17d6a32b-5d07-4158-853a-eefab6440720-kube-api-access-xqchn\") pod \"machine-config-operator-74547568cd-lk89q\" (UID: \"17d6a32b-5d07-4158-853a-eefab6440720\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lk89q" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596950 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkt7l\" (UniqueName: \"kubernetes.io/projected/220d40c1-2181-40b6-9d6b-6704fe99ec24-kube-api-access-xkt7l\") pod \"olm-operator-6b444d44fb-968tr\" (UID: \"220d40c1-2181-40b6-9d6b-6704fe99ec24\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-968tr" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.596972 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsch7\" (UniqueName: \"kubernetes.io/projected/db2e99ac-f9a3-41cf-8a72-b05db4880b78-kube-api-access-bsch7\") pod \"packageserver-d55dfcdfc-psfdl\" (UID: \"db2e99ac-f9a3-41cf-8a72-b05db4880b78\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-psfdl" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.597023 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfzjc\" (UniqueName: \"kubernetes.io/projected/2a81e403-a7dc-405a-bab2-8a9f28fa22fa-kube-api-access-cfzjc\") pod \"catalog-operator-68c6474976-wdpcs\" (UID: \"2a81e403-a7dc-405a-bab2-8a9f28fa22fa\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wdpcs" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.600119 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.600544 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.600629 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.600655 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c393fc2e-599b-4842-9c6c-b46cef38f7e6-mountpoint-dir\") pod \"csi-hostpathplugin-25pm7\" (UID: \"c393fc2e-599b-4842-9c6c-b46cef38f7e6\") " pod="hostpath-provisioner/csi-hostpathplugin-25pm7" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.600722 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnhtc\" 
(UniqueName: \"kubernetes.io/projected/7051fcad-4fe8-4c5c-8694-2a6e63b541d0-kube-api-access-rnhtc\") pod \"multus-admission-controller-857f4d67dd-vd458\" (UID: \"7051fcad-4fe8-4c5c-8694-2a6e63b541d0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vd458" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.601128 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74be97dd-4d16-40ed-87e4-b707eccf422e-config-volume\") pod \"collect-profiles-29337450-9g4rr\" (UID: \"74be97dd-4d16-40ed-87e4-b707eccf422e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337450-9g4rr" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.601201 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/fc56c928-9a28-4b6a-9652-49bf164d7104-certs\") pod \"machine-config-server-92ngq\" (UID: \"fc56c928-9a28-4b6a-9652-49bf164d7104\") " pod="openshift-machine-config-operator/machine-config-server-92ngq" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.601234 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/131bb70c-fac8-4696-ad60-1acc99df28c2-signing-cabundle\") pod \"service-ca-9c57cc56f-phltz\" (UID: \"131bb70c-fac8-4696-ad60-1acc99df28c2\") " pod="openshift-service-ca/service-ca-9c57cc56f-phltz" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.601488 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/220d40c1-2181-40b6-9d6b-6704fe99ec24-srv-cert\") pod \"olm-operator-6b444d44fb-968tr\" (UID: \"220d40c1-2181-40b6-9d6b-6704fe99ec24\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-968tr" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.601862 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7051fcad-4fe8-4c5c-8694-2a6e63b541d0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vd458\" (UID: \"7051fcad-4fe8-4c5c-8694-2a6e63b541d0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vd458" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.601885 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/36224327-6c6b-4ff0-86e2-97c02039b8c6-metrics-tls\") pod \"dns-default-g8lk2\" (UID: \"36224327-6c6b-4ff0-86e2-97c02039b8c6\") " pod="openshift-dns/dns-default-g8lk2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.601940 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a51e2785-1c3d-4354-a071-dadb05075c68-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bs29k\" (UID: \"a51e2785-1c3d-4354-a071-dadb05075c68\") " pod="openshift-marketplace/marketplace-operator-79b997595-bs29k" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.601958 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c393fc2e-599b-4842-9c6c-b46cef38f7e6-plugins-dir\") pod \"csi-hostpathplugin-25pm7\" (UID: 
\"c393fc2e-599b-4842-9c6c-b46cef38f7e6\") " pod="hostpath-provisioner/csi-hostpathplugin-25pm7" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.602022 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/17d6a32b-5d07-4158-853a-eefab6440720-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lk89q\" (UID: \"17d6a32b-5d07-4158-853a-eefab6440720\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lk89q" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.602863 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/db2e99ac-f9a3-41cf-8a72-b05db4880b78-webhook-cert\") pod \"packageserver-d55dfcdfc-psfdl\" (UID: \"db2e99ac-f9a3-41cf-8a72-b05db4880b78\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-psfdl" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.602974 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2a81e403-a7dc-405a-bab2-8a9f28fa22fa-srv-cert\") pod \"catalog-operator-68c6474976-wdpcs\" (UID: \"2a81e403-a7dc-405a-bab2-8a9f28fa22fa\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wdpcs" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.603023 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4af4dbea-ada1-465b-a6d9-24843bf3808d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-6r5ch\" (UID: \"4af4dbea-ada1-465b-a6d9-24843bf3808d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6r5ch" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.603339 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54wvt\" (UniqueName: \"kubernetes.io/projected/c393fc2e-599b-4842-9c6c-b46cef38f7e6-kube-api-access-54wvt\") pod \"csi-hostpathplugin-25pm7\" (UID: \"c393fc2e-599b-4842-9c6c-b46cef38f7e6\") " pod="hostpath-provisioner/csi-hostpathplugin-25pm7" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.603403 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2a81e403-a7dc-405a-bab2-8a9f28fa22fa-profile-collector-cert\") pod \"catalog-operator-68c6474976-wdpcs\" (UID: \"2a81e403-a7dc-405a-bab2-8a9f28fa22fa\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wdpcs" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.603426 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rglkf\" (UniqueName: \"kubernetes.io/projected/131bb70c-fac8-4696-ad60-1acc99df28c2-kube-api-access-rglkf\") pod \"service-ca-9c57cc56f-phltz\" (UID: \"131bb70c-fac8-4696-ad60-1acc99df28c2\") " pod="openshift-service-ca/service-ca-9c57cc56f-phltz" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.603470 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29898\" (UniqueName: \"kubernetes.io/projected/42753d84-7f68-4c88-85e1-a06eacd8052d-kube-api-access-29898\") pod \"dns-operator-744455d44c-hbgd8\" (UID: 
\"42753d84-7f68-4c88-85e1-a06eacd8052d\") " pod="openshift-dns-operator/dns-operator-744455d44c-hbgd8" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.603946 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrzxq\" (UniqueName: \"kubernetes.io/projected/36224327-6c6b-4ff0-86e2-97c02039b8c6-kube-api-access-vrzxq\") pod \"dns-default-g8lk2\" (UID: \"36224327-6c6b-4ff0-86e2-97c02039b8c6\") " pod="openshift-dns/dns-default-g8lk2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.604846 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9nz6\" (UniqueName: \"kubernetes.io/projected/74be97dd-4d16-40ed-87e4-b707eccf422e-kube-api-access-j9nz6\") pod \"collect-profiles-29337450-9g4rr\" (UID: \"74be97dd-4d16-40ed-87e4-b707eccf422e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337450-9g4rr" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.605027 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mds2t\" (UniqueName: \"kubernetes.io/projected/4af4dbea-ada1-465b-a6d9-24843bf3808d-kube-api-access-mds2t\") pod \"control-plane-machine-set-operator-78cbb6b69f-6r5ch\" (UID: \"4af4dbea-ada1-465b-a6d9-24843bf3808d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6r5ch" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.605051 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb64ba69-2433-4d68-b7bc-c5c4f62d5f2f-cert\") pod \"ingress-canary-5v7hl\" (UID: \"bb64ba69-2433-4d68-b7bc-c5c4f62d5f2f\") " pod="openshift-ingress-canary/ingress-canary-5v7hl" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.605129 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.605152 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szc8g\" (UniqueName: \"kubernetes.io/projected/a8c9e0c4-c47b-4bbc-ab98-ce7fa42d8e0c-kube-api-access-szc8g\") pod \"machine-config-controller-84d6567774-d6mxh\" (UID: \"a8c9e0c4-c47b-4bbc-ab98-ce7fa42d8e0c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d6mxh" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.605380 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zsgj\" (UniqueName: \"kubernetes.io/projected/c5333e14-992e-411c-bd42-9848ff42b021-kube-api-access-9zsgj\") pod \"package-server-manager-789f6589d5-2lc4l\" (UID: \"c5333e14-992e-411c-bd42-9848ff42b021\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2lc4l" Oct 12 05:43:30 crc kubenswrapper[4930]: E1012 05:43:30.606693 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-12 05:43:31.106680328 +0000 UTC m=+143.648782083 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.619532 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.620870 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.621792 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/42753d84-7f68-4c88-85e1-a06eacd8052d-metrics-tls\") pod \"dns-operator-744455d44c-hbgd8\" (UID: \"42753d84-7f68-4c88-85e1-a06eacd8052d\") " pod="openshift-dns-operator/dns-operator-744455d44c-hbgd8" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.623138 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.624472 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.625052 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lmr9n"] Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.625764 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4af4dbea-ada1-465b-a6d9-24843bf3808d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-6r5ch\" (UID: \"4af4dbea-ada1-465b-a6d9-24843bf3808d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6r5ch" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.663504 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-htwmq\" (UniqueName: \"kubernetes.io/projected/ef886351-d776-447e-ac43-1c16568cac4f-kube-api-access-htwmq\") pod \"oauth-openshift-558db77b4-gbqmf\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.676573 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rl2fl"] Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.682000 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29898\" (UniqueName: \"kubernetes.io/projected/42753d84-7f68-4c88-85e1-a06eacd8052d-kube-api-access-29898\") pod \"dns-operator-744455d44c-hbgd8\" (UID: \"42753d84-7f68-4c88-85e1-a06eacd8052d\") " pod="openshift-dns-operator/dns-operator-744455d44c-hbgd8" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.690428 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf"] Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.693061 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rg6mp"] Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.705585 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.706054 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.706243 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nbf2\" (UniqueName: \"kubernetes.io/projected/a51e2785-1c3d-4354-a071-dadb05075c68-kube-api-access-6nbf2\") pod \"marketplace-operator-79b997595-bs29k\" (UID: \"a51e2785-1c3d-4354-a071-dadb05075c68\") " pod="openshift-marketplace/marketplace-operator-79b997595-bs29k" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.706275 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65ldr\" (UniqueName: \"kubernetes.io/projected/fc56c928-9a28-4b6a-9652-49bf164d7104-kube-api-access-65ldr\") pod \"machine-config-server-92ngq\" (UID: \"fc56c928-9a28-4b6a-9652-49bf164d7104\") " pod="openshift-machine-config-operator/machine-config-server-92ngq" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.706292 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/db2e99ac-f9a3-41cf-8a72-b05db4880b78-apiservice-cert\") pod \"packageserver-d55dfcdfc-psfdl\" (UID: \"db2e99ac-f9a3-41cf-8a72-b05db4880b78\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-psfdl" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.706310 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/131bb70c-fac8-4696-ad60-1acc99df28c2-signing-key\") pod \"service-ca-9c57cc56f-phltz\" (UID: \"131bb70c-fac8-4696-ad60-1acc99df28c2\") " pod="openshift-service-ca/service-ca-9c57cc56f-phltz" Oct 12 05:43:30 
crc kubenswrapper[4930]: I1012 05:43:30.706326 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c393fc2e-599b-4842-9c6c-b46cef38f7e6-registration-dir\") pod \"csi-hostpathplugin-25pm7\" (UID: \"c393fc2e-599b-4842-9c6c-b46cef38f7e6\") " pod="hostpath-provisioner/csi-hostpathplugin-25pm7" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.706339 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c393fc2e-599b-4842-9c6c-b46cef38f7e6-csi-data-dir\") pod \"csi-hostpathplugin-25pm7\" (UID: \"c393fc2e-599b-4842-9c6c-b46cef38f7e6\") " pod="hostpath-provisioner/csi-hostpathplugin-25pm7" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.706355 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a51e2785-1c3d-4354-a071-dadb05075c68-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bs29k\" (UID: \"a51e2785-1c3d-4354-a071-dadb05075c68\") " pod="openshift-marketplace/marketplace-operator-79b997595-bs29k" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.706371 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/17d6a32b-5d07-4158-853a-eefab6440720-proxy-tls\") pod \"machine-config-operator-74547568cd-lk89q\" (UID: \"17d6a32b-5d07-4158-853a-eefab6440720\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lk89q" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.706389 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/17d6a32b-5d07-4158-853a-eefab6440720-images\") pod \"machine-config-operator-74547568cd-lk89q\" (UID: \"17d6a32b-5d07-4158-853a-eefab6440720\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lk89q" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.706405 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cmps\" (UniqueName: \"kubernetes.io/projected/bb64ba69-2433-4d68-b7bc-c5c4f62d5f2f-kube-api-access-9cmps\") pod \"ingress-canary-5v7hl\" (UID: \"bb64ba69-2433-4d68-b7bc-c5c4f62d5f2f\") " pod="openshift-ingress-canary/ingress-canary-5v7hl" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.706424 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fc56c928-9a28-4b6a-9652-49bf164d7104-node-bootstrap-token\") pod \"machine-config-server-92ngq\" (UID: \"fc56c928-9a28-4b6a-9652-49bf164d7104\") " pod="openshift-machine-config-operator/machine-config-server-92ngq" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.706439 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c393fc2e-599b-4842-9c6c-b46cef38f7e6-socket-dir\") pod \"csi-hostpathplugin-25pm7\" (UID: \"c393fc2e-599b-4842-9c6c-b46cef38f7e6\") " pod="hostpath-provisioner/csi-hostpathplugin-25pm7" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.706466 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a8c9e0c4-c47b-4bbc-ab98-ce7fa42d8e0c-mcc-auth-proxy-config\") pod 
\"machine-config-controller-84d6567774-d6mxh\" (UID: \"a8c9e0c4-c47b-4bbc-ab98-ce7fa42d8e0c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d6mxh" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.706483 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/220d40c1-2181-40b6-9d6b-6704fe99ec24-profile-collector-cert\") pod \"olm-operator-6b444d44fb-968tr\" (UID: \"220d40c1-2181-40b6-9d6b-6704fe99ec24\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-968tr" Oct 12 05:43:30 crc kubenswrapper[4930]: E1012 05:43:30.706527 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:31.206506004 +0000 UTC m=+143.748607769 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.706570 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/db2e99ac-f9a3-41cf-8a72-b05db4880b78-tmpfs\") pod \"packageserver-d55dfcdfc-psfdl\" (UID: \"db2e99ac-f9a3-41cf-8a72-b05db4880b78\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-psfdl" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.706606 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74be97dd-4d16-40ed-87e4-b707eccf422e-secret-volume\") pod \"collect-profiles-29337450-9g4rr\" (UID: \"74be97dd-4d16-40ed-87e4-b707eccf422e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337450-9g4rr" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.707437 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c393fc2e-599b-4842-9c6c-b46cef38f7e6-registration-dir\") pod \"csi-hostpathplugin-25pm7\" (UID: \"c393fc2e-599b-4842-9c6c-b46cef38f7e6\") " pod="hostpath-provisioner/csi-hostpathplugin-25pm7" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710132 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5333e14-992e-411c-bd42-9848ff42b021-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2lc4l\" (UID: \"c5333e14-992e-411c-bd42-9848ff42b021\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2lc4l" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710190 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36224327-6c6b-4ff0-86e2-97c02039b8c6-config-volume\") pod \"dns-default-g8lk2\" (UID: \"36224327-6c6b-4ff0-86e2-97c02039b8c6\") " pod="openshift-dns/dns-default-g8lk2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 
05:43:30.710315 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqchn\" (UniqueName: \"kubernetes.io/projected/17d6a32b-5d07-4158-853a-eefab6440720-kube-api-access-xqchn\") pod \"machine-config-operator-74547568cd-lk89q\" (UID: \"17d6a32b-5d07-4158-853a-eefab6440720\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lk89q" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710340 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkt7l\" (UniqueName: \"kubernetes.io/projected/220d40c1-2181-40b6-9d6b-6704fe99ec24-kube-api-access-xkt7l\") pod \"olm-operator-6b444d44fb-968tr\" (UID: \"220d40c1-2181-40b6-9d6b-6704fe99ec24\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-968tr" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710356 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsch7\" (UniqueName: \"kubernetes.io/projected/db2e99ac-f9a3-41cf-8a72-b05db4880b78-kube-api-access-bsch7\") pod \"packageserver-d55dfcdfc-psfdl\" (UID: \"db2e99ac-f9a3-41cf-8a72-b05db4880b78\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-psfdl" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710376 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfzjc\" (UniqueName: \"kubernetes.io/projected/2a81e403-a7dc-405a-bab2-8a9f28fa22fa-kube-api-access-cfzjc\") pod \"catalog-operator-68c6474976-wdpcs\" (UID: \"2a81e403-a7dc-405a-bab2-8a9f28fa22fa\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wdpcs" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710396 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c393fc2e-599b-4842-9c6c-b46cef38f7e6-mountpoint-dir\") pod \"csi-hostpathplugin-25pm7\" (UID: \"c393fc2e-599b-4842-9c6c-b46cef38f7e6\") " pod="hostpath-provisioner/csi-hostpathplugin-25pm7" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710420 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnhtc\" (UniqueName: \"kubernetes.io/projected/7051fcad-4fe8-4c5c-8694-2a6e63b541d0-kube-api-access-rnhtc\") pod \"multus-admission-controller-857f4d67dd-vd458\" (UID: \"7051fcad-4fe8-4c5c-8694-2a6e63b541d0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vd458" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710446 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74be97dd-4d16-40ed-87e4-b707eccf422e-config-volume\") pod \"collect-profiles-29337450-9g4rr\" (UID: \"74be97dd-4d16-40ed-87e4-b707eccf422e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337450-9g4rr" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710467 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/fc56c928-9a28-4b6a-9652-49bf164d7104-certs\") pod \"machine-config-server-92ngq\" (UID: \"fc56c928-9a28-4b6a-9652-49bf164d7104\") " pod="openshift-machine-config-operator/machine-config-server-92ngq" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710486 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/131bb70c-fac8-4696-ad60-1acc99df28c2-signing-cabundle\") pod \"service-ca-9c57cc56f-phltz\" (UID: \"131bb70c-fac8-4696-ad60-1acc99df28c2\") " pod="openshift-service-ca/service-ca-9c57cc56f-phltz" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710511 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/220d40c1-2181-40b6-9d6b-6704fe99ec24-srv-cert\") pod \"olm-operator-6b444d44fb-968tr\" (UID: \"220d40c1-2181-40b6-9d6b-6704fe99ec24\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-968tr" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710537 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7051fcad-4fe8-4c5c-8694-2a6e63b541d0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vd458\" (UID: \"7051fcad-4fe8-4c5c-8694-2a6e63b541d0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vd458" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710551 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/36224327-6c6b-4ff0-86e2-97c02039b8c6-metrics-tls\") pod \"dns-default-g8lk2\" (UID: \"36224327-6c6b-4ff0-86e2-97c02039b8c6\") " pod="openshift-dns/dns-default-g8lk2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710567 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a51e2785-1c3d-4354-a071-dadb05075c68-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bs29k\" (UID: \"a51e2785-1c3d-4354-a071-dadb05075c68\") " pod="openshift-marketplace/marketplace-operator-79b997595-bs29k" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710582 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c393fc2e-599b-4842-9c6c-b46cef38f7e6-plugins-dir\") pod \"csi-hostpathplugin-25pm7\" (UID: \"c393fc2e-599b-4842-9c6c-b46cef38f7e6\") " pod="hostpath-provisioner/csi-hostpathplugin-25pm7" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710602 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/17d6a32b-5d07-4158-853a-eefab6440720-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lk89q\" (UID: \"17d6a32b-5d07-4158-853a-eefab6440720\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lk89q" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710624 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/db2e99ac-f9a3-41cf-8a72-b05db4880b78-webhook-cert\") pod \"packageserver-d55dfcdfc-psfdl\" (UID: \"db2e99ac-f9a3-41cf-8a72-b05db4880b78\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-psfdl" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710651 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2a81e403-a7dc-405a-bab2-8a9f28fa22fa-srv-cert\") pod \"catalog-operator-68c6474976-wdpcs\" (UID: \"2a81e403-a7dc-405a-bab2-8a9f28fa22fa\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wdpcs" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 
05:43:30.710677 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54wvt\" (UniqueName: \"kubernetes.io/projected/c393fc2e-599b-4842-9c6c-b46cef38f7e6-kube-api-access-54wvt\") pod \"csi-hostpathplugin-25pm7\" (UID: \"c393fc2e-599b-4842-9c6c-b46cef38f7e6\") " pod="hostpath-provisioner/csi-hostpathplugin-25pm7" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710693 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2a81e403-a7dc-405a-bab2-8a9f28fa22fa-profile-collector-cert\") pod \"catalog-operator-68c6474976-wdpcs\" (UID: \"2a81e403-a7dc-405a-bab2-8a9f28fa22fa\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wdpcs" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710708 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rglkf\" (UniqueName: \"kubernetes.io/projected/131bb70c-fac8-4696-ad60-1acc99df28c2-kube-api-access-rglkf\") pod \"service-ca-9c57cc56f-phltz\" (UID: \"131bb70c-fac8-4696-ad60-1acc99df28c2\") " pod="openshift-service-ca/service-ca-9c57cc56f-phltz" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710780 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrzxq\" (UniqueName: \"kubernetes.io/projected/36224327-6c6b-4ff0-86e2-97c02039b8c6-kube-api-access-vrzxq\") pod \"dns-default-g8lk2\" (UID: \"36224327-6c6b-4ff0-86e2-97c02039b8c6\") " pod="openshift-dns/dns-default-g8lk2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710786 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c393fc2e-599b-4842-9c6c-b46cef38f7e6-csi-data-dir\") pod \"csi-hostpathplugin-25pm7\" (UID: \"c393fc2e-599b-4842-9c6c-b46cef38f7e6\") " pod="hostpath-provisioner/csi-hostpathplugin-25pm7" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710803 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9nz6\" (UniqueName: \"kubernetes.io/projected/74be97dd-4d16-40ed-87e4-b707eccf422e-kube-api-access-j9nz6\") pod \"collect-profiles-29337450-9g4rr\" (UID: \"74be97dd-4d16-40ed-87e4-b707eccf422e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337450-9g4rr" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710862 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb64ba69-2433-4d68-b7bc-c5c4f62d5f2f-cert\") pod \"ingress-canary-5v7hl\" (UID: \"bb64ba69-2433-4d68-b7bc-c5c4f62d5f2f\") " pod="openshift-ingress-canary/ingress-canary-5v7hl" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710903 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710936 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szc8g\" (UniqueName: \"kubernetes.io/projected/a8c9e0c4-c47b-4bbc-ab98-ce7fa42d8e0c-kube-api-access-szc8g\") pod \"machine-config-controller-84d6567774-d6mxh\" (UID: 
\"a8c9e0c4-c47b-4bbc-ab98-ce7fa42d8e0c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d6mxh" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710974 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zsgj\" (UniqueName: \"kubernetes.io/projected/c5333e14-992e-411c-bd42-9848ff42b021-kube-api-access-9zsgj\") pod \"package-server-manager-789f6589d5-2lc4l\" (UID: \"c5333e14-992e-411c-bd42-9848ff42b021\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2lc4l" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.710995 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8c9e0c4-c47b-4bbc-ab98-ce7fa42d8e0c-proxy-tls\") pod \"machine-config-controller-84d6567774-d6mxh\" (UID: \"a8c9e0c4-c47b-4bbc-ab98-ce7fa42d8e0c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d6mxh" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.711231 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xctd8"] Oct 12 05:43:30 crc kubenswrapper[4930]: E1012 05:43:30.711478 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:31.211465164 +0000 UTC m=+143.753566929 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.711873 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/17d6a32b-5d07-4158-853a-eefab6440720-images\") pod \"machine-config-operator-74547568cd-lk89q\" (UID: \"17d6a32b-5d07-4158-853a-eefab6440720\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lk89q" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.712026 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c393fc2e-599b-4842-9c6c-b46cef38f7e6-socket-dir\") pod \"csi-hostpathplugin-25pm7\" (UID: \"c393fc2e-599b-4842-9c6c-b46cef38f7e6\") " pod="hostpath-provisioner/csi-hostpathplugin-25pm7" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.712105 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/131bb70c-fac8-4696-ad60-1acc99df28c2-signing-cabundle\") pod \"service-ca-9c57cc56f-phltz\" (UID: \"131bb70c-fac8-4696-ad60-1acc99df28c2\") " pod="openshift-service-ca/service-ca-9c57cc56f-phltz" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.712566 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a8c9e0c4-c47b-4bbc-ab98-ce7fa42d8e0c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-d6mxh\" (UID: 
\"a8c9e0c4-c47b-4bbc-ab98-ce7fa42d8e0c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d6mxh" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.713189 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c393fc2e-599b-4842-9c6c-b46cef38f7e6-mountpoint-dir\") pod \"csi-hostpathplugin-25pm7\" (UID: \"c393fc2e-599b-4842-9c6c-b46cef38f7e6\") " pod="hostpath-provisioner/csi-hostpathplugin-25pm7" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.713415 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c393fc2e-599b-4842-9c6c-b46cef38f7e6-plugins-dir\") pod \"csi-hostpathplugin-25pm7\" (UID: \"c393fc2e-599b-4842-9c6c-b46cef38f7e6\") " pod="hostpath-provisioner/csi-hostpathplugin-25pm7" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.715204 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/17d6a32b-5d07-4158-853a-eefab6440720-proxy-tls\") pod \"machine-config-operator-74547568cd-lk89q\" (UID: \"17d6a32b-5d07-4158-853a-eefab6440720\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lk89q" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.715835 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a51e2785-1c3d-4354-a071-dadb05075c68-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bs29k\" (UID: \"a51e2785-1c3d-4354-a071-dadb05075c68\") " pod="openshift-marketplace/marketplace-operator-79b997595-bs29k" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.717195 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74be97dd-4d16-40ed-87e4-b707eccf422e-config-volume\") pod \"collect-profiles-29337450-9g4rr\" (UID: \"74be97dd-4d16-40ed-87e4-b707eccf422e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337450-9g4rr" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.718060 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4qrq"] Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.720376 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/17d6a32b-5d07-4158-853a-eefab6440720-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lk89q\" (UID: \"17d6a32b-5d07-4158-853a-eefab6440720\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lk89q" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.720429 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36224327-6c6b-4ff0-86e2-97c02039b8c6-config-volume\") pod \"dns-default-g8lk2\" (UID: \"36224327-6c6b-4ff0-86e2-97c02039b8c6\") " pod="openshift-dns/dns-default-g8lk2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.721292 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/db2e99ac-f9a3-41cf-8a72-b05db4880b78-tmpfs\") pod \"packageserver-d55dfcdfc-psfdl\" (UID: \"db2e99ac-f9a3-41cf-8a72-b05db4880b78\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-psfdl" Oct 12 05:43:30 crc 
kubenswrapper[4930]: I1012 05:43:30.724818 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/131bb70c-fac8-4696-ad60-1acc99df28c2-signing-key\") pod \"service-ca-9c57cc56f-phltz\" (UID: \"131bb70c-fac8-4696-ad60-1acc99df28c2\") " pod="openshift-service-ca/service-ca-9c57cc56f-phltz" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.726626 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2a81e403-a7dc-405a-bab2-8a9f28fa22fa-srv-cert\") pod \"catalog-operator-68c6474976-wdpcs\" (UID: \"2a81e403-a7dc-405a-bab2-8a9f28fa22fa\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wdpcs" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.727071 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/36224327-6c6b-4ff0-86e2-97c02039b8c6-metrics-tls\") pod \"dns-default-g8lk2\" (UID: \"36224327-6c6b-4ff0-86e2-97c02039b8c6\") " pod="openshift-dns/dns-default-g8lk2" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.727324 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74be97dd-4d16-40ed-87e4-b707eccf422e-secret-volume\") pod \"collect-profiles-29337450-9g4rr\" (UID: \"74be97dd-4d16-40ed-87e4-b707eccf422e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337450-9g4rr" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.729261 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/220d40c1-2181-40b6-9d6b-6704fe99ec24-srv-cert\") pod \"olm-operator-6b444d44fb-968tr\" (UID: \"220d40c1-2181-40b6-9d6b-6704fe99ec24\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-968tr" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.736109 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/db2e99ac-f9a3-41cf-8a72-b05db4880b78-apiservice-cert\") pod \"packageserver-d55dfcdfc-psfdl\" (UID: \"db2e99ac-f9a3-41cf-8a72-b05db4880b78\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-psfdl" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.739026 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/db2e99ac-f9a3-41cf-8a72-b05db4880b78-webhook-cert\") pod \"packageserver-d55dfcdfc-psfdl\" (UID: \"db2e99ac-f9a3-41cf-8a72-b05db4880b78\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-psfdl" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.740185 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/220d40c1-2181-40b6-9d6b-6704fe99ec24-profile-collector-cert\") pod \"olm-operator-6b444d44fb-968tr\" (UID: \"220d40c1-2181-40b6-9d6b-6704fe99ec24\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-968tr" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.740288 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/fc56c928-9a28-4b6a-9652-49bf164d7104-node-bootstrap-token\") pod \"machine-config-server-92ngq\" (UID: \"fc56c928-9a28-4b6a-9652-49bf164d7104\") " 
pod="openshift-machine-config-operator/machine-config-server-92ngq" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.740683 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb64ba69-2433-4d68-b7bc-c5c4f62d5f2f-cert\") pod \"ingress-canary-5v7hl\" (UID: \"bb64ba69-2433-4d68-b7bc-c5c4f62d5f2f\") " pod="openshift-ingress-canary/ingress-canary-5v7hl" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.740829 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7051fcad-4fe8-4c5c-8694-2a6e63b541d0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vd458\" (UID: \"7051fcad-4fe8-4c5c-8694-2a6e63b541d0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vd458" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.741565 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2a81e403-a7dc-405a-bab2-8a9f28fa22fa-profile-collector-cert\") pod \"catalog-operator-68c6474976-wdpcs\" (UID: \"2a81e403-a7dc-405a-bab2-8a9f28fa22fa\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wdpcs" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.744400 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a51e2785-1c3d-4354-a071-dadb05075c68-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bs29k\" (UID: \"a51e2785-1c3d-4354-a071-dadb05075c68\") " pod="openshift-marketplace/marketplace-operator-79b997595-bs29k" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.745440 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8c9e0c4-c47b-4bbc-ab98-ce7fa42d8e0c-proxy-tls\") pod \"machine-config-controller-84d6567774-d6mxh\" (UID: \"a8c9e0c4-c47b-4bbc-ab98-ce7fa42d8e0c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d6mxh" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.745446 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5333e14-992e-411c-bd42-9848ff42b021-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2lc4l\" (UID: \"c5333e14-992e-411c-bd42-9848ff42b021\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2lc4l" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.745605 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mds2t\" (UniqueName: \"kubernetes.io/projected/4af4dbea-ada1-465b-a6d9-24843bf3808d-kube-api-access-mds2t\") pod \"control-plane-machine-set-operator-78cbb6b69f-6r5ch\" (UID: \"4af4dbea-ada1-465b-a6d9-24843bf3808d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6r5ch" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.745971 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/fc56c928-9a28-4b6a-9652-49bf164d7104-certs\") pod \"machine-config-server-92ngq\" (UID: \"fc56c928-9a28-4b6a-9652-49bf164d7104\") " pod="openshift-machine-config-operator/machine-config-server-92ngq" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.755208 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-hbgd8" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.755724 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65ldr\" (UniqueName: \"kubernetes.io/projected/fc56c928-9a28-4b6a-9652-49bf164d7104-kube-api-access-65ldr\") pod \"machine-config-server-92ngq\" (UID: \"fc56c928-9a28-4b6a-9652-49bf164d7104\") " pod="openshift-machine-config-operator/machine-config-server-92ngq" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.773888 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9nz6\" (UniqueName: \"kubernetes.io/projected/74be97dd-4d16-40ed-87e4-b707eccf422e-kube-api-access-j9nz6\") pod \"collect-profiles-29337450-9g4rr\" (UID: \"74be97dd-4d16-40ed-87e4-b707eccf422e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337450-9g4rr" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.812132 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:30 crc kubenswrapper[4930]: E1012 05:43:30.812636 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:31.312614858 +0000 UTC m=+143.854716623 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.812955 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:30 crc kubenswrapper[4930]: E1012 05:43:30.813596 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:31.313577175 +0000 UTC m=+143.855678940 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.814722 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nbf2\" (UniqueName: \"kubernetes.io/projected/a51e2785-1c3d-4354-a071-dadb05075c68-kube-api-access-6nbf2\") pod \"marketplace-operator-79b997595-bs29k\" (UID: \"a51e2785-1c3d-4354-a071-dadb05075c68\") " pod="openshift-marketplace/marketplace-operator-79b997595-bs29k" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.815535 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-t48rv"] Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.820547 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szc8g\" (UniqueName: \"kubernetes.io/projected/a8c9e0c4-c47b-4bbc-ab98-ce7fa42d8e0c-kube-api-access-szc8g\") pod \"machine-config-controller-84d6567774-d6mxh\" (UID: \"a8c9e0c4-c47b-4bbc-ab98-ce7fa42d8e0c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d6mxh" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.824959 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fmt5q"] Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.826752 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-47d6m"] Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.832004 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cmps\" (UniqueName: \"kubernetes.io/projected/bb64ba69-2433-4d68-b7bc-c5c4f62d5f2f-kube-api-access-9cmps\") pod \"ingress-canary-5v7hl\" (UID: \"bb64ba69-2433-4d68-b7bc-c5c4f62d5f2f\") " pod="openshift-ingress-canary/ingress-canary-5v7hl" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.833776 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6r5ch" Oct 12 05:43:30 crc kubenswrapper[4930]: W1012 05:43:30.843003 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd39a673_35f1_435a_bd8d_02b253c12f1c.slice/crio-1ce60dbc9d6af70d2963d52a196ea75759088f4f30b448cffcf59a7b9483a792 WatchSource:0}: Error finding container 1ce60dbc9d6af70d2963d52a196ea75759088f4f30b448cffcf59a7b9483a792: Status 404 returned error can't find the container with id 1ce60dbc9d6af70d2963d52a196ea75759088f4f30b448cffcf59a7b9483a792 Oct 12 05:43:30 crc kubenswrapper[4930]: W1012 05:43:30.848508 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod882d387b_cc67_4a31_b25a_8b4197bd42f2.slice/crio-df7e18154310a242b7349c0f8a6dee145cd8126728eeeab1b45c2ae5893afb26 WatchSource:0}: Error finding container df7e18154310a242b7349c0f8a6dee145cd8126728eeeab1b45c2ae5893afb26: Status 404 returned error can't find the container with id df7e18154310a242b7349c0f8a6dee145cd8126728eeeab1b45c2ae5893afb26 Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.862576 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsch7\" (UniqueName: \"kubernetes.io/projected/db2e99ac-f9a3-41cf-8a72-b05db4880b78-kube-api-access-bsch7\") pod \"packageserver-d55dfcdfc-psfdl\" (UID: \"db2e99ac-f9a3-41cf-8a72-b05db4880b78\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-psfdl" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.865733 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-psfdl" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.871627 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d6mxh" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.873278 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zsgj\" (UniqueName: \"kubernetes.io/projected/c5333e14-992e-411c-bd42-9848ff42b021-kube-api-access-9zsgj\") pod \"package-server-manager-789f6589d5-2lc4l\" (UID: \"c5333e14-992e-411c-bd42-9848ff42b021\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2lc4l" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.879296 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337450-9g4rr" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.899638 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfzjc\" (UniqueName: \"kubernetes.io/projected/2a81e403-a7dc-405a-bab2-8a9f28fa22fa-kube-api-access-cfzjc\") pod \"catalog-operator-68c6474976-wdpcs\" (UID: \"2a81e403-a7dc-405a-bab2-8a9f28fa22fa\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wdpcs" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.914363 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.914506 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnhtc\" (UniqueName: \"kubernetes.io/projected/7051fcad-4fe8-4c5c-8694-2a6e63b541d0-kube-api-access-rnhtc\") pod \"multus-admission-controller-857f4d67dd-vd458\" (UID: \"7051fcad-4fe8-4c5c-8694-2a6e63b541d0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vd458" Oct 12 05:43:30 crc kubenswrapper[4930]: E1012 05:43:30.915567 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:31.415551941 +0000 UTC m=+143.957653706 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.942012 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ncmk8"] Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.943021 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54wvt\" (UniqueName: \"kubernetes.io/projected/c393fc2e-599b-4842-9c6c-b46cef38f7e6-kube-api-access-54wvt\") pod \"csi-hostpathplugin-25pm7\" (UID: \"c393fc2e-599b-4842-9c6c-b46cef38f7e6\") " pod="hostpath-provisioner/csi-hostpathplugin-25pm7" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.947956 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wdpcs" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.958815 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqchn\" (UniqueName: \"kubernetes.io/projected/17d6a32b-5d07-4158-853a-eefab6440720-kube-api-access-xqchn\") pod \"machine-config-operator-74547568cd-lk89q\" (UID: \"17d6a32b-5d07-4158-853a-eefab6440720\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lk89q" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.965143 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vd458" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.973648 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-92ngq" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.979810 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5v7hl" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.983225 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-b6q56"] Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.983344 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkt7l\" (UniqueName: \"kubernetes.io/projected/220d40c1-2181-40b6-9d6b-6704fe99ec24-kube-api-access-xkt7l\") pod \"olm-operator-6b444d44fb-968tr\" (UID: \"220d40c1-2181-40b6-9d6b-6704fe99ec24\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-968tr" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.991978 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rl2fl" event={"ID":"e4e4971a-01da-4555-b3a5-2ab4a9118163","Type":"ContainerStarted","Data":"8c6cee71eb22bdfe386a5636f59395b23e7f31ca1f4d417e17c400fa46acd33d"} Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.996246 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bs29k" Oct 12 05:43:30 crc kubenswrapper[4930]: I1012 05:43:30.996844 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf" event={"ID":"cd39a673-35f1-435a-bd8d-02b253c12f1c","Type":"ContainerStarted","Data":"1ce60dbc9d6af70d2963d52a196ea75759088f4f30b448cffcf59a7b9483a792"} Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.004165 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7xzpm"] Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.005524 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" event={"ID":"cf1f38ec-f38b-47e0-9b80-e66740ebdaba","Type":"ContainerStarted","Data":"0fc64bfbdcfd6481475a055b50f21cfb6bfbe31a9ea8a5f67b657c5db16c4ba2"} Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.005966 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9m6tj"] Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.008299 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-47d6m" event={"ID":"e9baf15f-27e7-442f-98d8-fdb29719ac71","Type":"ContainerStarted","Data":"bd0d51484fb4ec82931e8ebc57000e348a3c812b7e4e2638f155a1e074e2937c"} Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.014061 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rglkf\" (UniqueName: \"kubernetes.io/projected/131bb70c-fac8-4696-ad60-1acc99df28c2-kube-api-access-rglkf\") pod \"service-ca-9c57cc56f-phltz\" (UID: \"131bb70c-fac8-4696-ad60-1acc99df28c2\") " pod="openshift-service-ca/service-ca-9c57cc56f-phltz" Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.015834 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xctd8" event={"ID":"8f0c1ca3-f083-49b3-936d-84311a33c5a3","Type":"ContainerStarted","Data":"d1d01a8244d58d4b6906ae83fada7d66785e87f5e86a6adf76ddae2f0269467f"} Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.016127 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrzxq\" (UniqueName: \"kubernetes.io/projected/36224327-6c6b-4ff0-86e2-97c02039b8c6-kube-api-access-vrzxq\") pod \"dns-default-g8lk2\" (UID: \"36224327-6c6b-4ff0-86e2-97c02039b8c6\") " pod="openshift-dns/dns-default-g8lk2" Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.016689 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:31 crc kubenswrapper[4930]: E1012 05:43:31.017204 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:31.517186178 +0000 UTC m=+144.059287943 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.018363 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-rg6mp" event={"ID":"9ce83b79-8e6f-4188-a019-30399c8367f7","Type":"ContainerStarted","Data":"395c1d6d8c32291a8e6b3b669657590b489ccbc84e9ee2013cfa617d7d70ee74"} Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.024849 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-25pm7" Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.030966 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-g8lk2" Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.040019 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lmr9n" event={"ID":"14ec6d5d-4697-48f0-be34-aa3c028fc8a4","Type":"ContainerStarted","Data":"10fbfb4fbedfd8e9b46123e9a631e8413217fbaba81811e0a8122fda4c35ef91"} Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.042975 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" event={"ID":"79650663-cd46-4fef-a731-dfc45f8fa945","Type":"ContainerStarted","Data":"9b08e544d6fac720cfae74ee3a8e55a6f2e2b819b4f01602a50cb1c4485cc0e2"} Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.044667 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ztn69" event={"ID":"6c179d5f-d7dd-42b9-b248-cd3c34237961","Type":"ContainerStarted","Data":"37e336b2e9e2d5aa5daca454a8fe74916ecb3425fa322cef7a1d832362919c4c"} Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.044724 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ztn69" event={"ID":"6c179d5f-d7dd-42b9-b248-cd3c34237961","Type":"ContainerStarted","Data":"d9ffbfc1faa01851cafc9ec80e2081147411be7dd9e23bfe15c14c08f87e7a9e"} Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.045463 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-ztn69" Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.049378 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-zkzbp" event={"ID":"62f81c40-00a5-41d6-978a-a12cd3878495","Type":"ContainerStarted","Data":"4ca0e5b9ea83fdc43dbee9f406a4623097ac51d2e899e142cf33297b36ae391e"} Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.049432 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-zkzbp" event={"ID":"62f81c40-00a5-41d6-978a-a12cd3878495","Type":"ContainerStarted","Data":"878ea24bcc75da82803306114667d166be211f11d1f714bc6abb148090dff073"} Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.051316 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-mdwcw" 
event={"ID":"93aad091-5fd5-4eb5-a123-b7932dc268fe","Type":"ContainerStarted","Data":"33d2ffa871d57e875a4f78399bfbc7a2ec7103be1ae8da3bc30c75cc04c94fc9"} Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.051348 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-mdwcw" event={"ID":"93aad091-5fd5-4eb5-a123-b7932dc268fe","Type":"ContainerStarted","Data":"43ae9659bd547264cac72e71b4f74eea377b62a908fda6b3ae1df95debbca9c5"} Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.053036 4930 patch_prober.go:28] interesting pod/downloads-7954f5f757-ztn69 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.053077 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ztn69" podUID="6c179d5f-d7dd-42b9-b248-cd3c34237961" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.097025 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x89kc" event={"ID":"9ed0c781-82fd-4490-adcb-0c03fff3a7c8","Type":"ContainerStarted","Data":"92dd4fcc54c714d822f58c9e338d75c3395b501ef5258ac4b7f443bc53399935"} Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.097066 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x89kc" event={"ID":"9ed0c781-82fd-4490-adcb-0c03fff3a7c8","Type":"ContainerStarted","Data":"ddbcffe95797110f7426985b83eeb922105bc61588a5513e35e86913198d7535"} Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.105266 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4qrq" event={"ID":"882d387b-cc67-4a31-b25a-8b4197bd42f2","Type":"ContainerStarted","Data":"df7e18154310a242b7349c0f8a6dee145cd8126728eeeab1b45c2ae5893afb26"} Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.107121 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fmt5q" event={"ID":"69e0e716-3ae8-482b-bb9e-57f6f5d859cd","Type":"ContainerStarted","Data":"56f37262cf24367e25a36ae73f6ec9451756190b70da11d271d768a0a48814d8"} Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.110632 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ncmk8" event={"ID":"02cd3769-c022-4645-b33e-eb1303133aea","Type":"ContainerStarted","Data":"a66577d7f4fcc85e10e265435f1f71d3cec902bc17c60aaf5501e191296d6f04"} Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.118066 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:31 crc kubenswrapper[4930]: E1012 05:43:31.119770 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:31.619728741 +0000 UTC m=+144.161830506 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:31 crc kubenswrapper[4930]: W1012 05:43:31.138607 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb977a38_ef3b_4820_a364_ad16d6c857d5.slice/crio-4e3ebf72ea3463a27fee00b739b2d7b986e74d3a0d2211c35ebed95c3791200a WatchSource:0}: Error finding container 4e3ebf72ea3463a27fee00b739b2d7b986e74d3a0d2211c35ebed95c3791200a: Status 404 returned error can't find the container with id 4e3ebf72ea3463a27fee00b739b2d7b986e74d3a0d2211c35ebed95c3791200a Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.156873 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2lc4l" Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.173109 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-mdwcw" Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.181705 4930 patch_prober.go:28] interesting pod/router-default-5444994796-mdwcw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 05:43:31 crc kubenswrapper[4930]: [-]has-synced failed: reason withheld Oct 12 05:43:31 crc kubenswrapper[4930]: [+]process-running ok Oct 12 05:43:31 crc kubenswrapper[4930]: healthz check failed Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.181760 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mdwcw" podUID="93aad091-5fd5-4eb5-a123-b7932dc268fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.193594 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lk89q" Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.212280 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-hbgd8"] Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.217014 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dprns"] Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.218303 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-w9zwj"] Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.221387 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8"] Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.223699 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.224073 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6r5ch"] Oct 12 05:43:31 crc kubenswrapper[4930]: E1012 05:43:31.224690 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:31.724676382 +0000 UTC m=+144.266778147 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.255916 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-968tr" Oct 12 05:43:31 crc kubenswrapper[4930]: W1012 05:43:31.282684 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4af4dbea_ada1_465b_a6d9_24843bf3808d.slice/crio-ecd334746c0f544b93d768cd1f6dea530a59d6f22ee2d59c2fce281a48203ac5 WatchSource:0}: Error finding container ecd334746c0f544b93d768cd1f6dea530a59d6f22ee2d59c2fce281a48203ac5: Status 404 returned error can't find the container with id ecd334746c0f544b93d768cd1f6dea530a59d6f22ee2d59c2fce281a48203ac5 Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.304899 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-phltz" Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.324234 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:31 crc kubenswrapper[4930]: E1012 05:43:31.324564 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:31.824517508 +0000 UTC m=+144.366619263 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.324847 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:31 crc kubenswrapper[4930]: E1012 05:43:31.326280 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:31.826262107 +0000 UTC m=+144.368363872 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.330854 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-p5m42"] Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.332946 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9h9hz"] Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.354392 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-zxqcl"] Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.376849 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gbqmf"] Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.429271 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:31 crc kubenswrapper[4930]: E1012 05:43:31.433002 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:31.932960447 +0000 UTC m=+144.475062212 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.433347 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:31 crc kubenswrapper[4930]: E1012 05:43:31.433724 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:31.933717729 +0000 UTC m=+144.475819484 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.444977 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-psfdl"] Oct 12 05:43:31 crc kubenswrapper[4930]: W1012 05:43:31.449157 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bfadb73_bbc1_41db_8374_9c3aaf00682d.slice/crio-32960b32c63abb792601df236a811e21fd225deadf43c09b4c8d4a180a4609be WatchSource:0}: Error finding container 32960b32c63abb792601df236a811e21fd225deadf43c09b4c8d4a180a4609be: Status 404 returned error can't find the container with id 32960b32c63abb792601df236a811e21fd225deadf43c09b4c8d4a180a4609be Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.534450 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-g8lk2"] Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.534506 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:31 crc kubenswrapper[4930]: E1012 05:43:31.534629 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:32.034612405 +0000 UTC m=+144.576714170 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.535392 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:31 crc kubenswrapper[4930]: E1012 05:43:31.535850 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:32.035829489 +0000 UTC m=+144.577931254 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:31 crc kubenswrapper[4930]: W1012 05:43:31.565784 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb2e99ac_f9a3_41cf_8a72_b05db4880b78.slice/crio-26997a1a0df87d4924e191e9b4e41b5aded85ae8848b60955c5e80dd9ad3749d WatchSource:0}: Error finding container 26997a1a0df87d4924e191e9b4e41b5aded85ae8848b60955c5e80dd9ad3749d: Status 404 returned error can't find the container with id 26997a1a0df87d4924e191e9b4e41b5aded85ae8848b60955c5e80dd9ad3749d Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.604172 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-d6mxh"] Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.636635 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:31 crc kubenswrapper[4930]: E1012 05:43:31.636993 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:32.136978082 +0000 UTC m=+144.679079847 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:31 crc kubenswrapper[4930]: W1012 05:43:31.712628 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8c9e0c4_c47b_4bbc_ab98_ce7fa42d8e0c.slice/crio-4bea269d5377761138b61f26fae9f2d576242473237fed381b9731f0805b8141 WatchSource:0}: Error finding container 4bea269d5377761138b61f26fae9f2d576242473237fed381b9731f0805b8141: Status 404 returned error can't find the container with id 4bea269d5377761138b61f26fae9f2d576242473237fed381b9731f0805b8141 Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.741442 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:31 crc kubenswrapper[4930]: E1012 05:43:31.742365 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:32.242349695 +0000 UTC m=+144.784451460 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.806069 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-ztn69" podStartSLOduration=122.806051112 podStartE2EDuration="2m2.806051112s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:31.775808159 +0000 UTC m=+144.317909924" watchObservedRunningTime="2025-10-12 05:43:31.806051112 +0000 UTC m=+144.348152877" Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.842948 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:31 crc kubenswrapper[4930]: E1012 05:43:31.843348 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-12 05:43:32.343335013 +0000 UTC m=+144.885436778 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.873640 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wdpcs"] Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.898482 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337450-9g4rr"] Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.945624 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:31 crc kubenswrapper[4930]: E1012 05:43:31.946061 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:32.446049461 +0000 UTC m=+144.988151216 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:31 crc kubenswrapper[4930]: I1012 05:43:31.953503 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5v7hl"] Oct 12 05:43:31 crc kubenswrapper[4930]: W1012 05:43:31.987415 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74be97dd_4d16_40ed_87e4_b707eccf422e.slice/crio-3efd8d4f1fad50e702a8d8ac66b0f7c78a206680d5761549f29cffa73304605e WatchSource:0}: Error finding container 3efd8d4f1fad50e702a8d8ac66b0f7c78a206680d5761549f29cffa73304605e: Status 404 returned error can't find the container with id 3efd8d4f1fad50e702a8d8ac66b0f7c78a206680d5761549f29cffa73304605e Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.047623 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:32 crc kubenswrapper[4930]: E1012 05:43:32.048373 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:32.548354027 +0000 UTC m=+145.090455792 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.094054 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vd458"] Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.137309 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-zkzbp" podStartSLOduration=123.137291506 podStartE2EDuration="2m3.137291506s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:32.134504837 +0000 UTC m=+144.676606602" watchObservedRunningTime="2025-10-12 05:43:32.137291506 +0000 UTC m=+144.679393271" Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.149385 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:32 crc kubenswrapper[4930]: E1012 05:43:32.153873 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:32.653853213 +0000 UTC m=+145.195954978 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.183257 4930 patch_prober.go:28] interesting pod/router-default-5444994796-mdwcw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 05:43:32 crc kubenswrapper[4930]: [-]has-synced failed: reason withheld Oct 12 05:43:32 crc kubenswrapper[4930]: [+]process-running ok Oct 12 05:43:32 crc kubenswrapper[4930]: healthz check failed Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.183304 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mdwcw" podUID="93aad091-5fd5-4eb5-a123-b7932dc268fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.184575 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bs29k"] Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.184685 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-25pm7"] Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.186984 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xctd8" event={"ID":"8f0c1ca3-f083-49b3-936d-84311a33c5a3","Type":"ContainerStarted","Data":"a94004f6c398bc0624fbf282fd939f1c68e2d109fdd481558572775f336ec04d"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.206481 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fmt5q" event={"ID":"69e0e716-3ae8-482b-bb9e-57f6f5d859cd","Type":"ContainerStarted","Data":"60893cf9e5342f85611796c1a4cf0c34fc93762d22124028c59498c6f8ad1264"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.212302 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-p5m42" event={"ID":"de12eb0a-0d0a-455c-a275-a3bfdb6f9d72","Type":"ContainerStarted","Data":"c8b85923bfa61b8218e4310f1b9bfe34554554ca6fbe1d3f79059b9455dd0ea9"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.223923 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7xzpm" event={"ID":"bcac2427-8311-481a-85f6-4c9b96d3bbe2","Type":"ContainerStarted","Data":"de8c6de77cce5a59ea3e7258cf8bae9c9f1573e73cc0260b4bccbcdd43bb0965"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.223964 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7xzpm" event={"ID":"bcac2427-8311-481a-85f6-4c9b96d3bbe2","Type":"ContainerStarted","Data":"d321e8fb994712fc6e8cacd671fa2f421e00205cdea21b26273dfb853c1f4559"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.258150 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:32 crc kubenswrapper[4930]: E1012 05:43:32.259168 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:32.759153093 +0000 UTC m=+145.301254858 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.260308 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-968tr"] Oct 12 05:43:32 crc kubenswrapper[4930]: W1012 05:43:32.348371 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7051fcad_4fe8_4c5c_8694_2a6e63b541d0.slice/crio-94b25837be0f546e05aa6e54bed36bac003e68a7fbb55d285a6d4f19a38e0f6d WatchSource:0}: Error finding container 94b25837be0f546e05aa6e54bed36bac003e68a7fbb55d285a6d4f19a38e0f6d: Status 404 returned error can't find the container with id 94b25837be0f546e05aa6e54bed36bac003e68a7fbb55d285a6d4f19a38e0f6d Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.352282 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g8lk2" event={"ID":"36224327-6c6b-4ff0-86e2-97c02039b8c6","Type":"ContainerStarted","Data":"58887e008ef0899c79a5fd2731004b36ba8f385714b67fe51ff7d86ae1ae1912"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.354547 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2lc4l"] Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.360414 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:32 crc kubenswrapper[4930]: E1012 05:43:32.360883 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:32.860871083 +0000 UTC m=+145.402972838 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.363473 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rl2fl" event={"ID":"e4e4971a-01da-4555-b3a5-2ab4a9118163","Type":"ContainerStarted","Data":"acad6ea17973a85916c7f41eecfac80195a80a91ef18068dff91b2ff060b662b"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.373781 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-b6q56" event={"ID":"eb977a38-ef3b-4820-a364-ad16d6c857d5","Type":"ContainerStarted","Data":"dcbd971c5f0222d6cf67950c14c24621a3d891093126d1370402ec38c51fe4b0"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.373827 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-b6q56" event={"ID":"eb977a38-ef3b-4820-a364-ad16d6c857d5","Type":"ContainerStarted","Data":"4e3ebf72ea3463a27fee00b739b2d7b986e74d3a0d2211c35ebed95c3791200a"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.404346 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-hbgd8" event={"ID":"42753d84-7f68-4c88-85e1-a06eacd8052d","Type":"ContainerStarted","Data":"bb70d8ceb1dbef66a045cd4a7665d02a73adbd9e85de6e24039bdd75f5af5fb6"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.404381 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-hbgd8" event={"ID":"42753d84-7f68-4c88-85e1-a06eacd8052d","Type":"ContainerStarted","Data":"fe48688bc0e71df5f1ada41dadc28cb655414c72e98b60ac3554f8a8e7747ef8"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.406450 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-phltz"] Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.410420 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lk89q"] Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.412709 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9m6tj" event={"ID":"805f5605-a80f-40be-aaa2-5cedbc94960f","Type":"ContainerStarted","Data":"24d958193017265e6eb7ab841f2d19eec7b12bfa387d3fe2b5f8460eb87e2f57"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.417257 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wdpcs" event={"ID":"2a81e403-a7dc-405a-bab2-8a9f28fa22fa","Type":"ContainerStarted","Data":"071cd4cf78eaa8f9a13d7fdc38b2b8ea5e4ad90f37260bf43b9e8636c2260d0e"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.461066 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:32 crc kubenswrapper[4930]: E1012 05:43:32.464063 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:32.964046843 +0000 UTC m=+145.506148608 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.466867 4930 generic.go:334] "Generic (PLEG): container finished" podID="14ec6d5d-4697-48f0-be34-aa3c028fc8a4" containerID="11c98f95bebfd97b88075fc626c2a5054fae3593b66c7c2369cb90176bd82da2" exitCode=0 Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.466980 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lmr9n" event={"ID":"14ec6d5d-4697-48f0-be34-aa3c028fc8a4","Type":"ContainerDied","Data":"11c98f95bebfd97b88075fc626c2a5054fae3593b66c7c2369cb90176bd82da2"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.469835 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-92ngq" event={"ID":"fc56c928-9a28-4b6a-9652-49bf164d7104","Type":"ContainerStarted","Data":"3110c561ba2ecd26e78843668c6b9d0af1e9e6c32aaa6f9d65b4939cb0defb63"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.469858 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-92ngq" event={"ID":"fc56c928-9a28-4b6a-9652-49bf164d7104","Type":"ContainerStarted","Data":"a8acdbbc65d1881bb0724000a6ae1e4e9509a74a9fd4717ffeb6ad137f7d4c37"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.475322 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" event={"ID":"ef886351-d776-447e-ac43-1c16568cac4f","Type":"ContainerStarted","Data":"645112c8fb2436feee6f09d29307a3461b8966f9fa20747297fb4af43a79c3ef"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.477034 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dprns" event={"ID":"74488d89-3c1c-4e78-9c26-35be09ac8cde","Type":"ContainerStarted","Data":"09f504eeb409389cf5358375444e091d8d35ec0bca96994e876a2d738ca8009c"} Oct 12 05:43:32 crc kubenswrapper[4930]: W1012 05:43:32.511812 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod131bb70c_fac8_4696_ad60_1acc99df28c2.slice/crio-8e4aa626d1e4c5d63d27bac8d8cec777f56596e9841e456890ce8280c16c029a WatchSource:0}: Error finding container 8e4aa626d1e4c5d63d27bac8d8cec777f56596e9841e456890ce8280c16c029a: Status 404 returned error can't find the container with id 8e4aa626d1e4c5d63d27bac8d8cec777f56596e9841e456890ce8280c16c029a Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.512086 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" event={"ID":"79650663-cd46-4fef-a731-dfc45f8fa945","Type":"ContainerStarted","Data":"3d98b32d90922bf1f0fceddbbbb4f16ff4395068f862d8f64d6a439e8301f149"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.512897 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.514140 4930 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-t48rv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.514182 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" podUID="79650663-cd46-4fef-a731-dfc45f8fa945" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.517399 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5v7hl" event={"ID":"bb64ba69-2433-4d68-b7bc-c5c4f62d5f2f","Type":"ContainerStarted","Data":"889227e5df7abebb7d68979fed56f5a9244f52bd8b5c9b5b1803082e0ffcf634"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.564402 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:32 crc kubenswrapper[4930]: E1012 05:43:32.567433 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:33.067421379 +0000 UTC m=+145.609523144 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.574010 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-w9zwj" event={"ID":"f086907d-7c4a-488b-a783-bc24e827e5e6","Type":"ContainerStarted","Data":"7d8571c54e058df080bd02805143ab14da7f26d4bc97060087d12bd70295c0c9"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.650297 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-rg6mp" event={"ID":"9ce83b79-8e6f-4188-a019-30399c8367f7","Type":"ContainerStarted","Data":"dd8a503e02aa68a4136541a3ef31f1e3be8ebbd75eb18b026bf7ef1d370964d8"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.650343 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-rg6mp" Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.651390 4930 patch_prober.go:28] interesting pod/console-operator-58897d9998-rg6mp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.651512 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-rg6mp" podUID="9ce83b79-8e6f-4188-a019-30399c8367f7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.665916 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:32 crc kubenswrapper[4930]: E1012 05:43:32.669031 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:33.168985524 +0000 UTC m=+145.711087289 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.675474 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf" event={"ID":"cd39a673-35f1-435a-bd8d-02b253c12f1c","Type":"ContainerStarted","Data":"35f261bf33cbdef5ee122b1903a8108c926f33285cb7e99169a83c41193f36c5"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.676487 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf" Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.722714 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-b6q56" podStartSLOduration=123.722697429 podStartE2EDuration="2m3.722697429s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:32.71953181 +0000 UTC m=+145.261633575" watchObservedRunningTime="2025-10-12 05:43:32.722697429 +0000 UTC m=+145.264799194" Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.726460 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6r5ch" event={"ID":"4af4dbea-ada1-465b-a6d9-24843bf3808d","Type":"ContainerStarted","Data":"3b48f92dd3f98b7e420f49fa8f559bb78ccca6a0b20fc24610a7f042ff9082d3"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.726564 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6r5ch" event={"ID":"4af4dbea-ada1-465b-a6d9-24843bf3808d","Type":"ContainerStarted","Data":"ecd334746c0f544b93d768cd1f6dea530a59d6f22ee2d59c2fce281a48203ac5"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.739109 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" event={"ID":"98fa14b7-a351-47ab-bcd2-83e2a81f9859","Type":"ContainerStarted","Data":"a6f1d24bfed0a825d92508460bb144254ddcebc16de5b077c653e688f3a14adc"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.768228 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:32 crc kubenswrapper[4930]: E1012 05:43:32.769288 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:33.269274022 +0000 UTC m=+145.811375787 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.775133 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x89kc" event={"ID":"9ed0c781-82fd-4490-adcb-0c03fff3a7c8","Type":"ContainerStarted","Data":"d4bb0c41125200acdc12935bf0d17b3851881e6c3a555819d005a69e16a84e7a"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.796134 4930 generic.go:334] "Generic (PLEG): container finished" podID="cf1f38ec-f38b-47e0-9b80-e66740ebdaba" containerID="965c16b196424496aa3bb3b14b6ae6fc718b8f2a2303ddc2844afa41ecded5f4" exitCode=0 Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.796221 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" event={"ID":"cf1f38ec-f38b-47e0-9b80-e66740ebdaba","Type":"ContainerDied","Data":"965c16b196424496aa3bb3b14b6ae6fc718b8f2a2303ddc2844afa41ecded5f4"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.868062 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-47d6m" event={"ID":"e9baf15f-27e7-442f-98d8-fdb29719ac71","Type":"ContainerStarted","Data":"996ee0a7eeacfa94c2ea7b16a3471fa9b8bf9b46c1f71b2b0866cbbe68841b37"} Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.870463 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:32 crc kubenswrapper[4930]: E1012 05:43:32.871173 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:33.371158956 +0000 UTC m=+145.913260721 (durationBeforeRetry 500ms). 
Oct 12 05:43:32 crc kubenswrapper[4930]: E1012 05:43:32.871173 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:33.371158956 +0000 UTC m=+145.913260721 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.878331 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-mdwcw" podStartSLOduration=123.878312588 podStartE2EDuration="2m3.878312588s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:32.834983946 +0000 UTC m=+145.377085711" watchObservedRunningTime="2025-10-12 05:43:32.878312588 +0000 UTC m=+145.420414353"
Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.887050 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-zxqcl" event={"ID":"54c34701-7d60-4396-9a64-81b91379fbe9","Type":"ContainerStarted","Data":"d05ca14956871b376c01e3065a041e7ce39663a1ba949da8d492c8fb89b882ad"}
Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.916022 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337450-9g4rr" event={"ID":"74be97dd-4d16-40ed-87e4-b707eccf422e","Type":"ContainerStarted","Data":"3efd8d4f1fad50e702a8d8ac66b0f7c78a206680d5761549f29cffa73304605e"}
Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.965174 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ncmk8" event={"ID":"02cd3769-c022-4645-b33e-eb1303133aea","Type":"ContainerStarted","Data":"f08362fc47b6c76b6416ef4536ae9fe7c75f7de285cce36d72d04a8ba75ff591"}
Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.967056 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7xzpm" podStartSLOduration=123.967040911 podStartE2EDuration="2m3.967040911s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:32.927699781 +0000 UTC m=+145.469801546" watchObservedRunningTime="2025-10-12 05:43:32.967040911 +0000 UTC m=+145.509142676"
Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.970402 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d6mxh" event={"ID":"a8c9e0c4-c47b-4bbc-ab98-ce7fa42d8e0c","Type":"ContainerStarted","Data":"4bea269d5377761138b61f26fae9f2d576242473237fed381b9731f0805b8141"}
Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.971440 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4qrq" event={"ID":"882d387b-cc67-4a31-b25a-8b4197bd42f2","Type":"ContainerStarted","Data":"f44bee20c95edb6c879898a55d21b4bb8a1e5acc3b93d99a6a0d718137d0fc46"}
Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.976682 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2"
Oct 12 05:43:32 crc kubenswrapper[4930]: E1012 05:43:32.976960 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:33.476946081 +0000 UTC m=+146.019047846 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.985005 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-psfdl" event={"ID":"db2e99ac-f9a3-41cf-8a72-b05db4880b78","Type":"ContainerStarted","Data":"26997a1a0df87d4924e191e9b4e41b5aded85ae8848b60955c5e80dd9ad3749d"}
Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.985719 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-psfdl"
Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.995096 4930 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-psfdl container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:5443/healthz\": dial tcp 10.217.0.32:5443: connect: connection refused" start-of-body=
Oct 12 05:43:32 crc kubenswrapper[4930]: I1012 05:43:32.995137 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-psfdl" podUID="db2e99ac-f9a3-41cf-8a72-b05db4880b78" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.32:5443/healthz\": dial tcp 10.217.0.32:5443: connect: connection refused"
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.041536 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9h9hz" event={"ID":"0bfadb73-bbc1-41db-8374-9c3aaf00682d","Type":"ContainerStarted","Data":"32960b32c63abb792601df236a811e21fd225deadf43c09b4c8d4a180a4609be"}
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.043616 4930 patch_prober.go:28] interesting pod/downloads-7954f5f757-ztn69 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body=
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.043654 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ztn69" podUID="6c179d5f-d7dd-42b9-b248-cd3c34237961" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused"
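The prober entries above show the shape of these readiness failures: an HTTP GET against the container's endpoint that fails at dial time (connection refused) until the server binds its socket. A rough Go sketch of such a check (illustrative, not the kubelet's actual prober; the URL is the downloads endpoint from the lines above, and treating any 2xx-3xx status as success is an assumption):

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// probeOnce mimics an HTTP readiness check: a dial error (connection refused)
// fails the probe, as does a response with a non-2xx/3xx status code.
func probeOnce(url string) error {
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "dial tcp 10.217.0.9:8080: connect: connection refused"
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		return nil
	}
	return fmt.Errorf("HTTP probe failed with statuscode: %d", resp.StatusCode)
}

func main() {
	// The https endpoints in this log (e.g. packageserver's /healthz) would
	// additionally need a client configured with the cluster's CA.
	if err := probeOnce("http://10.217.0.9:8080/"); err != nil {
		fmt.Println("probe failed:", err)
	} else {
		fmt.Println("probe ok")
	}
}
```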
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.084834 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 12 05:43:33 crc kubenswrapper[4930]: E1012 05:43:33.084990 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:33.584966798 +0000 UTC m=+146.127068563 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.085953 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2"
Oct 12 05:43:33 crc kubenswrapper[4930]: E1012 05:43:33.087278 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:33.587267063 +0000 UTC m=+146.129368818 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.141883 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf"
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.182103 4930 patch_prober.go:28] interesting pod/router-default-5444994796-mdwcw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 12 05:43:33 crc kubenswrapper[4930]: [-]has-synced failed: reason withheld
Oct 12 05:43:33 crc kubenswrapper[4930]: [+]process-running ok
Oct 12 05:43:33 crc kubenswrapper[4930]: healthz check failed
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.182182 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mdwcw" podUID="93aad091-5fd5-4eb5-a123-b7932dc268fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.195240 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 12 05:43:33 crc kubenswrapper[4930]: E1012 05:43:33.195631 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:33.695615019 +0000 UTC m=+146.237716784 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.207101 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rl2fl" podStartSLOduration=124.207082963 podStartE2EDuration="2m4.207082963s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:33.167595129 +0000 UTC m=+145.709696894" watchObservedRunningTime="2025-10-12 05:43:33.207082963 +0000 UTC m=+145.749184728"
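The pod_startup_latency_tracker entries are internally consistent: podStartSLOduration is watchObservedRunningTime minus podCreationTimestamp. For kube-apiserver-operator-766d6c64bb-rl2fl above, 05:43:33.207082963 - 05:41:29 = 124.207082963 s, i.e. the logged "2m4.207082963s", and the SLO and E2E durations coincide because firstStartedPulling/lastFinishedPulling are the zero time (no image pull contributed). A small sketch reproducing the arithmetic, with the timestamps copied from that entry (the layout string is just Go's default time.Time formatting as it appears in the log):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Go's default time.Time format, matching the timestamps in these entries.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, err := time.Parse(layout, "2025-10-12 05:41:29 +0000 UTC")
	if err != nil {
		panic(err)
	}
	running, err := time.Parse(layout, "2025-10-12 05:43:33.207082963 +0000 UTC")
	if err != nil {
		panic(err)
	}
	// Prints 2m4.207082963s, matching podStartSLOduration=124.207082963.
	fmt.Println(running.Sub(created))
}
```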
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.297022 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2"
Oct 12 05:43:33 crc kubenswrapper[4930]: E1012 05:43:33.297348 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:33.797336319 +0000 UTC m=+146.339438084 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.331939 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xctd8" podStartSLOduration=124.331915454 podStartE2EDuration="2m4.331915454s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:33.330165975 +0000 UTC m=+145.872267740" watchObservedRunningTime="2025-10-12 05:43:33.331915454 +0000 UTC m=+145.874017219"
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.364170 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29337450-9g4rr" podStartSLOduration=124.364152113 podStartE2EDuration="2m4.364152113s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:33.363000781 +0000 UTC m=+145.905102546" watchObservedRunningTime="2025-10-12 05:43:33.364152113 +0000 UTC m=+145.906253878"
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.396700 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" podStartSLOduration=124.396682071 podStartE2EDuration="2m4.396682071s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:33.395803076 +0000 UTC m=+145.937904841" watchObservedRunningTime="2025-10-12 05:43:33.396682071 +0000 UTC m=+145.938783836"
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.400632 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 12 05:43:33 crc kubenswrapper[4930]: E1012 05:43:33.401294 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:33.901281941 +0000 UTC m=+146.443383706 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.452312 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-m4qrq" podStartSLOduration=124.45229461 podStartE2EDuration="2m4.45229461s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:33.446228159 +0000 UTC m=+145.988329924" watchObservedRunningTime="2025-10-12 05:43:33.45229461 +0000 UTC m=+145.994396375"
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.492179 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-92ngq" podStartSLOduration=6.492164534 podStartE2EDuration="6.492164534s" podCreationTimestamp="2025-10-12 05:43:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:33.490849447 +0000 UTC m=+146.032951212" watchObservedRunningTime="2025-10-12 05:43:33.492164534 +0000 UTC m=+146.034266299"
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.502225 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2"
Oct 12 05:43:33 crc kubenswrapper[4930]: E1012 05:43:33.502632 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:34.00262057 +0000 UTC m=+146.544722335 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.534584 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf" podStartSLOduration=124.534567121 podStartE2EDuration="2m4.534567121s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:33.534029695 +0000 UTC m=+146.076131460" watchObservedRunningTime="2025-10-12 05:43:33.534567121 +0000 UTC m=+146.076668886"
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.561455 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x89kc" podStartSLOduration=124.561437799 podStartE2EDuration="2m4.561437799s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:33.559627828 +0000 UTC m=+146.101729593" watchObservedRunningTime="2025-10-12 05:43:33.561437799 +0000 UTC m=+146.103539564"
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.607279 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-w9zwj" podStartSLOduration=124.607262831 podStartE2EDuration="2m4.607262831s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:33.606773537 +0000 UTC m=+146.148875302" watchObservedRunningTime="2025-10-12 05:43:33.607262831 +0000 UTC m=+146.149364606"
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.611982 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 12 05:43:33 crc kubenswrapper[4930]: E1012 05:43:33.612418 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:34.112400696 +0000 UTC m=+146.654502461 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.664614 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-rg6mp" podStartSLOduration=124.664597359 podStartE2EDuration="2m4.664597359s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:33.664407383 +0000 UTC m=+146.206509148" watchObservedRunningTime="2025-10-12 05:43:33.664597359 +0000 UTC m=+146.206699124"
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.681453 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.681503 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
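Across these entries the same mount/unmount pair for pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 recurs every 100-200ms as the reconciler re-queues both operations, each failure deferring only its own retry by 500ms ("No retries permitted until" is always the failure time plus 500ms). To quantify an episode like this from a saved journal, counting the CSI registration errors per second is enough; a throwaway Go sketch (it assumes the journal was exported beforehand to a plain-text file named kubelet.log, e.g. with journalctl -u kubelet):

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

func main() {
	// Assumption: the journal was saved to kubelet.log beforehand.
	f, err := os.Open("kubelet.log")
	if err != nil {
		panic(err)
	}
	defer f.Close()

	counts := map[string]int{} // "Oct 12 05:43:33" -> failures seen in that second
	scanner := bufio.NewScanner(f)
	scanner.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // these entries can be very long
	for scanner.Scan() {
		line := scanner.Text()
		if strings.Contains(line, "not found in the list of registered CSI drivers") {
			if len(line) >= 15 {
				counts[line[:15]]++ // syslog-style "Mon DD hh:mm:ss" prefix
			}
		}
	}
	if err := scanner.Err(); err != nil {
		panic(err)
	}
	for ts, n := range counts {
		fmt.Printf("%s  %d CSI registration failures\n", ts, n)
	}
}
```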
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.727439 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2"
Oct 12 05:43:33 crc kubenswrapper[4930]: E1012 05:43:33.728078 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:34.228066019 +0000 UTC m=+146.770167784 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.830485 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 12 05:43:33 crc kubenswrapper[4930]: E1012 05:43:33.830793 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:34.330777476 +0000 UTC m=+146.872879241 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.887611 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-6r5ch" podStartSLOduration=124.887596239 podStartE2EDuration="2m4.887596239s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:33.738823423 +0000 UTC m=+146.280925178" watchObservedRunningTime="2025-10-12 05:43:33.887596239 +0000 UTC m=+146.429698004"
Oct 12 05:43:33 crc kubenswrapper[4930]: I1012 05:43:33.932264 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2"
Oct 12 05:43:33 crc kubenswrapper[4930]: E1012 05:43:33.932758 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:34.432720812 +0000 UTC m=+146.974822587 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.035532 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 12 05:43:34 crc kubenswrapper[4930]: E1012 05:43:34.036005 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:34.535981725 +0000 UTC m=+147.078083490 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.036075 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2"
Oct 12 05:43:34 crc kubenswrapper[4930]: E1012 05:43:34.036436 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:34.536424088 +0000 UTC m=+147.078525843 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.101354 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-968tr" event={"ID":"220d40c1-2181-40b6-9d6b-6704fe99ec24","Type":"ContainerStarted","Data":"6c882f38c0d62b44821b937c13117ac9799be9302e0ecc2b9635fe25709e866e"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.101406 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-968tr" event={"ID":"220d40c1-2181-40b6-9d6b-6704fe99ec24","Type":"ContainerStarted","Data":"064c2e6f0c86584c479a72cd6657bf6344ddfecb62923dad989b43a263ab43f9"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.101849 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-968tr"
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.104146 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g8lk2" event={"ID":"36224327-6c6b-4ff0-86e2-97c02039b8c6","Type":"ContainerStarted","Data":"5eb6652270552d3f63719577b32c35df8869af7d9a0058bbb32ccd5b2c1aece3"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.105596 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-psfdl" podStartSLOduration=125.105576698 podStartE2EDuration="2m5.105576698s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:34.010953059 +0000 UTC m=+146.553054824" watchObservedRunningTime="2025-10-12 05:43:34.105576698 +0000 UTC m=+146.647678463"
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.113820 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wdpcs" event={"ID":"2a81e403-a7dc-405a-bab2-8a9f28fa22fa","Type":"ContainerStarted","Data":"7ce09d765a8ef4a875f83147dd4a2f018f48580eee64cfad3ac68cb29e3d74d5"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.114242 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wdpcs"
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.132165 4930 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-968tr container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body=
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.132230 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-968tr" podUID="220d40c1-2181-40b6-9d6b-6704fe99ec24" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused"
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.138178 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 12 05:43:34 crc kubenswrapper[4930]: E1012 05:43:34.138281 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:34.6382578 +0000 UTC m=+147.180359565 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.138538 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2"
Oct 12 05:43:34 crc kubenswrapper[4930]: E1012 05:43:34.138843 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:34.638831736 +0000 UTC m=+147.180933501 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.149325 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9h9hz" podStartSLOduration=125.149311522 podStartE2EDuration="2m5.149311522s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:34.147682056 +0000 UTC m=+146.689783811" watchObservedRunningTime="2025-10-12 05:43:34.149311522 +0000 UTC m=+146.691413287"
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.194106 4930 patch_prober.go:28] interesting pod/router-default-5444994796-mdwcw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 12 05:43:34 crc kubenswrapper[4930]: [-]has-synced failed: reason withheld
Oct 12 05:43:34 crc kubenswrapper[4930]: [+]process-running ok
Oct 12 05:43:34 crc kubenswrapper[4930]: healthz check failed
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.194175 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mdwcw" podUID="93aad091-5fd5-4eb5-a123-b7932dc268fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.206045 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-psfdl" event={"ID":"db2e99ac-f9a3-41cf-8a72-b05db4880b78","Type":"ContainerStarted","Data":"142b8f640d886200b9e661753d4381bc085cab444157ae68087554432eb36416"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.208319 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337450-9g4rr" event={"ID":"74be97dd-4d16-40ed-87e4-b707eccf422e","Type":"ContainerStarted","Data":"03c67c5984d37c63a32d182586fc0c79ea9235d37261509a22f89f4ee0167502"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.214532 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2lc4l" event={"ID":"c5333e14-992e-411c-bd42-9848ff42b021","Type":"ContainerStarted","Data":"98dd1baa2a5d26188fe228062dded06a8b142c750bc3d052be05dfb169a7fec2"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.214576 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2lc4l" event={"ID":"c5333e14-992e-411c-bd42-9848ff42b021","Type":"ContainerStarted","Data":"a43035142935eca22c6dafb36d01b77d6f89730a13f1b5aaf22f94e6ed12c67f"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.214586 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2lc4l" event={"ID":"c5333e14-992e-411c-bd42-9848ff42b021","Type":"ContainerStarted","Data":"35c3f0eda37d941c16aec5764fc559276011b446dae85820eee6e8399310df36"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.215202 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2lc4l"
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.242535 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 12 05:43:34 crc kubenswrapper[4930]: E1012 05:43:34.243076 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:34.743063017 +0000 UTC m=+147.285164782 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.251704 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ncmk8" podStartSLOduration=125.2516803 podStartE2EDuration="2m5.2516803s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:34.241869253 +0000 UTC m=+146.783971018" watchObservedRunningTime="2025-10-12 05:43:34.2516803 +0000 UTC m=+146.793782055"
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.253095 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-47d6m" podStartSLOduration=125.253086639 podStartE2EDuration="2m5.253086639s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:34.187276243 +0000 UTC m=+146.729378008" watchObservedRunningTime="2025-10-12 05:43:34.253086639 +0000 UTC m=+146.795188404"
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.273088 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wdpcs"
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.281987 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d6mxh" event={"ID":"a8c9e0c4-c47b-4bbc-ab98-ce7fa42d8e0c","Type":"ContainerStarted","Data":"a3c22070fc8720c90d602c96c3a0991e849a71b7bc0af2d91a3ed8f6ebe915a8"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.282050 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d6mxh" event={"ID":"a8c9e0c4-c47b-4bbc-ab98-ce7fa42d8e0c","Type":"ContainerStarted","Data":"d30f21df45044073be319e4e481a5b94184e7a37157abeeaccf25d2f96ea60b5"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.290521 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-25pm7" event={"ID":"c393fc2e-599b-4842-9c6c-b46cef38f7e6","Type":"ContainerStarted","Data":"022b80ff49dcf8c7a97d2323996c7f67b0d449081db63af4da36e0e0a64fc54c"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.305858 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-hbgd8" event={"ID":"42753d84-7f68-4c88-85e1-a06eacd8052d","Type":"ContainerStarted","Data":"9494a75dafb959c4be354cd72095321d1319a8f62f023f275dceb76fe051247a"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.314434 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-phltz" event={"ID":"131bb70c-fac8-4696-ad60-1acc99df28c2","Type":"ContainerStarted","Data":"edf38a4ad45c340b9ed730c99573eed0530a240fc2f6b505cdc643e2dae8a6fb"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.314484 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-phltz" event={"ID":"131bb70c-fac8-4696-ad60-1acc99df28c2","Type":"ContainerStarted","Data":"8e4aa626d1e4c5d63d27bac8d8cec777f56596e9841e456890ce8280c16c029a"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.317731 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fmt5q" event={"ID":"69e0e716-3ae8-482b-bb9e-57f6f5d859cd","Type":"ContainerStarted","Data":"af6f44c3cf72599ba390ebb4cde36194dc576898d3956478b647b8a78ed608b3"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.330202 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-zxqcl" event={"ID":"54c34701-7d60-4396-9a64-81b91379fbe9","Type":"ContainerStarted","Data":"62b0dcfd1b8fc09c1b3334584dec3dae03f288e566cd0c9a0582a6d24716d663"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.330249 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-zxqcl" event={"ID":"54c34701-7d60-4396-9a64-81b91379fbe9","Type":"ContainerStarted","Data":"f5b3382a4824d2bad06a73602325912b3b0b2338ff68f15fb6fd288b6a3e8472"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.351893 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2"
Oct 12 05:43:34 crc kubenswrapper[4930]: E1012 05:43:34.353127 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:34.853111461 +0000 UTC m=+147.395213226 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.399910 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-968tr" podStartSLOduration=125.399892731 podStartE2EDuration="2m5.399892731s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:34.299127238 +0000 UTC m=+146.841229003" watchObservedRunningTime="2025-10-12 05:43:34.399892731 +0000 UTC m=+146.941994496"
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.401007 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vd458" event={"ID":"7051fcad-4fe8-4c5c-8694-2a6e63b541d0","Type":"ContainerStarted","Data":"6272d9977124105418957f0cf55c6f25a80cd27094d24fcfeaf829bcec89a755"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.401046 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vd458" event={"ID":"7051fcad-4fe8-4c5c-8694-2a6e63b541d0","Type":"ContainerStarted","Data":"94b25837be0f546e05aa6e54bed36bac003e68a7fbb55d285a6d4f19a38e0f6d"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.436901 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lmr9n" event={"ID":"14ec6d5d-4697-48f0-be34-aa3c028fc8a4","Type":"ContainerStarted","Data":"e81ca642637f03cc17905fd82881a8147a58f63fd50e1a55b423e46d15d38f99"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.437468 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lmr9n"
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.460287 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wdpcs" podStartSLOduration=125.460269274 podStartE2EDuration="2m5.460269274s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:34.401506976 +0000 UTC m=+146.943608741" watchObservedRunningTime="2025-10-12 05:43:34.460269274 +0000 UTC m=+147.002371039"
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.462229 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 12 05:43:34 crc kubenswrapper[4930]: E1012 05:43:34.463548 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:34.963531196 +0000 UTC m=+147.505632961 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.480065 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9m6tj" event={"ID":"805f5605-a80f-40be-aaa2-5cedbc94960f","Type":"ContainerStarted","Data":"2971d555f12668fe1fb0b44dd148a6b7094c3a173a3c036e268c3413e0c839e0"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.480129 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9m6tj" event={"ID":"805f5605-a80f-40be-aaa2-5cedbc94960f","Type":"ContainerStarted","Data":"14ec96d41427d87c22d034b0bd0745a7342ec3f8bfbce9c875ceb880ebe91ab4"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.490003 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2lc4l" podStartSLOduration=125.489988692 podStartE2EDuration="2m5.489988692s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:34.461650453 +0000 UTC m=+147.003752218" watchObservedRunningTime="2025-10-12 05:43:34.489988692 +0000 UTC m=+147.032090457"
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.490719 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fmt5q" podStartSLOduration=125.490715533 podStartE2EDuration="2m5.490715533s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:34.490123006 +0000 UTC m=+147.032224771" watchObservedRunningTime="2025-10-12 05:43:34.490715533 +0000 UTC m=+147.032817288"
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.497156 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dprns" event={"ID":"74488d89-3c1c-4e78-9c26-35be09ac8cde","Type":"ContainerStarted","Data":"6a8eeef05a79cfd029db7f37e89a4a3531620720ea2c134b93c6f58f308acb16"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.498895 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" event={"ID":"ef886351-d776-447e-ac43-1c16568cac4f","Type":"ContainerStarted","Data":"4fc010d9104846bcfb5c6059ec440b8a132b5fed8d48728e8c2e5d14d744923b"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.499511 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf"
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.501637 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5v7hl" event={"ID":"bb64ba69-2433-4d68-b7bc-c5c4f62d5f2f","Type":"ContainerStarted","Data":"0e03687d25fae247c8a21ab16f3fe170ee1855975c7254648afacc5964242f64"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.528240 4930 generic.go:334] "Generic (PLEG): container finished" podID="98fa14b7-a351-47ab-bcd2-83e2a81f9859" containerID="fd956f884ea61061dcafe98364c19417958998824777fb1e3db7280fc93b7da2" exitCode=0
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.528528 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" event={"ID":"98fa14b7-a351-47ab-bcd2-83e2a81f9859","Type":"ContainerDied","Data":"fd956f884ea61061dcafe98364c19417958998824777fb1e3db7280fc93b7da2"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.534387 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-hbgd8" podStartSLOduration=125.534355014 podStartE2EDuration="2m5.534355014s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:34.53316532 +0000 UTC m=+147.075267085" watchObservedRunningTime="2025-10-12 05:43:34.534355014 +0000 UTC m=+147.076456779"
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.546172 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bs29k" event={"ID":"a51e2785-1c3d-4354-a071-dadb05075c68","Type":"ContainerStarted","Data":"8efdde2ce6389e72ad82f1eb5cc2b576bbf55fbaf5f135dbc92a4179d946f76e"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.546226 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bs29k" event={"ID":"a51e2785-1c3d-4354-a071-dadb05075c68","Type":"ContainerStarted","Data":"d26c532eedf22d2fb6d8aa58bc15884c5b780b34d4bd50a10f4744b5ace41db0"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.547107 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-bs29k"
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.563931 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2"
Oct 12 05:43:34 crc kubenswrapper[4930]: E1012 05:43:34.565401 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:35.065389389 +0000 UTC m=+147.607491154 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.606334 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-p5m42" event={"ID":"de12eb0a-0d0a-455c-a275-a3bfdb6f9d72","Type":"ContainerStarted","Data":"0a05de220d6ba99e36cb473e88fcaf9adfd9b89b773e74e85c57d277569399dc"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.606687 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-p5m42" event={"ID":"de12eb0a-0d0a-455c-a275-a3bfdb6f9d72","Type":"ContainerStarted","Data":"cb065e1feda3821d4a674bd8b7db114f5712a1230139514a5426813d752b7f45"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.618111 4930 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bs29k container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body=
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.618166 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bs29k" podUID="a51e2785-1c3d-4354-a071-dadb05075c68" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused"
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.620427 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" event={"ID":"cf1f38ec-f38b-47e0-9b80-e66740ebdaba","Type":"ContainerStarted","Data":"0960c4b91992297cf92e33f3c87a90760835bc07570dcc86434bbf31300c83c6"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.626112 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lk89q" event={"ID":"17d6a32b-5d07-4158-853a-eefab6440720","Type":"ContainerStarted","Data":"26f58149aab126323b07625cd2abcf2ebbc5af63811fb3e712af981f89cb65fd"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.626151 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lk89q" event={"ID":"17d6a32b-5d07-4158-853a-eefab6440720","Type":"ContainerStarted","Data":"295c411ed4e3f63bc7b79afa48102fb5e43aacca8f9c0f7604a71e3913f46393"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.626176 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lk89q" event={"ID":"17d6a32b-5d07-4158-853a-eefab6440720","Type":"ContainerStarted","Data":"06074cb858da1c1f7fd1b68e46ffbd436435ffbfb1206e803702cf49a93499b5"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.630021 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-w9zwj" event={"ID":"f086907d-7c4a-488b-a783-bc24e827e5e6","Type":"ContainerStarted","Data":"4595c2572b0ce00854adcbac96a19a5c7666b3e6497c5e64b770f5788a739cbc"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.641928 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-phltz" podStartSLOduration=125.641915428 podStartE2EDuration="2m5.641915428s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:34.641537417 +0000 UTC m=+147.183639182" watchObservedRunningTime="2025-10-12 05:43:34.641915428 +0000 UTC m=+147.184017193"
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.643920 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d6mxh" podStartSLOduration=125.643896284 podStartE2EDuration="2m5.643896284s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:34.597212427 +0000 UTC m=+147.139314192" watchObservedRunningTime="2025-10-12 05:43:34.643896284 +0000 UTC m=+147.185998049"
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.662138 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9h9hz" event={"ID":"0bfadb73-bbc1-41db-8374-9c3aaf00682d","Type":"ContainerStarted","Data":"d53d17cc34a67af3eda71b27385fe266d0a98bc0dec0893cfdbc3914a941c789"}
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.665896 4930 patch_prober.go:28] interesting pod/downloads-7954f5f757-ztn69 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body=
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.665942 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ztn69" podUID="6c179d5f-d7dd-42b9-b248-cd3c34237961" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused"
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.672306 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 12 05:43:34 crc kubenswrapper[4930]: E1012 05:43:34.681560 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:35.181542996 +0000 UTC m=+147.723644761 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.699884 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-zxqcl" podStartSLOduration=125.699866383 podStartE2EDuration="2m5.699866383s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:34.699714808 +0000 UTC m=+147.241816573" watchObservedRunningTime="2025-10-12 05:43:34.699866383 +0000 UTC m=+147.241968148"
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.716256 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv"
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.762286 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-rg6mp"
Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.781089 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2"
Oct 12 05:43:34 crc kubenswrapper[4930]: E1012 05:43:34.787005 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:35.28699023 +0000 UTC m=+147.829091995 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.803822 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lmr9n" podStartSLOduration=125.803807675 podStartE2EDuration="2m5.803807675s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:34.798880706 +0000 UTC m=+147.340982461" watchObservedRunningTime="2025-10-12 05:43:34.803807675 +0000 UTC m=+147.345909440" Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.882672 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:34 crc kubenswrapper[4930]: E1012 05:43:34.882705 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:35.3826893 +0000 UTC m=+147.924791065 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.883194 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:34 crc kubenswrapper[4930]: E1012 05:43:34.883489 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:35.383472892 +0000 UTC m=+147.925574657 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.896761 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" podStartSLOduration=125.896730336 podStartE2EDuration="2m5.896730336s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:34.851824389 +0000 UTC m=+147.393926154" watchObservedRunningTime="2025-10-12 05:43:34.896730336 +0000 UTC m=+147.438832101" Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.897391 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9m6tj" podStartSLOduration=125.897385905 podStartE2EDuration="2m5.897385905s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:34.896708285 +0000 UTC m=+147.438810050" watchObservedRunningTime="2025-10-12 05:43:34.897385905 +0000 UTC m=+147.439487670" Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.929854 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" podStartSLOduration=125.92983834 podStartE2EDuration="2m5.92983834s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:34.927237547 +0000 UTC m=+147.469339312" watchObservedRunningTime="2025-10-12 05:43:34.92983834 +0000 UTC m=+147.471940105" Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.961501 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-p5m42" podStartSLOduration=125.961483453 podStartE2EDuration="2m5.961483453s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:34.956592055 +0000 UTC m=+147.498693820" watchObservedRunningTime="2025-10-12 05:43:34.961483453 +0000 UTC m=+147.503585218" Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.984005 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:34 crc kubenswrapper[4930]: E1012 05:43:34.984496 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-12 05:43:35.484480881 +0000 UTC m=+148.026582646 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.997077 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:34 crc kubenswrapper[4930]: I1012 05:43:34.997467 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:34.998615 4930 patch_prober.go:28] interesting pod/apiserver-76f77b778f-6gz2l container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:34.998664 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" podUID="cf1f38ec-f38b-47e0-9b80-e66740ebdaba" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.087675 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:35 crc kubenswrapper[4930]: E1012 05:43:35.088024 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:35.588011232 +0000 UTC m=+148.130112997 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.150225 4930 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-psfdl container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.150284 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-psfdl" podUID="db2e99ac-f9a3-41cf-8a72-b05db4880b78" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.32:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.180459 4930 patch_prober.go:28] interesting pod/router-default-5444994796-mdwcw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 05:43:35 crc kubenswrapper[4930]: [-]has-synced failed: reason withheld Oct 12 05:43:35 crc kubenswrapper[4930]: [+]process-running ok Oct 12 05:43:35 crc kubenswrapper[4930]: healthz check failed Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.180509 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mdwcw" podUID="93aad091-5fd5-4eb5-a123-b7932dc268fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.188908 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:35 crc kubenswrapper[4930]: E1012 05:43:35.189252 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:35.689238667 +0000 UTC m=+148.231340433 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.234062 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5v7hl" podStartSLOduration=8.234045701 podStartE2EDuration="8.234045701s" podCreationTimestamp="2025-10-12 05:43:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:35.012999846 +0000 UTC m=+147.555101611" watchObservedRunningTime="2025-10-12 05:43:35.234045701 +0000 UTC m=+147.776147466" Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.290603 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:35 crc kubenswrapper[4930]: E1012 05:43:35.290886 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:35.790875075 +0000 UTC m=+148.332976840 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.307021 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-bs29k" podStartSLOduration=126.306983169 podStartE2EDuration="2m6.306983169s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:35.235307447 +0000 UTC m=+147.777409222" watchObservedRunningTime="2025-10-12 05:43:35.306983169 +0000 UTC m=+147.849084934" Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.393226 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:35 crc kubenswrapper[4930]: E1012 05:43:35.393492 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:35.893443448 +0000 UTC m=+148.435545213 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.393772 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:35 crc kubenswrapper[4930]: E1012 05:43:35.394363 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:35.894350324 +0000 UTC m=+148.436452089 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.402841 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lk89q" podStartSLOduration=126.402805672 podStartE2EDuration="2m6.402805672s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:35.402647348 +0000 UTC m=+147.944749113" watchObservedRunningTime="2025-10-12 05:43:35.402805672 +0000 UTC m=+147.944907437" Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.494954 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:35 crc kubenswrapper[4930]: E1012 05:43:35.495342 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:35.995326762 +0000 UTC m=+148.537428527 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.500128 4930 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-gbqmf container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.14:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.500197 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" podUID="ef886351-d776-447e-ac43-1c16568cac4f" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.14:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.596455 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:35 crc kubenswrapper[4930]: E1012 05:43:35.596800 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:36.096789794 +0000 UTC m=+148.638891559 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.625859 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-dprns" podStartSLOduration=126.625835294 podStartE2EDuration="2m6.625835294s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:35.56118208 +0000 UTC m=+148.103283845" watchObservedRunningTime="2025-10-12 05:43:35.625835294 +0000 UTC m=+148.167937069" Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.679955 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-25pm7" event={"ID":"c393fc2e-599b-4842-9c6c-b46cef38f7e6","Type":"ContainerStarted","Data":"50c37541af3cc6f619051c451b80243ebd25dcb95723107fe14a05caf0fd4c1e"} Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.684369 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" event={"ID":"98fa14b7-a351-47ab-bcd2-83e2a81f9859","Type":"ContainerStarted","Data":"bfed8553ac64b84feec3f39b110ed7a7aa27bc90f72997626cbb3b7ee6cb2b52"} Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.694068 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g8lk2" event={"ID":"36224327-6c6b-4ff0-86e2-97c02039b8c6","Type":"ContainerStarted","Data":"5404a68ffbddafb524aecdc919f2a24899d4ec61616d0baed243832d1fc619f4"} Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.694390 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-g8lk2" Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.698110 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:35 crc kubenswrapper[4930]: E1012 05:43:35.698578 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:36.198563865 +0000 UTC m=+148.740665630 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.718188 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vd458" event={"ID":"7051fcad-4fe8-4c5c-8694-2a6e63b541d0","Type":"ContainerStarted","Data":"e168422336ffe3e4c5aa8ae93ea2e62362282330db9ac02e0c7a4647ea29b896"} Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.737913 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" event={"ID":"cf1f38ec-f38b-47e0-9b80-e66740ebdaba","Type":"ContainerStarted","Data":"0a3eaa7f06b5d4ccfd00dc4d9c53ec38bd05a07d42ad11db58751dcccbf1d8fc"} Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.755447 4930 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bs29k container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.755489 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bs29k" podUID="a51e2785-1c3d-4354-a071-dadb05075c68" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.779923 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-psfdl" Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.782347 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" podStartSLOduration=126.782338188 podStartE2EDuration="2m6.782338188s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:35.730080044 +0000 UTC m=+148.272181809" watchObservedRunningTime="2025-10-12 05:43:35.782338188 +0000 UTC m=+148.324439953" Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.783921 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-g8lk2" podStartSLOduration=8.783917123 podStartE2EDuration="8.783917123s" podCreationTimestamp="2025-10-12 05:43:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:35.779053156 +0000 UTC m=+148.321154921" watchObservedRunningTime="2025-10-12 05:43:35.783917123 +0000 UTC m=+148.326018888" Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.794855 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xvjv6"] Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.795725 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xvjv6" Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.797911 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.799246 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.814572 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xvjv6"] Oct 12 05:43:35 crc kubenswrapper[4930]: E1012 05:43:35.815290 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:36.315276758 +0000 UTC m=+148.857378523 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.820758 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-vd458" podStartSLOduration=126.820724291 podStartE2EDuration="2m6.820724291s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:35.819113166 +0000 UTC m=+148.361214931" watchObservedRunningTime="2025-10-12 05:43:35.820724291 +0000 UTC m=+148.362826056" Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.891902 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-968tr" Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.901946 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:35 crc kubenswrapper[4930]: E1012 05:43:35.902123 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:36.402105627 +0000 UTC m=+148.944207392 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.902475 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deece869-6cf9-4922-b75a-294c828c6e9e-catalog-content\") pod \"community-operators-xvjv6\" (UID: \"deece869-6cf9-4922-b75a-294c828c6e9e\") " pod="openshift-marketplace/community-operators-xvjv6" Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.902599 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.902645 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deece869-6cf9-4922-b75a-294c828c6e9e-utilities\") pod \"community-operators-xvjv6\" (UID: \"deece869-6cf9-4922-b75a-294c828c6e9e\") " pod="openshift-marketplace/community-operators-xvjv6" Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.902726 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvkvn\" (UniqueName: \"kubernetes.io/projected/deece869-6cf9-4922-b75a-294c828c6e9e-kube-api-access-vvkvn\") pod \"community-operators-xvjv6\" (UID: \"deece869-6cf9-4922-b75a-294c828c6e9e\") " pod="openshift-marketplace/community-operators-xvjv6" Oct 12 05:43:35 crc kubenswrapper[4930]: E1012 05:43:35.903080 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:36.403065714 +0000 UTC m=+148.945167479 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.979788 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8wdhk"] Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.980718 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8wdhk" Oct 12 05:43:35 crc kubenswrapper[4930]: I1012 05:43:35.984696 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.005506 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.005663 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvkvn\" (UniqueName: \"kubernetes.io/projected/deece869-6cf9-4922-b75a-294c828c6e9e-kube-api-access-vvkvn\") pod \"community-operators-xvjv6\" (UID: \"deece869-6cf9-4922-b75a-294c828c6e9e\") " pod="openshift-marketplace/community-operators-xvjv6" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.005758 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deece869-6cf9-4922-b75a-294c828c6e9e-catalog-content\") pod \"community-operators-xvjv6\" (UID: \"deece869-6cf9-4922-b75a-294c828c6e9e\") " pod="openshift-marketplace/community-operators-xvjv6" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.005834 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deece869-6cf9-4922-b75a-294c828c6e9e-utilities\") pod \"community-operators-xvjv6\" (UID: \"deece869-6cf9-4922-b75a-294c828c6e9e\") " pod="openshift-marketplace/community-operators-xvjv6" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.006206 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deece869-6cf9-4922-b75a-294c828c6e9e-utilities\") pod \"community-operators-xvjv6\" (UID: \"deece869-6cf9-4922-b75a-294c828c6e9e\") " pod="openshift-marketplace/community-operators-xvjv6" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.006414 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deece869-6cf9-4922-b75a-294c828c6e9e-catalog-content\") pod \"community-operators-xvjv6\" (UID: \"deece869-6cf9-4922-b75a-294c828c6e9e\") " pod="openshift-marketplace/community-operators-xvjv6" Oct 12 05:43:36 crc kubenswrapper[4930]: E1012 05:43:36.006483 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:36.506465321 +0000 UTC m=+149.048567086 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.007025 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8wdhk"] Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.062490 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvkvn\" (UniqueName: \"kubernetes.io/projected/deece869-6cf9-4922-b75a-294c828c6e9e-kube-api-access-vvkvn\") pod \"community-operators-xvjv6\" (UID: \"deece869-6cf9-4922-b75a-294c828c6e9e\") " pod="openshift-marketplace/community-operators-xvjv6" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.107478 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.107518 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.107542 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlw4t\" (UniqueName: \"kubernetes.io/projected/54acd3fa-9208-450c-9a6a-4bae6962c325-kube-api-access-qlw4t\") pod \"certified-operators-8wdhk\" (UID: \"54acd3fa-9208-450c-9a6a-4bae6962c325\") " pod="openshift-marketplace/certified-operators-8wdhk" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.107563 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54acd3fa-9208-450c-9a6a-4bae6962c325-utilities\") pod \"certified-operators-8wdhk\" (UID: \"54acd3fa-9208-450c-9a6a-4bae6962c325\") " pod="openshift-marketplace/certified-operators-8wdhk" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.107588 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54acd3fa-9208-450c-9a6a-4bae6962c325-catalog-content\") pod \"certified-operators-8wdhk\" (UID: \"54acd3fa-9208-450c-9a6a-4bae6962c325\") " pod="openshift-marketplace/certified-operators-8wdhk" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.107608 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.107638 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.107671 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.112107 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:43:36 crc kubenswrapper[4930]: E1012 05:43:36.112348 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:36.612336557 +0000 UTC m=+149.154438322 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.114791 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.120228 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.128602 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.146106 4930 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xvjv6" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.160341 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.173038 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.185103 4930 patch_prober.go:28] interesting pod/router-default-5444994796-mdwcw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 05:43:36 crc kubenswrapper[4930]: [-]has-synced failed: reason withheld Oct 12 05:43:36 crc kubenswrapper[4930]: [+]process-running ok Oct 12 05:43:36 crc kubenswrapper[4930]: healthz check failed Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.185181 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mdwcw" podUID="93aad091-5fd5-4eb5-a123-b7932dc268fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.189327 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qdh6w"] Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.194042 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qdh6w" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.201101 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.215072 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qdh6w"] Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.219595 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:36 crc kubenswrapper[4930]: E1012 05:43:36.226140 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:36.726115847 +0000 UTC m=+149.268217612 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.231821 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.231900 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlw4t\" (UniqueName: \"kubernetes.io/projected/54acd3fa-9208-450c-9a6a-4bae6962c325-kube-api-access-qlw4t\") pod \"certified-operators-8wdhk\" (UID: \"54acd3fa-9208-450c-9a6a-4bae6962c325\") " pod="openshift-marketplace/certified-operators-8wdhk" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.231925 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54acd3fa-9208-450c-9a6a-4bae6962c325-utilities\") pod \"certified-operators-8wdhk\" (UID: \"54acd3fa-9208-450c-9a6a-4bae6962c325\") " pod="openshift-marketplace/certified-operators-8wdhk" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.232009 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54acd3fa-9208-450c-9a6a-4bae6962c325-catalog-content\") pod \"certified-operators-8wdhk\" (UID: \"54acd3fa-9208-450c-9a6a-4bae6962c325\") " pod="openshift-marketplace/certified-operators-8wdhk" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.232676 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54acd3fa-9208-450c-9a6a-4bae6962c325-catalog-content\") pod \"certified-operators-8wdhk\" (UID: \"54acd3fa-9208-450c-9a6a-4bae6962c325\") " pod="openshift-marketplace/certified-operators-8wdhk" Oct 12 05:43:36 crc kubenswrapper[4930]: E1012 05:43:36.233177 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:36.733157596 +0000 UTC m=+149.275259361 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.234017 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54acd3fa-9208-450c-9a6a-4bae6962c325-utilities\") pod \"certified-operators-8wdhk\" (UID: \"54acd3fa-9208-450c-9a6a-4bae6962c325\") " pod="openshift-marketplace/certified-operators-8wdhk" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.285045 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlw4t\" (UniqueName: \"kubernetes.io/projected/54acd3fa-9208-450c-9a6a-4bae6962c325-kube-api-access-qlw4t\") pod \"certified-operators-8wdhk\" (UID: \"54acd3fa-9208-450c-9a6a-4bae6962c325\") " pod="openshift-marketplace/certified-operators-8wdhk" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.316447 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8wdhk" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.334858 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.335046 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/054cc455-fa72-4e63-a082-59b56822991e-utilities\") pod \"community-operators-qdh6w\" (UID: \"054cc455-fa72-4e63-a082-59b56822991e\") " pod="openshift-marketplace/community-operators-qdh6w" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.335078 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv68b\" (UniqueName: \"kubernetes.io/projected/054cc455-fa72-4e63-a082-59b56822991e-kube-api-access-sv68b\") pod \"community-operators-qdh6w\" (UID: \"054cc455-fa72-4e63-a082-59b56822991e\") " pod="openshift-marketplace/community-operators-qdh6w" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.335102 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/054cc455-fa72-4e63-a082-59b56822991e-catalog-content\") pod \"community-operators-qdh6w\" (UID: \"054cc455-fa72-4e63-a082-59b56822991e\") " pod="openshift-marketplace/community-operators-qdh6w" Oct 12 05:43:36 crc kubenswrapper[4930]: E1012 05:43:36.335248 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:36.835232905 +0000 UTC m=+149.377334670 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.374088 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gn4wn"] Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.375251 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gn4wn" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.394662 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gn4wn"] Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.436119 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.436174 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/390607e9-0185-4b5b-a006-70ecef170a3b-catalog-content\") pod \"certified-operators-gn4wn\" (UID: \"390607e9-0185-4b5b-a006-70ecef170a3b\") " pod="openshift-marketplace/certified-operators-gn4wn" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.436236 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/390607e9-0185-4b5b-a006-70ecef170a3b-utilities\") pod \"certified-operators-gn4wn\" (UID: \"390607e9-0185-4b5b-a006-70ecef170a3b\") " pod="openshift-marketplace/certified-operators-gn4wn" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.436254 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnb2h\" (UniqueName: \"kubernetes.io/projected/390607e9-0185-4b5b-a006-70ecef170a3b-kube-api-access-wnb2h\") pod \"certified-operators-gn4wn\" (UID: \"390607e9-0185-4b5b-a006-70ecef170a3b\") " pod="openshift-marketplace/certified-operators-gn4wn" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.436278 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/054cc455-fa72-4e63-a082-59b56822991e-utilities\") pod \"community-operators-qdh6w\" (UID: \"054cc455-fa72-4e63-a082-59b56822991e\") " pod="openshift-marketplace/community-operators-qdh6w" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.436317 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv68b\" (UniqueName: \"kubernetes.io/projected/054cc455-fa72-4e63-a082-59b56822991e-kube-api-access-sv68b\") pod \"community-operators-qdh6w\" (UID: \"054cc455-fa72-4e63-a082-59b56822991e\") " pod="openshift-marketplace/community-operators-qdh6w" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.436342 4930 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/054cc455-fa72-4e63-a082-59b56822991e-catalog-content\") pod \"community-operators-qdh6w\" (UID: \"054cc455-fa72-4e63-a082-59b56822991e\") " pod="openshift-marketplace/community-operators-qdh6w" Oct 12 05:43:36 crc kubenswrapper[4930]: E1012 05:43:36.436546 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:36.936530082 +0000 UTC m=+149.478631847 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.437105 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/054cc455-fa72-4e63-a082-59b56822991e-catalog-content\") pod \"community-operators-qdh6w\" (UID: \"054cc455-fa72-4e63-a082-59b56822991e\") " pod="openshift-marketplace/community-operators-qdh6w" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.437170 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/054cc455-fa72-4e63-a082-59b56822991e-utilities\") pod \"community-operators-qdh6w\" (UID: \"054cc455-fa72-4e63-a082-59b56822991e\") " pod="openshift-marketplace/community-operators-qdh6w" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.471702 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv68b\" (UniqueName: \"kubernetes.io/projected/054cc455-fa72-4e63-a082-59b56822991e-kube-api-access-sv68b\") pod \"community-operators-qdh6w\" (UID: \"054cc455-fa72-4e63-a082-59b56822991e\") " pod="openshift-marketplace/community-operators-qdh6w" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.487105 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.526622 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qdh6w" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.538010 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.538461 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/390607e9-0185-4b5b-a006-70ecef170a3b-utilities\") pod \"certified-operators-gn4wn\" (UID: \"390607e9-0185-4b5b-a006-70ecef170a3b\") " pod="openshift-marketplace/certified-operators-gn4wn" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.538483 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnb2h\" (UniqueName: \"kubernetes.io/projected/390607e9-0185-4b5b-a006-70ecef170a3b-kube-api-access-wnb2h\") pod \"certified-operators-gn4wn\" (UID: \"390607e9-0185-4b5b-a006-70ecef170a3b\") " pod="openshift-marketplace/certified-operators-gn4wn" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.538556 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/390607e9-0185-4b5b-a006-70ecef170a3b-catalog-content\") pod \"certified-operators-gn4wn\" (UID: \"390607e9-0185-4b5b-a006-70ecef170a3b\") " pod="openshift-marketplace/certified-operators-gn4wn" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.538979 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/390607e9-0185-4b5b-a006-70ecef170a3b-catalog-content\") pod \"certified-operators-gn4wn\" (UID: \"390607e9-0185-4b5b-a006-70ecef170a3b\") " pod="openshift-marketplace/certified-operators-gn4wn" Oct 12 05:43:36 crc kubenswrapper[4930]: E1012 05:43:36.539050 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:37.039035703 +0000 UTC m=+149.581137468 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.539235 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/390607e9-0185-4b5b-a006-70ecef170a3b-utilities\") pod \"certified-operators-gn4wn\" (UID: \"390607e9-0185-4b5b-a006-70ecef170a3b\") " pod="openshift-marketplace/certified-operators-gn4wn" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.583559 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnb2h\" (UniqueName: \"kubernetes.io/projected/390607e9-0185-4b5b-a006-70ecef170a3b-kube-api-access-wnb2h\") pod \"certified-operators-gn4wn\" (UID: \"390607e9-0185-4b5b-a006-70ecef170a3b\") " pod="openshift-marketplace/certified-operators-gn4wn" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.639619 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:36 crc kubenswrapper[4930]: E1012 05:43:36.640337 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:37.140322121 +0000 UTC m=+149.682423886 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.691090 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gn4wn" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.740542 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:36 crc kubenswrapper[4930]: E1012 05:43:36.740886 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:37.240869407 +0000 UTC m=+149.782971172 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.781986 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-25pm7" event={"ID":"c393fc2e-599b-4842-9c6c-b46cef38f7e6","Type":"ContainerStarted","Data":"b5f73442bda8bb831e0323b13b9a6de968626fa962d67147bc702a6842cb4a03"} Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.782777 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-25pm7" event={"ID":"c393fc2e-599b-4842-9c6c-b46cef38f7e6","Type":"ContainerStarted","Data":"b797b793fe841aa895c1af8a048edca23c4c0f3d0aa4f07cb28c57397ee07811"} Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.784064 4930 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bs29k container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.784114 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bs29k" podUID="a51e2785-1c3d-4354-a071-dadb05075c68" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.834279 4930 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.847180 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:36 crc kubenswrapper[4930]: E1012 05:43:36.847568 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:37.347551366 +0000 UTC m=+149.889653131 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.906015 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lmr9n" Oct 12 05:43:36 crc kubenswrapper[4930]: I1012 05:43:36.968488 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:36 crc kubenswrapper[4930]: E1012 05:43:36.968887 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:37.468862458 +0000 UTC m=+150.010964213 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.072403 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:37 crc kubenswrapper[4930]: E1012 05:43:37.072982 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:37.572957915 +0000 UTC m=+150.115059680 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.175187 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:37 crc kubenswrapper[4930]: E1012 05:43:37.175797 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-12 05:43:37.675763185 +0000 UTC m=+150.217864950 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.183840 4930 patch_prober.go:28] interesting pod/router-default-5444994796-mdwcw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 05:43:37 crc kubenswrapper[4930]: [-]has-synced failed: reason withheld Oct 12 05:43:37 crc kubenswrapper[4930]: [+]process-running ok Oct 12 05:43:37 crc kubenswrapper[4930]: healthz check failed Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.183921 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mdwcw" podUID="93aad091-5fd5-4eb5-a123-b7932dc268fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.224054 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xvjv6"] Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.279091 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:37 crc kubenswrapper[4930]: E1012 05:43:37.279396 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-12 05:43:37.779385158 +0000 UTC m=+150.321486923 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bzqf2" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.328135 4930 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-12T05:43:36.834306713Z","Handler":null,"Name":""} Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.364951 4930 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.365030 4930 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.381953 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.388942 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.430374 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8wdhk"] Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.482902 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.486673 4930 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
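Annotation (not part of the captured journal): the records immediately above show why the long retry loop finally ends. The kubelet's plugin watcher picks up /var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock, RegisterPlugin runs, and csi_plugin.go validates and registers kubevirt.io.hostpath-provisioner at /var/lib/kubelet/plugins/csi-hostpath/csi.sock; only after that can the pending UnmountVolume.TearDown and MountVolume.MountDevice operations stop failing with "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers". As a minimal illustrative sketch only — assuming the public k8s.io/kubelet/pkg/apis/pluginregistration/v1 API, and approximating what a registrar sidecar such as node-driver-registrar serves on that -reg.sock rather than the actual hostpath-provisioner source — the registration handshake looks roughly like:

    // Illustrative sketch, not taken from this log's components: a CSI node
    // plugin (or its registrar sidecar) serving the kubelet plugin-registration
    // gRPC service on the *-reg.sock path under /var/lib/kubelet/plugins_registry/.
    // Identifiers come from k8s.io/kubelet/pkg/apis/pluginregistration/v1;
    // socket paths, driver name, and version are copied from the log above.
    package main

    import (
        "context"
        "log"
        "net"
        "os"

        "google.golang.org/grpc"
        registerapi "k8s.io/kubelet/pkg/apis/pluginregistration/v1"
    )

    type registrationServer struct {
        driverName string   // CSI driver name the kubelet will register
        endpoint   string   // CSI socket the kubelet dials for CSI RPCs
        versions   []string // supported versions, e.g. "1.0.0" as logged
    }

    // GetInfo is called by the kubelet's plugin watcher after it sees the
    // registration socket ("Adding socket path ... to desired state cache").
    func (s registrationServer) GetInfo(ctx context.Context, req *registerapi.InfoRequest) (*registerapi.PluginInfo, error) {
        return &registerapi.PluginInfo{
            Type:              registerapi.CSIPlugin,
            Name:              s.driverName,
            Endpoint:          s.endpoint,
            SupportedVersions: s.versions,
        }, nil
    }

    // NotifyRegistrationStatus is the kubelet's callback with the validation
    // result (the "Trying to validate a new CSI Driver" step in csi_plugin.go).
    func (s registrationServer) NotifyRegistrationStatus(ctx context.Context, status *registerapi.RegistrationStatus) (*registerapi.RegistrationStatusResponse, error) {
        if !status.PluginRegistered {
            log.Fatalf("registration rejected by kubelet: %s", status.Error)
        }
        return &registerapi.RegistrationStatusResponse{}, nil
    }

    func main() {
        sock := "/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
        _ = os.Remove(sock) // clean up a stale socket from a previous run
        lis, err := net.Listen("unix", sock)
        if err != nil {
            log.Fatal(err)
        }
        srv := grpc.NewServer()
        registerapi.RegisterRegistrationServer(srv, registrationServer{
            driverName: "kubevirt.io.hostpath-provisioner",
            endpoint:   "/var/lib/kubelet/plugins/csi-hostpath/csi.sock",
            versions:   []string{"1.0.0"},
        })
        log.Fatal(srv.Serve(lis))
    }

Once the kubelet has called GetInfo over the registration socket and confirmed success via NotifyRegistrationStatus, the 500ms durationBeforeRetry backoff visible throughout this capture pays off: the next reconciler pass mounts the volume, as the records below show.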
Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.486708 4930 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.509032 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gn4wn"] Oct 12 05:43:37 crc kubenswrapper[4930]: W1012 05:43:37.527041 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod390607e9_0185_4b5b_a006_70ecef170a3b.slice/crio-0ec402df5720f7a73cdc63bb337f7b99865b85aab0b100bddf4bfd1efd007c88 WatchSource:0}: Error finding container 0ec402df5720f7a73cdc63bb337f7b99865b85aab0b100bddf4bfd1efd007c88: Status 404 returned error can't find the container with id 0ec402df5720f7a73cdc63bb337f7b99865b85aab0b100bddf4bfd1efd007c88 Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.658254 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qdh6w"] Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.772989 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9vqgr"] Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.773922 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vqgr" Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.777365 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.794630 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bzqf2\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.794908 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvjv6" event={"ID":"deece869-6cf9-4922-b75a-294c828c6e9e","Type":"ContainerStarted","Data":"33c422288a82d81935f37b170503a392d369f96e1973511e6b38e0dbc97c9b56"} Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.801604 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn4wn" event={"ID":"390607e9-0185-4b5b-a006-70ecef170a3b","Type":"ContainerStarted","Data":"0ec402df5720f7a73cdc63bb337f7b99865b85aab0b100bddf4bfd1efd007c88"} Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.802930 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7de1ad868d3292566145d799e8dba37c366fcce07c7d672b0b5b5f7dd2debf45"} Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.806917 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vqgr"] Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.847851 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdh6w" event={"ID":"054cc455-fa72-4e63-a082-59b56822991e","Type":"ContainerStarted","Data":"703dcb896f950d71f69903412a19fde91cff53eee36b0ca61b388f756bb0e486"} Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.861056 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"22386ecb9b3a21292b8779f95991dd9ac311812e89a41320a82c2eacb428d0ce"} Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.870172 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8wdhk" event={"ID":"54acd3fa-9208-450c-9a6a-4bae6962c325","Type":"ContainerStarted","Data":"5b728964c5334400aea56ec7a0b4d636e7d35f0d1c9cd1e881637bc40f301de7"} Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.892332 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"59892bf4fdb35b2247d6d6190ac95913ea3bf2530f4b28047feb5ffe45742d88"} Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.892378 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c461b808ccd1ea08f331c2ad038491c6c5b29c25ddbae3054e34cc62380c41f2"} Oct 12 05:43:37 
crc kubenswrapper[4930]: I1012 05:43:37.893000 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.895403 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e085d670-8038-4828-acb6-dddc74a33655-utilities\") pod \"redhat-marketplace-9vqgr\" (UID: \"e085d670-8038-4828-acb6-dddc74a33655\") " pod="openshift-marketplace/redhat-marketplace-9vqgr" Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.895443 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e085d670-8038-4828-acb6-dddc74a33655-catalog-content\") pod \"redhat-marketplace-9vqgr\" (UID: \"e085d670-8038-4828-acb6-dddc74a33655\") " pod="openshift-marketplace/redhat-marketplace-9vqgr" Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.895469 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq7rv\" (UniqueName: \"kubernetes.io/projected/e085d670-8038-4828-acb6-dddc74a33655-kube-api-access-kq7rv\") pod \"redhat-marketplace-9vqgr\" (UID: \"e085d670-8038-4828-acb6-dddc74a33655\") " pod="openshift-marketplace/redhat-marketplace-9vqgr" Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.998091 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e085d670-8038-4828-acb6-dddc74a33655-utilities\") pod \"redhat-marketplace-9vqgr\" (UID: \"e085d670-8038-4828-acb6-dddc74a33655\") " pod="openshift-marketplace/redhat-marketplace-9vqgr" Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.998215 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e085d670-8038-4828-acb6-dddc74a33655-catalog-content\") pod \"redhat-marketplace-9vqgr\" (UID: \"e085d670-8038-4828-acb6-dddc74a33655\") " pod="openshift-marketplace/redhat-marketplace-9vqgr" Oct 12 05:43:37 crc kubenswrapper[4930]: I1012 05:43:37.998338 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq7rv\" (UniqueName: \"kubernetes.io/projected/e085d670-8038-4828-acb6-dddc74a33655-kube-api-access-kq7rv\") pod \"redhat-marketplace-9vqgr\" (UID: \"e085d670-8038-4828-acb6-dddc74a33655\") " pod="openshift-marketplace/redhat-marketplace-9vqgr" Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.001432 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e085d670-8038-4828-acb6-dddc74a33655-utilities\") pod \"redhat-marketplace-9vqgr\" (UID: \"e085d670-8038-4828-acb6-dddc74a33655\") " pod="openshift-marketplace/redhat-marketplace-9vqgr" Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.001516 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.003618 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e085d670-8038-4828-acb6-dddc74a33655-catalog-content\") pod \"redhat-marketplace-9vqgr\" (UID: \"e085d670-8038-4828-acb6-dddc74a33655\") " pod="openshift-marketplace/redhat-marketplace-9vqgr" Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.025537 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq7rv\" (UniqueName: \"kubernetes.io/projected/e085d670-8038-4828-acb6-dddc74a33655-kube-api-access-kq7rv\") pod \"redhat-marketplace-9vqgr\" (UID: \"e085d670-8038-4828-acb6-dddc74a33655\") " pod="openshift-marketplace/redhat-marketplace-9vqgr" Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.105114 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vqgr" Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.145861 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.168230 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8fq8j"] Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.169869 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8fq8j" Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.174401 4930 patch_prober.go:28] interesting pod/router-default-5444994796-mdwcw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 05:43:38 crc kubenswrapper[4930]: [-]has-synced failed: reason withheld Oct 12 05:43:38 crc kubenswrapper[4930]: [+]process-running ok Oct 12 05:43:38 crc kubenswrapper[4930]: healthz check failed Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.174470 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mdwcw" podUID="93aad091-5fd5-4eb5-a123-b7932dc268fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.180337 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8fq8j"] Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.305258 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4l8z\" (UniqueName: \"kubernetes.io/projected/3086651e-154b-4f50-98fd-727d6ede2e3c-kube-api-access-h4l8z\") pod \"redhat-marketplace-8fq8j\" (UID: \"3086651e-154b-4f50-98fd-727d6ede2e3c\") " pod="openshift-marketplace/redhat-marketplace-8fq8j" Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.305567 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3086651e-154b-4f50-98fd-727d6ede2e3c-catalog-content\") pod \"redhat-marketplace-8fq8j\" (UID: \"3086651e-154b-4f50-98fd-727d6ede2e3c\") " pod="openshift-marketplace/redhat-marketplace-8fq8j" Oct 12 05:43:38 crc 
kubenswrapper[4930]: I1012 05:43:38.305592 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3086651e-154b-4f50-98fd-727d6ede2e3c-utilities\") pod \"redhat-marketplace-8fq8j\" (UID: \"3086651e-154b-4f50-98fd-727d6ede2e3c\") " pod="openshift-marketplace/redhat-marketplace-8fq8j" Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.343232 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bzqf2"] Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.395614 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vqgr"] Oct 12 05:43:38 crc kubenswrapper[4930]: W1012 05:43:38.403305 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode085d670_8038_4828_acb6_dddc74a33655.slice/crio-b851fac227366e817d3ca6132d22cf2d8c6722bcee4dbe33753141d59f1c645c WatchSource:0}: Error finding container b851fac227366e817d3ca6132d22cf2d8c6722bcee4dbe33753141d59f1c645c: Status 404 returned error can't find the container with id b851fac227366e817d3ca6132d22cf2d8c6722bcee4dbe33753141d59f1c645c Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.406242 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3086651e-154b-4f50-98fd-727d6ede2e3c-catalog-content\") pod \"redhat-marketplace-8fq8j\" (UID: \"3086651e-154b-4f50-98fd-727d6ede2e3c\") " pod="openshift-marketplace/redhat-marketplace-8fq8j" Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.406285 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3086651e-154b-4f50-98fd-727d6ede2e3c-utilities\") pod \"redhat-marketplace-8fq8j\" (UID: \"3086651e-154b-4f50-98fd-727d6ede2e3c\") " pod="openshift-marketplace/redhat-marketplace-8fq8j" Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.406356 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4l8z\" (UniqueName: \"kubernetes.io/projected/3086651e-154b-4f50-98fd-727d6ede2e3c-kube-api-access-h4l8z\") pod \"redhat-marketplace-8fq8j\" (UID: \"3086651e-154b-4f50-98fd-727d6ede2e3c\") " pod="openshift-marketplace/redhat-marketplace-8fq8j" Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.406664 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3086651e-154b-4f50-98fd-727d6ede2e3c-catalog-content\") pod \"redhat-marketplace-8fq8j\" (UID: \"3086651e-154b-4f50-98fd-727d6ede2e3c\") " pod="openshift-marketplace/redhat-marketplace-8fq8j" Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.406922 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3086651e-154b-4f50-98fd-727d6ede2e3c-utilities\") pod \"redhat-marketplace-8fq8j\" (UID: \"3086651e-154b-4f50-98fd-727d6ede2e3c\") " pod="openshift-marketplace/redhat-marketplace-8fq8j" Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.429900 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4l8z\" (UniqueName: \"kubernetes.io/projected/3086651e-154b-4f50-98fd-727d6ede2e3c-kube-api-access-h4l8z\") pod \"redhat-marketplace-8fq8j\" (UID: 
\"3086651e-154b-4f50-98fd-727d6ede2e3c\") " pod="openshift-marketplace/redhat-marketplace-8fq8j" Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.527120 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8fq8j" Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.746136 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8fq8j"] Oct 12 05:43:38 crc kubenswrapper[4930]: W1012 05:43:38.750970 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3086651e_154b_4f50_98fd_727d6ede2e3c.slice/crio-d97144446470ae2ff9acf3bcd946c435152c4ddd2978f788b235b2696176d321 WatchSource:0}: Error finding container d97144446470ae2ff9acf3bcd946c435152c4ddd2978f788b235b2696176d321: Status 404 returned error can't find the container with id d97144446470ae2ff9acf3bcd946c435152c4ddd2978f788b235b2696176d321 Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.900822 4930 generic.go:334] "Generic (PLEG): container finished" podID="e085d670-8038-4828-acb6-dddc74a33655" containerID="72b49ebebe7e590ce849e00cf7f6ea8691173ec5465245f6479fe0a8cfbec7fb" exitCode=0 Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.900940 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vqgr" event={"ID":"e085d670-8038-4828-acb6-dddc74a33655","Type":"ContainerDied","Data":"72b49ebebe7e590ce849e00cf7f6ea8691173ec5465245f6479fe0a8cfbec7fb"} Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.901037 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vqgr" event={"ID":"e085d670-8038-4828-acb6-dddc74a33655","Type":"ContainerStarted","Data":"b851fac227366e817d3ca6132d22cf2d8c6722bcee4dbe33753141d59f1c645c"} Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.910546 4930 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.917524 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ecc97e4da4b6f65ac52e9153df4f1a4840b2c9593178a831a9ed71a7e8e7e33a"} Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.919858 4930 generic.go:334] "Generic (PLEG): container finished" podID="054cc455-fa72-4e63-a082-59b56822991e" containerID="1c82b68a6a649f63e0730f80253c2724f8884617e3434c060b81ceaa4524983e" exitCode=0 Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.919946 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdh6w" event={"ID":"054cc455-fa72-4e63-a082-59b56822991e","Type":"ContainerDied","Data":"1c82b68a6a649f63e0730f80253c2724f8884617e3434c060b81ceaa4524983e"} Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.922466 4930 generic.go:334] "Generic (PLEG): container finished" podID="74be97dd-4d16-40ed-87e4-b707eccf422e" containerID="03c67c5984d37c63a32d182586fc0c79ea9235d37261509a22f89f4ee0167502" exitCode=0 Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.922559 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337450-9g4rr" 
event={"ID":"74be97dd-4d16-40ed-87e4-b707eccf422e","Type":"ContainerDied","Data":"03c67c5984d37c63a32d182586fc0c79ea9235d37261509a22f89f4ee0167502"} Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.932338 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" event={"ID":"51d2d15f-f2ac-4939-b11d-f41e5891323d","Type":"ContainerStarted","Data":"4f28a2f8f2ff69d7a0202b3453ede968740ec9493f520531fbe412d499c2b185"} Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.932384 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" event={"ID":"51d2d15f-f2ac-4939-b11d-f41e5891323d","Type":"ContainerStarted","Data":"b3a22c74397ec2a625417b3a4b08711d8bac8474461f5096ca2c969fee5f15c8"} Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.932560 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.945298 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-25pm7" event={"ID":"c393fc2e-599b-4842-9c6c-b46cef38f7e6","Type":"ContainerStarted","Data":"f46c001d4c81c93e84420902ece0372e10207259878ee8f49a8cefdda6145648"} Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.955208 4930 generic.go:334] "Generic (PLEG): container finished" podID="deece869-6cf9-4922-b75a-294c828c6e9e" containerID="4f47e304d8de582c36f6ade26a50f3e06bafa72cab73632b8733254447f8d097" exitCode=0 Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.955361 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvjv6" event={"ID":"deece869-6cf9-4922-b75a-294c828c6e9e","Type":"ContainerDied","Data":"4f47e304d8de582c36f6ade26a50f3e06bafa72cab73632b8733254447f8d097"} Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.957531 4930 generic.go:334] "Generic (PLEG): container finished" podID="390607e9-0185-4b5b-a006-70ecef170a3b" containerID="a4dd2cdca82f4ae4808c68879a86ea0d4a3376d6418923ab4c894d8eb923ebe3" exitCode=0 Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.958076 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn4wn" event={"ID":"390607e9-0185-4b5b-a006-70ecef170a3b","Type":"ContainerDied","Data":"a4dd2cdca82f4ae4808c68879a86ea0d4a3376d6418923ab4c894d8eb923ebe3"} Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.961471 4930 generic.go:334] "Generic (PLEG): container finished" podID="54acd3fa-9208-450c-9a6a-4bae6962c325" containerID="1b864609006f936003f95f2bb11e3b4761f05c3be5574089377f3429b3e36369" exitCode=0 Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.961528 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8wdhk" event={"ID":"54acd3fa-9208-450c-9a6a-4bae6962c325","Type":"ContainerDied","Data":"1b864609006f936003f95f2bb11e3b4761f05c3be5574089377f3429b3e36369"} Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.970398 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8fq8j" event={"ID":"3086651e-154b-4f50-98fd-727d6ede2e3c","Type":"ContainerStarted","Data":"d97144446470ae2ff9acf3bcd946c435152c4ddd2978f788b235b2696176d321"} Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.977754 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8864bb88b080aff87e84ece93f85281b04914576e42c0d9f6f19d3ff8661f1c2"} Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.983237 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xxb6v"] Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.984502 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xxb6v" Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.987255 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 12 05:43:38 crc kubenswrapper[4930]: I1012 05:43:38.998687 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xxb6v"] Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.054754 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" podStartSLOduration=130.054706608 podStartE2EDuration="2m10.054706608s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:39.045985372 +0000 UTC m=+151.588087137" watchObservedRunningTime="2025-10-12 05:43:39.054706608 +0000 UTC m=+151.596808373" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.116457 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlfh7\" (UniqueName: \"kubernetes.io/projected/abe3804c-7b56-43f4-be75-206e80471232-kube-api-access-wlfh7\") pod \"redhat-operators-xxb6v\" (UID: \"abe3804c-7b56-43f4-be75-206e80471232\") " pod="openshift-marketplace/redhat-operators-xxb6v" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.117065 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abe3804c-7b56-43f4-be75-206e80471232-utilities\") pod \"redhat-operators-xxb6v\" (UID: \"abe3804c-7b56-43f4-be75-206e80471232\") " pod="openshift-marketplace/redhat-operators-xxb6v" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.117189 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abe3804c-7b56-43f4-be75-206e80471232-catalog-content\") pod \"redhat-operators-xxb6v\" (UID: \"abe3804c-7b56-43f4-be75-206e80471232\") " pod="openshift-marketplace/redhat-operators-xxb6v" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.118210 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-25pm7" podStartSLOduration=12.118191809 podStartE2EDuration="12.118191809s" podCreationTimestamp="2025-10-12 05:43:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:39.116563653 +0000 UTC m=+151.658665428" watchObservedRunningTime="2025-10-12 05:43:39.118191809 +0000 UTC m=+151.660293574" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.177367 4930 patch_prober.go:28] interesting pod/router-default-5444994796-mdwcw container/router namespace/openshift-ingress: Startup probe status=failure 
output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 05:43:39 crc kubenswrapper[4930]: [-]has-synced failed: reason withheld Oct 12 05:43:39 crc kubenswrapper[4930]: [+]process-running ok Oct 12 05:43:39 crc kubenswrapper[4930]: healthz check failed Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.177464 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mdwcw" podUID="93aad091-5fd5-4eb5-a123-b7932dc268fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.219192 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abe3804c-7b56-43f4-be75-206e80471232-utilities\") pod \"redhat-operators-xxb6v\" (UID: \"abe3804c-7b56-43f4-be75-206e80471232\") " pod="openshift-marketplace/redhat-operators-xxb6v" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.219241 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abe3804c-7b56-43f4-be75-206e80471232-catalog-content\") pod \"redhat-operators-xxb6v\" (UID: \"abe3804c-7b56-43f4-be75-206e80471232\") " pod="openshift-marketplace/redhat-operators-xxb6v" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.219273 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlfh7\" (UniqueName: \"kubernetes.io/projected/abe3804c-7b56-43f4-be75-206e80471232-kube-api-access-wlfh7\") pod \"redhat-operators-xxb6v\" (UID: \"abe3804c-7b56-43f4-be75-206e80471232\") " pod="openshift-marketplace/redhat-operators-xxb6v" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.220057 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abe3804c-7b56-43f4-be75-206e80471232-utilities\") pod \"redhat-operators-xxb6v\" (UID: \"abe3804c-7b56-43f4-be75-206e80471232\") " pod="openshift-marketplace/redhat-operators-xxb6v" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.220110 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abe3804c-7b56-43f4-be75-206e80471232-catalog-content\") pod \"redhat-operators-xxb6v\" (UID: \"abe3804c-7b56-43f4-be75-206e80471232\") " pod="openshift-marketplace/redhat-operators-xxb6v" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.252562 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlfh7\" (UniqueName: \"kubernetes.io/projected/abe3804c-7b56-43f4-be75-206e80471232-kube-api-access-wlfh7\") pod \"redhat-operators-xxb6v\" (UID: \"abe3804c-7b56-43f4-be75-206e80471232\") " pod="openshift-marketplace/redhat-operators-xxb6v" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.348390 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xxb6v" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.372495 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-95997"] Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.373830 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-95997" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.394929 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-95997"] Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.422103 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e303adcd-2f97-4c4d-8bb4-77f7263ab93b-catalog-content\") pod \"redhat-operators-95997\" (UID: \"e303adcd-2f97-4c4d-8bb4-77f7263ab93b\") " pod="openshift-marketplace/redhat-operators-95997" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.422163 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e303adcd-2f97-4c4d-8bb4-77f7263ab93b-utilities\") pod \"redhat-operators-95997\" (UID: \"e303adcd-2f97-4c4d-8bb4-77f7263ab93b\") " pod="openshift-marketplace/redhat-operators-95997" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.422448 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xbnj\" (UniqueName: \"kubernetes.io/projected/e303adcd-2f97-4c4d-8bb4-77f7263ab93b-kube-api-access-7xbnj\") pod \"redhat-operators-95997\" (UID: \"e303adcd-2f97-4c4d-8bb4-77f7263ab93b\") " pod="openshift-marketplace/redhat-operators-95997" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.486994 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.488325 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.493427 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.499354 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.499617 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.524427 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xbnj\" (UniqueName: \"kubernetes.io/projected/e303adcd-2f97-4c4d-8bb4-77f7263ab93b-kube-api-access-7xbnj\") pod \"redhat-operators-95997\" (UID: \"e303adcd-2f97-4c4d-8bb4-77f7263ab93b\") " pod="openshift-marketplace/redhat-operators-95997" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.524508 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/997f2f95-2109-480c-833a-f9d751ff059b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"997f2f95-2109-480c-833a-f9d751ff059b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.524547 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/997f2f95-2109-480c-833a-f9d751ff059b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"997f2f95-2109-480c-833a-f9d751ff059b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.524576 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e303adcd-2f97-4c4d-8bb4-77f7263ab93b-catalog-content\") pod \"redhat-operators-95997\" (UID: \"e303adcd-2f97-4c4d-8bb4-77f7263ab93b\") " pod="openshift-marketplace/redhat-operators-95997" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.524655 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e303adcd-2f97-4c4d-8bb4-77f7263ab93b-utilities\") pod \"redhat-operators-95997\" (UID: \"e303adcd-2f97-4c4d-8bb4-77f7263ab93b\") " pod="openshift-marketplace/redhat-operators-95997" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.525217 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e303adcd-2f97-4c4d-8bb4-77f7263ab93b-catalog-content\") pod \"redhat-operators-95997\" (UID: \"e303adcd-2f97-4c4d-8bb4-77f7263ab93b\") " pod="openshift-marketplace/redhat-operators-95997" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.525221 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e303adcd-2f97-4c4d-8bb4-77f7263ab93b-utilities\") pod \"redhat-operators-95997\" (UID: \"e303adcd-2f97-4c4d-8bb4-77f7263ab93b\") " pod="openshift-marketplace/redhat-operators-95997" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.550475 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xbnj\" (UniqueName: \"kubernetes.io/projected/e303adcd-2f97-4c4d-8bb4-77f7263ab93b-kube-api-access-7xbnj\") pod \"redhat-operators-95997\" (UID: \"e303adcd-2f97-4c4d-8bb4-77f7263ab93b\") " pod="openshift-marketplace/redhat-operators-95997" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.626173 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/997f2f95-2109-480c-833a-f9d751ff059b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"997f2f95-2109-480c-833a-f9d751ff059b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.626257 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/997f2f95-2109-480c-833a-f9d751ff059b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"997f2f95-2109-480c-833a-f9d751ff059b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.626348 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/997f2f95-2109-480c-833a-f9d751ff059b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"997f2f95-2109-480c-833a-f9d751ff059b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.643672 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/997f2f95-2109-480c-833a-f9d751ff059b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"997f2f95-2109-480c-833a-f9d751ff059b\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.739780 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-95997" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.763954 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xxb6v"] Oct 12 05:43:39 crc kubenswrapper[4930]: W1012 05:43:39.789383 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabe3804c_7b56_43f4_be75_206e80471232.slice/crio-ae1c8b94b1a8bb350f33a0166cdd82162b5af09667ef055e07b09d45ea10485a WatchSource:0}: Error finding container ae1c8b94b1a8bb350f33a0166cdd82162b5af09667ef055e07b09d45ea10485a: Status 404 returned error can't find the container with id ae1c8b94b1a8bb350f33a0166cdd82162b5af09667ef055e07b09d45ea10485a Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.817063 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.986718 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxb6v" event={"ID":"abe3804c-7b56-43f4-be75-206e80471232","Type":"ContainerStarted","Data":"ae1c8b94b1a8bb350f33a0166cdd82162b5af09667ef055e07b09d45ea10485a"} Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.989241 4930 generic.go:334] "Generic (PLEG): container finished" podID="3086651e-154b-4f50-98fd-727d6ede2e3c" containerID="a08834a16cc8ae033cfda2c844e730f4322851a7113cd6512e6b538c7acbf78f" exitCode=0 Oct 12 05:43:39 crc kubenswrapper[4930]: I1012 05:43:39.989352 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8fq8j" event={"ID":"3086651e-154b-4f50-98fd-727d6ede2e3c","Type":"ContainerDied","Data":"a08834a16cc8ae033cfda2c844e730f4322851a7113cd6512e6b538c7acbf78f"} Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.006707 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.028711 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-6gz2l" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.030661 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-95997"] Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.072242 4930 patch_prober.go:28] interesting pod/downloads-7954f5f757-ztn69 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.072293 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ztn69" podUID="6c179d5f-d7dd-42b9-b248-cd3c34237961" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.072599 4930 patch_prober.go:28] interesting pod/downloads-7954f5f757-ztn69 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get 
\"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.072620 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-ztn69" podUID="6c179d5f-d7dd-42b9-b248-cd3c34237961" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.175706 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-mdwcw" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.196218 4930 patch_prober.go:28] interesting pod/router-default-5444994796-mdwcw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 05:43:40 crc kubenswrapper[4930]: [-]has-synced failed: reason withheld Oct 12 05:43:40 crc kubenswrapper[4930]: [+]process-running ok Oct 12 05:43:40 crc kubenswrapper[4930]: healthz check failed Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.196268 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mdwcw" podUID="93aad091-5fd5-4eb5-a123-b7932dc268fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.226341 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.227056 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.231476 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.235102 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.235362 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.243111 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.243224 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.344141 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.344191 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.344248 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.365464 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.401306 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.401904 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.406555 4930 patch_prober.go:28] interesting pod/console-f9d7485db-b6q56 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.31:8443/health\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.406703 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-b6q56" podUID="eb977a38-ef3b-4820-a364-ad16d6c857d5" containerName="console" probeResult="failure" output="Get \"https://10.217.0.31:8443/health\": dial tcp 10.217.0.31:8443: connect: connection refused" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.407683 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.453031 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.458351 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.467441 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.549242 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337450-9g4rr" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.558677 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.648331 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74be97dd-4d16-40ed-87e4-b707eccf422e-config-volume\") pod \"74be97dd-4d16-40ed-87e4-b707eccf422e\" (UID: \"74be97dd-4d16-40ed-87e4-b707eccf422e\") " Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.648681 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9nz6\" (UniqueName: \"kubernetes.io/projected/74be97dd-4d16-40ed-87e4-b707eccf422e-kube-api-access-j9nz6\") pod \"74be97dd-4d16-40ed-87e4-b707eccf422e\" (UID: \"74be97dd-4d16-40ed-87e4-b707eccf422e\") " Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.648764 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74be97dd-4d16-40ed-87e4-b707eccf422e-secret-volume\") pod \"74be97dd-4d16-40ed-87e4-b707eccf422e\" (UID: \"74be97dd-4d16-40ed-87e4-b707eccf422e\") " Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.650118 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74be97dd-4d16-40ed-87e4-b707eccf422e-config-volume" (OuterVolumeSpecName: "config-volume") pod "74be97dd-4d16-40ed-87e4-b707eccf422e" (UID: "74be97dd-4d16-40ed-87e4-b707eccf422e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.654213 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74be97dd-4d16-40ed-87e4-b707eccf422e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "74be97dd-4d16-40ed-87e4-b707eccf422e" (UID: "74be97dd-4d16-40ed-87e4-b707eccf422e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.654623 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74be97dd-4d16-40ed-87e4-b707eccf422e-kube-api-access-j9nz6" (OuterVolumeSpecName: "kube-api-access-j9nz6") pod "74be97dd-4d16-40ed-87e4-b707eccf422e" (UID: "74be97dd-4d16-40ed-87e4-b707eccf422e"). InnerVolumeSpecName "kube-api-access-j9nz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.750201 4930 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74be97dd-4d16-40ed-87e4-b707eccf422e-config-volume\") on node \"crc\" DevicePath \"\"" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.750236 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9nz6\" (UniqueName: \"kubernetes.io/projected/74be97dd-4d16-40ed-87e4-b707eccf422e-kube-api-access-j9nz6\") on node \"crc\" DevicePath \"\"" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.750249 4930 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74be97dd-4d16-40ed-87e4-b707eccf422e-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 12 05:43:40 crc kubenswrapper[4930]: I1012 05:43:40.760578 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 12 05:43:41 crc kubenswrapper[4930]: I1012 05:43:40.999837 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-bs29k" Oct 12 05:43:41 crc kubenswrapper[4930]: I1012 05:43:41.061694 4930 generic.go:334] "Generic (PLEG): container finished" podID="e303adcd-2f97-4c4d-8bb4-77f7263ab93b" containerID="03484064af533ba8e6f6d19a0745fdb506c14203d412f6ca31e821eb26537be5" exitCode=0 Oct 12 05:43:41 crc kubenswrapper[4930]: I1012 05:43:41.061800 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95997" event={"ID":"e303adcd-2f97-4c4d-8bb4-77f7263ab93b","Type":"ContainerDied","Data":"03484064af533ba8e6f6d19a0745fdb506c14203d412f6ca31e821eb26537be5"} Oct 12 05:43:41 crc kubenswrapper[4930]: I1012 05:43:41.061832 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95997" event={"ID":"e303adcd-2f97-4c4d-8bb4-77f7263ab93b","Type":"ContainerStarted","Data":"a8017996338fe3a6e59214fea1125050d5a1eef415769b56bd1d482f7164b1de"} Oct 12 05:43:41 crc kubenswrapper[4930]: I1012 05:43:41.071397 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337450-9g4rr" event={"ID":"74be97dd-4d16-40ed-87e4-b707eccf422e","Type":"ContainerDied","Data":"3efd8d4f1fad50e702a8d8ac66b0f7c78a206680d5761549f29cffa73304605e"} Oct 12 05:43:41 crc kubenswrapper[4930]: I1012 05:43:41.071432 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337450-9g4rr" Oct 12 05:43:41 crc kubenswrapper[4930]: I1012 05:43:41.071451 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3efd8d4f1fad50e702a8d8ac66b0f7c78a206680d5761549f29cffa73304605e" Oct 12 05:43:41 crc kubenswrapper[4930]: I1012 05:43:41.080003 4930 generic.go:334] "Generic (PLEG): container finished" podID="abe3804c-7b56-43f4-be75-206e80471232" containerID="a63e26652077fa08b432f6cf4b30dde293ecdb2a2f7f48859e8134568510ac13" exitCode=0 Oct 12 05:43:41 crc kubenswrapper[4930]: I1012 05:43:41.080067 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxb6v" event={"ID":"abe3804c-7b56-43f4-be75-206e80471232","Type":"ContainerDied","Data":"a63e26652077fa08b432f6cf4b30dde293ecdb2a2f7f48859e8134568510ac13"} Oct 12 05:43:41 crc kubenswrapper[4930]: I1012 05:43:41.081775 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"997f2f95-2109-480c-833a-f9d751ff059b","Type":"ContainerStarted","Data":"5824072331f5e0e4ef4edafa5e65d125481483b196a2d491ade50297bf314d55"} Oct 12 05:43:41 crc kubenswrapper[4930]: I1012 05:43:41.090420 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed","Type":"ContainerStarted","Data":"418a79133199338e6239702cf2d73289021c491e5eb47ad4713efa0860859d35"} Oct 12 05:43:41 crc kubenswrapper[4930]: I1012 05:43:41.100184 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wsqp8" Oct 12 05:43:41 crc kubenswrapper[4930]: I1012 05:43:41.192577 4930 patch_prober.go:28] interesting pod/router-default-5444994796-mdwcw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 05:43:41 crc kubenswrapper[4930]: [-]has-synced failed: reason withheld Oct 12 05:43:41 crc kubenswrapper[4930]: [+]process-running ok Oct 12 05:43:41 crc kubenswrapper[4930]: healthz check failed Oct 12 05:43:41 crc kubenswrapper[4930]: I1012 05:43:41.192630 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mdwcw" podUID="93aad091-5fd5-4eb5-a123-b7932dc268fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 05:43:42 crc kubenswrapper[4930]: I1012 05:43:42.101169 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed","Type":"ContainerStarted","Data":"92564dbec6ac2a903ee3933dac20b31d2ea2204a1db58baff17bd8a58f16c87f"} Oct 12 05:43:42 crc kubenswrapper[4930]: I1012 05:43:42.105147 4930 generic.go:334] "Generic (PLEG): container finished" podID="997f2f95-2109-480c-833a-f9d751ff059b" containerID="1ee721e037f9c4c412edefe1fcff716379c8e306ba8317630ee24cc8cb377aab" exitCode=0 Oct 12 05:43:42 crc kubenswrapper[4930]: I1012 05:43:42.105211 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"997f2f95-2109-480c-833a-f9d751ff059b","Type":"ContainerDied","Data":"1ee721e037f9c4c412edefe1fcff716379c8e306ba8317630ee24cc8cb377aab"} Oct 12 05:43:42 crc kubenswrapper[4930]: I1012 05:43:42.155982 4930 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.15594571 podStartE2EDuration="2.15594571s" podCreationTimestamp="2025-10-12 05:43:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:43:42.148517101 +0000 UTC m=+154.690618866" watchObservedRunningTime="2025-10-12 05:43:42.15594571 +0000 UTC m=+154.698047475" Oct 12 05:43:42 crc kubenswrapper[4930]: I1012 05:43:42.172846 4930 patch_prober.go:28] interesting pod/router-default-5444994796-mdwcw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 05:43:42 crc kubenswrapper[4930]: [-]has-synced failed: reason withheld Oct 12 05:43:42 crc kubenswrapper[4930]: [+]process-running ok Oct 12 05:43:42 crc kubenswrapper[4930]: healthz check failed Oct 12 05:43:42 crc kubenswrapper[4930]: I1012 05:43:42.172894 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mdwcw" podUID="93aad091-5fd5-4eb5-a123-b7932dc268fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 05:43:42 crc kubenswrapper[4930]: I1012 05:43:42.246633 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" Oct 12 05:43:43 crc kubenswrapper[4930]: I1012 05:43:43.038165 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-g8lk2" Oct 12 05:43:43 crc kubenswrapper[4930]: I1012 05:43:43.118906 4930 generic.go:334] "Generic (PLEG): container finished" podID="bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed" containerID="92564dbec6ac2a903ee3933dac20b31d2ea2204a1db58baff17bd8a58f16c87f" exitCode=0 Oct 12 05:43:43 crc kubenswrapper[4930]: I1012 05:43:43.119054 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed","Type":"ContainerDied","Data":"92564dbec6ac2a903ee3933dac20b31d2ea2204a1db58baff17bd8a58f16c87f"} Oct 12 05:43:43 crc kubenswrapper[4930]: I1012 05:43:43.173983 4930 patch_prober.go:28] interesting pod/router-default-5444994796-mdwcw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 05:43:43 crc kubenswrapper[4930]: [-]has-synced failed: reason withheld Oct 12 05:43:43 crc kubenswrapper[4930]: [+]process-running ok Oct 12 05:43:43 crc kubenswrapper[4930]: healthz check failed Oct 12 05:43:43 crc kubenswrapper[4930]: I1012 05:43:43.174092 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mdwcw" podUID="93aad091-5fd5-4eb5-a123-b7932dc268fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 05:43:43 crc kubenswrapper[4930]: I1012 05:43:43.578083 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 05:43:43 crc kubenswrapper[4930]: I1012 05:43:43.697484 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/997f2f95-2109-480c-833a-f9d751ff059b-kubelet-dir\") pod \"997f2f95-2109-480c-833a-f9d751ff059b\" (UID: \"997f2f95-2109-480c-833a-f9d751ff059b\") " Oct 12 05:43:43 crc kubenswrapper[4930]: I1012 05:43:43.697719 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/997f2f95-2109-480c-833a-f9d751ff059b-kube-api-access\") pod \"997f2f95-2109-480c-833a-f9d751ff059b\" (UID: \"997f2f95-2109-480c-833a-f9d751ff059b\") " Oct 12 05:43:43 crc kubenswrapper[4930]: I1012 05:43:43.697587 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/997f2f95-2109-480c-833a-f9d751ff059b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "997f2f95-2109-480c-833a-f9d751ff059b" (UID: "997f2f95-2109-480c-833a-f9d751ff059b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 05:43:43 crc kubenswrapper[4930]: I1012 05:43:43.698861 4930 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/997f2f95-2109-480c-833a-f9d751ff059b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 12 05:43:43 crc kubenswrapper[4930]: I1012 05:43:43.719373 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/997f2f95-2109-480c-833a-f9d751ff059b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "997f2f95-2109-480c-833a-f9d751ff059b" (UID: "997f2f95-2109-480c-833a-f9d751ff059b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:43:43 crc kubenswrapper[4930]: I1012 05:43:43.800499 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/997f2f95-2109-480c-833a-f9d751ff059b-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 12 05:43:44 crc kubenswrapper[4930]: I1012 05:43:44.145122 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 12 05:43:44 crc kubenswrapper[4930]: I1012 05:43:44.158163 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"997f2f95-2109-480c-833a-f9d751ff059b","Type":"ContainerDied","Data":"5824072331f5e0e4ef4edafa5e65d125481483b196a2d491ade50297bf314d55"} Oct 12 05:43:44 crc kubenswrapper[4930]: I1012 05:43:44.158207 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5824072331f5e0e4ef4edafa5e65d125481483b196a2d491ade50297bf314d55" Oct 12 05:43:44 crc kubenswrapper[4930]: I1012 05:43:44.172351 4930 patch_prober.go:28] interesting pod/router-default-5444994796-mdwcw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 05:43:44 crc kubenswrapper[4930]: [-]has-synced failed: reason withheld Oct 12 05:43:44 crc kubenswrapper[4930]: [+]process-running ok Oct 12 05:43:44 crc kubenswrapper[4930]: healthz check failed Oct 12 05:43:44 crc kubenswrapper[4930]: I1012 05:43:44.172403 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mdwcw" podUID="93aad091-5fd5-4eb5-a123-b7932dc268fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 05:43:44 crc kubenswrapper[4930]: I1012 05:43:44.571093 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 05:43:44 crc kubenswrapper[4930]: I1012 05:43:44.711032 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed-kubelet-dir\") pod \"bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed\" (UID: \"bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed\") " Oct 12 05:43:44 crc kubenswrapper[4930]: I1012 05:43:44.711282 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed-kube-api-access\") pod \"bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed\" (UID: \"bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed\") " Oct 12 05:43:44 crc kubenswrapper[4930]: I1012 05:43:44.711134 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed" (UID: "bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 05:43:44 crc kubenswrapper[4930]: I1012 05:43:44.715971 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed" (UID: "bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:43:44 crc kubenswrapper[4930]: I1012 05:43:44.812190 4930 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 12 05:43:44 crc kubenswrapper[4930]: I1012 05:43:44.812225 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 12 05:43:45 crc kubenswrapper[4930]: I1012 05:43:45.152239 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed","Type":"ContainerDied","Data":"418a79133199338e6239702cf2d73289021c491e5eb47ad4713efa0860859d35"} Oct 12 05:43:45 crc kubenswrapper[4930]: I1012 05:43:45.152275 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 12 05:43:45 crc kubenswrapper[4930]: I1012 05:43:45.152280 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="418a79133199338e6239702cf2d73289021c491e5eb47ad4713efa0860859d35" Oct 12 05:43:45 crc kubenswrapper[4930]: I1012 05:43:45.173175 4930 patch_prober.go:28] interesting pod/router-default-5444994796-mdwcw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 05:43:45 crc kubenswrapper[4930]: [-]has-synced failed: reason withheld Oct 12 05:43:45 crc kubenswrapper[4930]: [+]process-running ok Oct 12 05:43:45 crc kubenswrapper[4930]: healthz check failed Oct 12 05:43:45 crc kubenswrapper[4930]: I1012 05:43:45.173224 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mdwcw" podUID="93aad091-5fd5-4eb5-a123-b7932dc268fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 05:43:46 crc kubenswrapper[4930]: I1012 05:43:46.174186 4930 patch_prober.go:28] interesting pod/router-default-5444994796-mdwcw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 12 05:43:46 crc kubenswrapper[4930]: [-]has-synced failed: reason withheld Oct 12 05:43:46 crc kubenswrapper[4930]: [+]process-running ok Oct 12 05:43:46 crc kubenswrapper[4930]: healthz check failed Oct 12 05:43:46 crc kubenswrapper[4930]: I1012 05:43:46.174443 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mdwcw" podUID="93aad091-5fd5-4eb5-a123-b7932dc268fe" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 12 05:43:47 crc kubenswrapper[4930]: I1012 05:43:47.173297 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-mdwcw" Oct 12 05:43:47 crc kubenswrapper[4930]: I1012 05:43:47.175658 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-mdwcw" Oct 12 05:43:50 crc kubenswrapper[4930]: I1012 05:43:50.063783 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/downloads-7954f5f757-ztn69" Oct 12 05:43:50 crc kubenswrapper[4930]: I1012 05:43:50.411893 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:50 crc kubenswrapper[4930]: I1012 05:43:50.415230 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:43:52 crc kubenswrapper[4930]: I1012 05:43:52.224272 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs\") pod \"network-metrics-daemon-7cjzn\" (UID: \"dda08509-105f-4935-a8a9-ff852e73c3ce\") " pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:43:52 crc kubenswrapper[4930]: I1012 05:43:52.232202 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dda08509-105f-4935-a8a9-ff852e73c3ce-metrics-certs\") pod \"network-metrics-daemon-7cjzn\" (UID: \"dda08509-105f-4935-a8a9-ff852e73c3ce\") " pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:43:52 crc kubenswrapper[4930]: I1012 05:43:52.387935 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7cjzn" Oct 12 05:43:58 crc kubenswrapper[4930]: I1012 05:43:58.010617 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:44:03 crc kubenswrapper[4930]: I1012 05:44:03.670070 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 05:44:03 crc kubenswrapper[4930]: I1012 05:44:03.670515 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 05:44:03 crc kubenswrapper[4930]: E1012 05:44:03.961929 4930 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 12 05:44:03 crc kubenswrapper[4930]: E1012 05:44:03.962109 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h4l8z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-8fq8j_openshift-marketplace(3086651e-154b-4f50-98fd-727d6ede2e3c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 12 05:44:03 crc kubenswrapper[4930]: E1012 05:44:03.963467 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-8fq8j" podUID="3086651e-154b-4f50-98fd-727d6ede2e3c" Oct 12 05:44:04 crc kubenswrapper[4930]: I1012 05:44:04.246768 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7cjzn"] Oct 12 05:44:04 crc kubenswrapper[4930]: I1012 05:44:04.287909 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvjv6" event={"ID":"deece869-6cf9-4922-b75a-294c828c6e9e","Type":"ContainerStarted","Data":"ac8ca8ec2ddc7771f1010bdc0ef9016553538d0b9d4b08d3fcf05aed6b3055a4"} Oct 12 05:44:04 crc kubenswrapper[4930]: I1012 05:44:04.290770 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn4wn" event={"ID":"390607e9-0185-4b5b-a006-70ecef170a3b","Type":"ContainerStarted","Data":"c9de93009d8918eaeefa70f5f72f9de2442b1c4258c24279b7f42b09c9321b0e"} Oct 12 05:44:04 crc kubenswrapper[4930]: I1012 05:44:04.296344 4930 generic.go:334] "Generic (PLEG): container finished" podID="e085d670-8038-4828-acb6-dddc74a33655" containerID="0c09a956fc89f0ffd2c23fdb762c2e7357a8fc91882b5078d63f673c6eb723e5" exitCode=0 Oct 12 05:44:04 crc kubenswrapper[4930]: I1012 05:44:04.296438 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vqgr" event={"ID":"e085d670-8038-4828-acb6-dddc74a33655","Type":"ContainerDied","Data":"0c09a956fc89f0ffd2c23fdb762c2e7357a8fc91882b5078d63f673c6eb723e5"} Oct 12 05:44:04 crc kubenswrapper[4930]: I1012 05:44:04.297283 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7cjzn" 
event={"ID":"dda08509-105f-4935-a8a9-ff852e73c3ce","Type":"ContainerStarted","Data":"e0d03afc05816ddccb80dc7fe0b4c57778cd599919bd1c44231506441d93810b"} Oct 12 05:44:04 crc kubenswrapper[4930]: I1012 05:44:04.306440 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95997" event={"ID":"e303adcd-2f97-4c4d-8bb4-77f7263ab93b","Type":"ContainerStarted","Data":"34fc364554c0d74dde77a04f4cfee532bed8c4f4274526976a70678733a3efa1"} Oct 12 05:44:04 crc kubenswrapper[4930]: I1012 05:44:04.308269 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdh6w" event={"ID":"054cc455-fa72-4e63-a082-59b56822991e","Type":"ContainerStarted","Data":"aea3bcda1a3abfac9255a0a035fe8e9afecb0e2bbd7459b0e3701da3f9394a6e"} Oct 12 05:44:04 crc kubenswrapper[4930]: I1012 05:44:04.309806 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxb6v" event={"ID":"abe3804c-7b56-43f4-be75-206e80471232","Type":"ContainerStarted","Data":"0be7df8747cfd61f5d45ee0aa2d0c130474d39a6eafa719052735a324056df5f"} Oct 12 05:44:04 crc kubenswrapper[4930]: I1012 05:44:04.312980 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8wdhk" event={"ID":"54acd3fa-9208-450c-9a6a-4bae6962c325","Type":"ContainerStarted","Data":"04f0e0b6485d6dcf08e8f87350c0ed2e0fde93f32018c42f28b70bf314c28bf1"} Oct 12 05:44:04 crc kubenswrapper[4930]: E1012 05:44:04.319954 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-8fq8j" podUID="3086651e-154b-4f50-98fd-727d6ede2e3c" Oct 12 05:44:05 crc kubenswrapper[4930]: I1012 05:44:05.327028 4930 generic.go:334] "Generic (PLEG): container finished" podID="390607e9-0185-4b5b-a006-70ecef170a3b" containerID="c9de93009d8918eaeefa70f5f72f9de2442b1c4258c24279b7f42b09c9321b0e" exitCode=0 Oct 12 05:44:05 crc kubenswrapper[4930]: I1012 05:44:05.327103 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn4wn" event={"ID":"390607e9-0185-4b5b-a006-70ecef170a3b","Type":"ContainerDied","Data":"c9de93009d8918eaeefa70f5f72f9de2442b1c4258c24279b7f42b09c9321b0e"} Oct 12 05:44:05 crc kubenswrapper[4930]: I1012 05:44:05.328649 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7cjzn" event={"ID":"dda08509-105f-4935-a8a9-ff852e73c3ce","Type":"ContainerStarted","Data":"8849babd997c71d13651b0619ebb9f49d8bec3e20b8ad1fe08c5ea2e53eab04b"} Oct 12 05:44:05 crc kubenswrapper[4930]: I1012 05:44:05.333313 4930 generic.go:334] "Generic (PLEG): container finished" podID="e303adcd-2f97-4c4d-8bb4-77f7263ab93b" containerID="34fc364554c0d74dde77a04f4cfee532bed8c4f4274526976a70678733a3efa1" exitCode=0 Oct 12 05:44:05 crc kubenswrapper[4930]: I1012 05:44:05.334195 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95997" event={"ID":"e303adcd-2f97-4c4d-8bb4-77f7263ab93b","Type":"ContainerDied","Data":"34fc364554c0d74dde77a04f4cfee532bed8c4f4274526976a70678733a3efa1"} Oct 12 05:44:05 crc kubenswrapper[4930]: I1012 05:44:05.337400 4930 generic.go:334] "Generic (PLEG): container finished" podID="054cc455-fa72-4e63-a082-59b56822991e" containerID="aea3bcda1a3abfac9255a0a035fe8e9afecb0e2bbd7459b0e3701da3f9394a6e" 
exitCode=0 Oct 12 05:44:05 crc kubenswrapper[4930]: I1012 05:44:05.337468 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdh6w" event={"ID":"054cc455-fa72-4e63-a082-59b56822991e","Type":"ContainerDied","Data":"aea3bcda1a3abfac9255a0a035fe8e9afecb0e2bbd7459b0e3701da3f9394a6e"} Oct 12 05:44:05 crc kubenswrapper[4930]: I1012 05:44:05.341967 4930 generic.go:334] "Generic (PLEG): container finished" podID="abe3804c-7b56-43f4-be75-206e80471232" containerID="0be7df8747cfd61f5d45ee0aa2d0c130474d39a6eafa719052735a324056df5f" exitCode=0 Oct 12 05:44:05 crc kubenswrapper[4930]: I1012 05:44:05.342026 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxb6v" event={"ID":"abe3804c-7b56-43f4-be75-206e80471232","Type":"ContainerDied","Data":"0be7df8747cfd61f5d45ee0aa2d0c130474d39a6eafa719052735a324056df5f"} Oct 12 05:44:05 crc kubenswrapper[4930]: I1012 05:44:05.350550 4930 generic.go:334] "Generic (PLEG): container finished" podID="54acd3fa-9208-450c-9a6a-4bae6962c325" containerID="04f0e0b6485d6dcf08e8f87350c0ed2e0fde93f32018c42f28b70bf314c28bf1" exitCode=0 Oct 12 05:44:05 crc kubenswrapper[4930]: I1012 05:44:05.350687 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8wdhk" event={"ID":"54acd3fa-9208-450c-9a6a-4bae6962c325","Type":"ContainerDied","Data":"04f0e0b6485d6dcf08e8f87350c0ed2e0fde93f32018c42f28b70bf314c28bf1"} Oct 12 05:44:05 crc kubenswrapper[4930]: I1012 05:44:05.359345 4930 generic.go:334] "Generic (PLEG): container finished" podID="deece869-6cf9-4922-b75a-294c828c6e9e" containerID="ac8ca8ec2ddc7771f1010bdc0ef9016553538d0b9d4b08d3fcf05aed6b3055a4" exitCode=0 Oct 12 05:44:05 crc kubenswrapper[4930]: I1012 05:44:05.359389 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvjv6" event={"ID":"deece869-6cf9-4922-b75a-294c828c6e9e","Type":"ContainerDied","Data":"ac8ca8ec2ddc7771f1010bdc0ef9016553538d0b9d4b08d3fcf05aed6b3055a4"} Oct 12 05:44:06 crc kubenswrapper[4930]: I1012 05:44:06.367045 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8wdhk" event={"ID":"54acd3fa-9208-450c-9a6a-4bae6962c325","Type":"ContainerStarted","Data":"145b675bc067d35b38e5cd864995c6a9a845ff73bbcb5eb405b22713b3f5d4e4"} Oct 12 05:44:06 crc kubenswrapper[4930]: I1012 05:44:06.369696 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvjv6" event={"ID":"deece869-6cf9-4922-b75a-294c828c6e9e","Type":"ContainerStarted","Data":"0bd46b9577918758a6ed15658c75081322b397febd13e5338a664eca151808e0"} Oct 12 05:44:06 crc kubenswrapper[4930]: I1012 05:44:06.372418 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn4wn" event={"ID":"390607e9-0185-4b5b-a006-70ecef170a3b","Type":"ContainerStarted","Data":"5a451ada33d28fdc1de1293b1f4cee8037e590f976f8eeb2da3f0dceebfab9e1"} Oct 12 05:44:06 crc kubenswrapper[4930]: I1012 05:44:06.374994 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vqgr" event={"ID":"e085d670-8038-4828-acb6-dddc74a33655","Type":"ContainerStarted","Data":"52f5333b79ef9ba518babc5712e2c9e74bc8edb1e5ad1d1c2ed44e2b78d9605e"} Oct 12 05:44:06 crc kubenswrapper[4930]: I1012 05:44:06.376691 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7cjzn" 
event={"ID":"dda08509-105f-4935-a8a9-ff852e73c3ce","Type":"ContainerStarted","Data":"e94c012061cdfe2a319457ddcccd27405b6424e0063231e7426e1cdf2593418f"} Oct 12 05:44:06 crc kubenswrapper[4930]: I1012 05:44:06.379305 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95997" event={"ID":"e303adcd-2f97-4c4d-8bb4-77f7263ab93b","Type":"ContainerStarted","Data":"161b6146bb94e3a8bed24475efeaf2560392abd92ac26be249802ce3450ffb94"} Oct 12 05:44:06 crc kubenswrapper[4930]: I1012 05:44:06.381064 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdh6w" event={"ID":"054cc455-fa72-4e63-a082-59b56822991e","Type":"ContainerStarted","Data":"a4efb2ddf6db6aaee95e0cfe84708ce47788c3bb0f0f061c70de2c799e27a013"} Oct 12 05:44:06 crc kubenswrapper[4930]: I1012 05:44:06.382649 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxb6v" event={"ID":"abe3804c-7b56-43f4-be75-206e80471232","Type":"ContainerStarted","Data":"637133fadd4a9e2255f70b2c6bd88f6a4042364db51ff9a32e870a57f3d67666"} Oct 12 05:44:06 crc kubenswrapper[4930]: I1012 05:44:06.401075 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8wdhk" podStartSLOduration=4.444143094 podStartE2EDuration="31.401057955s" podCreationTimestamp="2025-10-12 05:43:35 +0000 UTC" firstStartedPulling="2025-10-12 05:43:38.965151942 +0000 UTC m=+151.507253697" lastFinishedPulling="2025-10-12 05:44:05.922066763 +0000 UTC m=+178.464168558" observedRunningTime="2025-10-12 05:44:06.396699382 +0000 UTC m=+178.938801137" watchObservedRunningTime="2025-10-12 05:44:06.401057955 +0000 UTC m=+178.943159730" Oct 12 05:44:06 crc kubenswrapper[4930]: I1012 05:44:06.421333 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qdh6w" podStartSLOduration=3.539831243 podStartE2EDuration="30.421319887s" podCreationTimestamp="2025-10-12 05:43:36 +0000 UTC" firstStartedPulling="2025-10-12 05:43:38.921455869 +0000 UTC m=+151.463557674" lastFinishedPulling="2025-10-12 05:44:05.802944553 +0000 UTC m=+178.345046318" observedRunningTime="2025-10-12 05:44:06.418274351 +0000 UTC m=+178.960376116" watchObservedRunningTime="2025-10-12 05:44:06.421319887 +0000 UTC m=+178.963421652" Oct 12 05:44:06 crc kubenswrapper[4930]: I1012 05:44:06.463732 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xxb6v" podStartSLOduration=3.813894612 podStartE2EDuration="28.463716163s" podCreationTimestamp="2025-10-12 05:43:38 +0000 UTC" firstStartedPulling="2025-10-12 05:43:41.08323336 +0000 UTC m=+153.625335125" lastFinishedPulling="2025-10-12 05:44:05.733054911 +0000 UTC m=+178.275156676" observedRunningTime="2025-10-12 05:44:06.44305335 +0000 UTC m=+178.985155115" watchObservedRunningTime="2025-10-12 05:44:06.463716163 +0000 UTC m=+179.005817928" Oct 12 05:44:06 crc kubenswrapper[4930]: I1012 05:44:06.484220 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xvjv6" podStartSLOduration=4.598675203 podStartE2EDuration="31.48419948s" podCreationTimestamp="2025-10-12 05:43:35 +0000 UTC" firstStartedPulling="2025-10-12 05:43:38.957340462 +0000 UTC m=+151.499442227" lastFinishedPulling="2025-10-12 05:44:05.842864699 +0000 UTC m=+178.384966504" observedRunningTime="2025-10-12 05:44:06.463194228 +0000 UTC 
m=+179.005295993" watchObservedRunningTime="2025-10-12 05:44:06.48419948 +0000 UTC m=+179.026301245" Oct 12 05:44:06 crc kubenswrapper[4930]: I1012 05:44:06.518170 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-95997" podStartSLOduration=2.685211711 podStartE2EDuration="27.518152588s" podCreationTimestamp="2025-10-12 05:43:39 +0000 UTC" firstStartedPulling="2025-10-12 05:43:41.065654994 +0000 UTC m=+153.607756749" lastFinishedPulling="2025-10-12 05:44:05.898595861 +0000 UTC m=+178.440697626" observedRunningTime="2025-10-12 05:44:06.487932866 +0000 UTC m=+179.030034631" watchObservedRunningTime="2025-10-12 05:44:06.518152588 +0000 UTC m=+179.060254353" Oct 12 05:44:06 crc kubenswrapper[4930]: I1012 05:44:06.527712 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qdh6w" Oct 12 05:44:06 crc kubenswrapper[4930]: I1012 05:44:06.527785 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qdh6w" Oct 12 05:44:06 crc kubenswrapper[4930]: I1012 05:44:06.538720 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gn4wn" podStartSLOduration=3.524965714 podStartE2EDuration="30.538691678s" podCreationTimestamp="2025-10-12 05:43:36 +0000 UTC" firstStartedPulling="2025-10-12 05:43:38.95904734 +0000 UTC m=+151.501149105" lastFinishedPulling="2025-10-12 05:44:05.972773304 +0000 UTC m=+178.514875069" observedRunningTime="2025-10-12 05:44:06.520428232 +0000 UTC m=+179.062530008" watchObservedRunningTime="2025-10-12 05:44:06.538691678 +0000 UTC m=+179.080793443" Oct 12 05:44:06 crc kubenswrapper[4930]: I1012 05:44:06.540847 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9vqgr" podStartSLOduration=3.164648358 podStartE2EDuration="29.540840588s" podCreationTimestamp="2025-10-12 05:43:37 +0000 UTC" firstStartedPulling="2025-10-12 05:43:38.910214362 +0000 UTC m=+151.452316127" lastFinishedPulling="2025-10-12 05:44:05.286406552 +0000 UTC m=+177.828508357" observedRunningTime="2025-10-12 05:44:06.539700886 +0000 UTC m=+179.081802651" watchObservedRunningTime="2025-10-12 05:44:06.540840588 +0000 UTC m=+179.082942353" Oct 12 05:44:06 crc kubenswrapper[4930]: I1012 05:44:06.559592 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7cjzn" podStartSLOduration=157.559560576 podStartE2EDuration="2m37.559560576s" podCreationTimestamp="2025-10-12 05:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:44:06.559055562 +0000 UTC m=+179.101157327" watchObservedRunningTime="2025-10-12 05:44:06.559560576 +0000 UTC m=+179.101662341" Oct 12 05:44:06 crc kubenswrapper[4930]: I1012 05:44:06.692084 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gn4wn" Oct 12 05:44:06 crc kubenswrapper[4930]: I1012 05:44:06.692177 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gn4wn" Oct 12 05:44:07 crc kubenswrapper[4930]: I1012 05:44:07.692235 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-qdh6w" podUID="054cc455-fa72-4e63-a082-59b56822991e" 
containerName="registry-server" probeResult="failure" output=< Oct 12 05:44:07 crc kubenswrapper[4930]: timeout: failed to connect service ":50051" within 1s Oct 12 05:44:07 crc kubenswrapper[4930]: > Oct 12 05:44:07 crc kubenswrapper[4930]: I1012 05:44:07.737019 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-gn4wn" podUID="390607e9-0185-4b5b-a006-70ecef170a3b" containerName="registry-server" probeResult="failure" output=< Oct 12 05:44:07 crc kubenswrapper[4930]: timeout: failed to connect service ":50051" within 1s Oct 12 05:44:07 crc kubenswrapper[4930]: > Oct 12 05:44:08 crc kubenswrapper[4930]: I1012 05:44:08.106563 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9vqgr" Oct 12 05:44:08 crc kubenswrapper[4930]: I1012 05:44:08.106758 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9vqgr" Oct 12 05:44:08 crc kubenswrapper[4930]: I1012 05:44:08.180009 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9vqgr" Oct 12 05:44:09 crc kubenswrapper[4930]: I1012 05:44:09.349632 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xxb6v" Oct 12 05:44:09 crc kubenswrapper[4930]: I1012 05:44:09.350647 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xxb6v" Oct 12 05:44:09 crc kubenswrapper[4930]: I1012 05:44:09.740007 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-95997" Oct 12 05:44:09 crc kubenswrapper[4930]: I1012 05:44:09.740395 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-95997" Oct 12 05:44:10 crc kubenswrapper[4930]: I1012 05:44:10.399788 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xxb6v" podUID="abe3804c-7b56-43f4-be75-206e80471232" containerName="registry-server" probeResult="failure" output=< Oct 12 05:44:10 crc kubenswrapper[4930]: timeout: failed to connect service ":50051" within 1s Oct 12 05:44:10 crc kubenswrapper[4930]: > Oct 12 05:44:10 crc kubenswrapper[4930]: I1012 05:44:10.806454 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-95997" podUID="e303adcd-2f97-4c4d-8bb4-77f7263ab93b" containerName="registry-server" probeResult="failure" output=< Oct 12 05:44:10 crc kubenswrapper[4930]: timeout: failed to connect service ":50051" within 1s Oct 12 05:44:10 crc kubenswrapper[4930]: > Oct 12 05:44:11 crc kubenswrapper[4930]: I1012 05:44:11.175243 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2lc4l" Oct 12 05:44:16 crc kubenswrapper[4930]: I1012 05:44:16.155648 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xvjv6" Oct 12 05:44:16 crc kubenswrapper[4930]: I1012 05:44:16.156622 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xvjv6" Oct 12 05:44:16 crc kubenswrapper[4930]: I1012 05:44:16.179258 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 12 05:44:16 crc kubenswrapper[4930]: I1012 05:44:16.251404 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xvjv6" Oct 12 05:44:16 crc kubenswrapper[4930]: I1012 05:44:16.317536 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8wdhk" Oct 12 05:44:16 crc kubenswrapper[4930]: I1012 05:44:16.317599 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8wdhk" Oct 12 05:44:16 crc kubenswrapper[4930]: I1012 05:44:16.365176 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8wdhk" Oct 12 05:44:16 crc kubenswrapper[4930]: I1012 05:44:16.510985 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xvjv6" Oct 12 05:44:16 crc kubenswrapper[4930]: I1012 05:44:16.518718 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8wdhk" Oct 12 05:44:16 crc kubenswrapper[4930]: I1012 05:44:16.604878 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qdh6w" Oct 12 05:44:16 crc kubenswrapper[4930]: I1012 05:44:16.651228 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qdh6w" Oct 12 05:44:16 crc kubenswrapper[4930]: I1012 05:44:16.752030 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gn4wn" Oct 12 05:44:16 crc kubenswrapper[4930]: I1012 05:44:16.816299 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gn4wn" Oct 12 05:44:18 crc kubenswrapper[4930]: I1012 05:44:18.174968 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9vqgr" Oct 12 05:44:18 crc kubenswrapper[4930]: I1012 05:44:18.195477 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gn4wn"] Oct 12 05:44:18 crc kubenswrapper[4930]: I1012 05:44:18.457591 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gn4wn" podUID="390607e9-0185-4b5b-a006-70ecef170a3b" containerName="registry-server" containerID="cri-o://5a451ada33d28fdc1de1293b1f4cee8037e590f976f8eeb2da3f0dceebfab9e1" gracePeriod=2 Oct 12 05:44:18 crc kubenswrapper[4930]: I1012 05:44:18.793837 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qdh6w"] Oct 12 05:44:18 crc kubenswrapper[4930]: I1012 05:44:18.794201 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qdh6w" podUID="054cc455-fa72-4e63-a082-59b56822991e" containerName="registry-server" containerID="cri-o://a4efb2ddf6db6aaee95e0cfe84708ce47788c3bb0f0f061c70de2c799e27a013" gracePeriod=2 Oct 12 05:44:19 crc kubenswrapper[4930]: I1012 05:44:19.409476 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xxb6v" Oct 12 05:44:19 crc kubenswrapper[4930]: I1012 05:44:19.470174 4930 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xxb6v" Oct 12 05:44:19 crc kubenswrapper[4930]: I1012 05:44:19.470773 4930 generic.go:334] "Generic (PLEG): container finished" podID="054cc455-fa72-4e63-a082-59b56822991e" containerID="a4efb2ddf6db6aaee95e0cfe84708ce47788c3bb0f0f061c70de2c799e27a013" exitCode=0 Oct 12 05:44:19 crc kubenswrapper[4930]: I1012 05:44:19.470858 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdh6w" event={"ID":"054cc455-fa72-4e63-a082-59b56822991e","Type":"ContainerDied","Data":"a4efb2ddf6db6aaee95e0cfe84708ce47788c3bb0f0f061c70de2c799e27a013"} Oct 12 05:44:19 crc kubenswrapper[4930]: I1012 05:44:19.474716 4930 generic.go:334] "Generic (PLEG): container finished" podID="390607e9-0185-4b5b-a006-70ecef170a3b" containerID="5a451ada33d28fdc1de1293b1f4cee8037e590f976f8eeb2da3f0dceebfab9e1" exitCode=0 Oct 12 05:44:19 crc kubenswrapper[4930]: I1012 05:44:19.474763 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn4wn" event={"ID":"390607e9-0185-4b5b-a006-70ecef170a3b","Type":"ContainerDied","Data":"5a451ada33d28fdc1de1293b1f4cee8037e590f976f8eeb2da3f0dceebfab9e1"} Oct 12 05:44:19 crc kubenswrapper[4930]: I1012 05:44:19.710325 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gn4wn" Oct 12 05:44:19 crc kubenswrapper[4930]: I1012 05:44:19.797411 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-95997" Oct 12 05:44:19 crc kubenswrapper[4930]: I1012 05:44:19.853263 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-95997" Oct 12 05:44:19 crc kubenswrapper[4930]: I1012 05:44:19.870706 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/390607e9-0185-4b5b-a006-70ecef170a3b-catalog-content\") pod \"390607e9-0185-4b5b-a006-70ecef170a3b\" (UID: \"390607e9-0185-4b5b-a006-70ecef170a3b\") " Oct 12 05:44:19 crc kubenswrapper[4930]: I1012 05:44:19.870771 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/390607e9-0185-4b5b-a006-70ecef170a3b-utilities\") pod \"390607e9-0185-4b5b-a006-70ecef170a3b\" (UID: \"390607e9-0185-4b5b-a006-70ecef170a3b\") " Oct 12 05:44:19 crc kubenswrapper[4930]: I1012 05:44:19.870833 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnb2h\" (UniqueName: \"kubernetes.io/projected/390607e9-0185-4b5b-a006-70ecef170a3b-kube-api-access-wnb2h\") pod \"390607e9-0185-4b5b-a006-70ecef170a3b\" (UID: \"390607e9-0185-4b5b-a006-70ecef170a3b\") " Oct 12 05:44:19 crc kubenswrapper[4930]: I1012 05:44:19.876283 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/390607e9-0185-4b5b-a006-70ecef170a3b-kube-api-access-wnb2h" (OuterVolumeSpecName: "kube-api-access-wnb2h") pod "390607e9-0185-4b5b-a006-70ecef170a3b" (UID: "390607e9-0185-4b5b-a006-70ecef170a3b"). InnerVolumeSpecName "kube-api-access-wnb2h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:44:19 crc kubenswrapper[4930]: I1012 05:44:19.877066 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/390607e9-0185-4b5b-a006-70ecef170a3b-utilities" (OuterVolumeSpecName: "utilities") pod "390607e9-0185-4b5b-a006-70ecef170a3b" (UID: "390607e9-0185-4b5b-a006-70ecef170a3b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:44:19 crc kubenswrapper[4930]: I1012 05:44:19.877625 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/390607e9-0185-4b5b-a006-70ecef170a3b-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:19 crc kubenswrapper[4930]: I1012 05:44:19.877644 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnb2h\" (UniqueName: \"kubernetes.io/projected/390607e9-0185-4b5b-a006-70ecef170a3b-kube-api-access-wnb2h\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:19 crc kubenswrapper[4930]: I1012 05:44:19.911716 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qdh6w" Oct 12 05:44:19 crc kubenswrapper[4930]: I1012 05:44:19.926558 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/390607e9-0185-4b5b-a006-70ecef170a3b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "390607e9-0185-4b5b-a006-70ecef170a3b" (UID: "390607e9-0185-4b5b-a006-70ecef170a3b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:44:19 crc kubenswrapper[4930]: I1012 05:44:19.978668 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/054cc455-fa72-4e63-a082-59b56822991e-catalog-content\") pod \"054cc455-fa72-4e63-a082-59b56822991e\" (UID: \"054cc455-fa72-4e63-a082-59b56822991e\") " Oct 12 05:44:19 crc kubenswrapper[4930]: I1012 05:44:19.986299 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv68b\" (UniqueName: \"kubernetes.io/projected/054cc455-fa72-4e63-a082-59b56822991e-kube-api-access-sv68b\") pod \"054cc455-fa72-4e63-a082-59b56822991e\" (UID: \"054cc455-fa72-4e63-a082-59b56822991e\") " Oct 12 05:44:19 crc kubenswrapper[4930]: I1012 05:44:19.986385 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/054cc455-fa72-4e63-a082-59b56822991e-utilities\") pod \"054cc455-fa72-4e63-a082-59b56822991e\" (UID: \"054cc455-fa72-4e63-a082-59b56822991e\") " Oct 12 05:44:19 crc kubenswrapper[4930]: I1012 05:44:19.986922 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/390607e9-0185-4b5b-a006-70ecef170a3b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:19 crc kubenswrapper[4930]: I1012 05:44:19.987422 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/054cc455-fa72-4e63-a082-59b56822991e-utilities" (OuterVolumeSpecName: "utilities") pod "054cc455-fa72-4e63-a082-59b56822991e" (UID: "054cc455-fa72-4e63-a082-59b56822991e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:44:19 crc kubenswrapper[4930]: I1012 05:44:19.990212 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/054cc455-fa72-4e63-a082-59b56822991e-kube-api-access-sv68b" (OuterVolumeSpecName: "kube-api-access-sv68b") pod "054cc455-fa72-4e63-a082-59b56822991e" (UID: "054cc455-fa72-4e63-a082-59b56822991e"). InnerVolumeSpecName "kube-api-access-sv68b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:44:20 crc kubenswrapper[4930]: I1012 05:44:20.053613 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/054cc455-fa72-4e63-a082-59b56822991e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "054cc455-fa72-4e63-a082-59b56822991e" (UID: "054cc455-fa72-4e63-a082-59b56822991e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:44:20 crc kubenswrapper[4930]: I1012 05:44:20.090903 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/054cc455-fa72-4e63-a082-59b56822991e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:20 crc kubenswrapper[4930]: I1012 05:44:20.090938 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv68b\" (UniqueName: \"kubernetes.io/projected/054cc455-fa72-4e63-a082-59b56822991e-kube-api-access-sv68b\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:20 crc kubenswrapper[4930]: I1012 05:44:20.090952 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/054cc455-fa72-4e63-a082-59b56822991e-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:20 crc kubenswrapper[4930]: I1012 05:44:20.481874 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn4wn" event={"ID":"390607e9-0185-4b5b-a006-70ecef170a3b","Type":"ContainerDied","Data":"0ec402df5720f7a73cdc63bb337f7b99865b85aab0b100bddf4bfd1efd007c88"} Oct 12 05:44:20 crc kubenswrapper[4930]: I1012 05:44:20.482979 4930 scope.go:117] "RemoveContainer" containerID="5a451ada33d28fdc1de1293b1f4cee8037e590f976f8eeb2da3f0dceebfab9e1" Oct 12 05:44:20 crc kubenswrapper[4930]: I1012 05:44:20.482112 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gn4wn" Oct 12 05:44:20 crc kubenswrapper[4930]: I1012 05:44:20.484975 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8fq8j" event={"ID":"3086651e-154b-4f50-98fd-727d6ede2e3c","Type":"ContainerStarted","Data":"6bd62ee1f0dad9da77cb5e3cc43f0068b275ffa971f69182960dc8c63e6db373"} Oct 12 05:44:20 crc kubenswrapper[4930]: I1012 05:44:20.488484 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdh6w" event={"ID":"054cc455-fa72-4e63-a082-59b56822991e","Type":"ContainerDied","Data":"703dcb896f950d71f69903412a19fde91cff53eee36b0ca61b388f756bb0e486"} Oct 12 05:44:20 crc kubenswrapper[4930]: I1012 05:44:20.489687 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qdh6w" Oct 12 05:44:20 crc kubenswrapper[4930]: I1012 05:44:20.499672 4930 scope.go:117] "RemoveContainer" containerID="c9de93009d8918eaeefa70f5f72f9de2442b1c4258c24279b7f42b09c9321b0e" Oct 12 05:44:20 crc kubenswrapper[4930]: I1012 05:44:20.507407 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gn4wn"] Oct 12 05:44:20 crc kubenswrapper[4930]: I1012 05:44:20.510988 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gn4wn"] Oct 12 05:44:20 crc kubenswrapper[4930]: I1012 05:44:20.523156 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qdh6w"] Oct 12 05:44:20 crc kubenswrapper[4930]: I1012 05:44:20.525631 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qdh6w"] Oct 12 05:44:20 crc kubenswrapper[4930]: I1012 05:44:20.530881 4930 scope.go:117] "RemoveContainer" containerID="a4dd2cdca82f4ae4808c68879a86ea0d4a3376d6418923ab4c894d8eb923ebe3" Oct 12 05:44:20 crc kubenswrapper[4930]: I1012 05:44:20.542483 4930 scope.go:117] "RemoveContainer" containerID="a4efb2ddf6db6aaee95e0cfe84708ce47788c3bb0f0f061c70de2c799e27a013" Oct 12 05:44:20 crc kubenswrapper[4930]: I1012 05:44:20.566144 4930 scope.go:117] "RemoveContainer" containerID="aea3bcda1a3abfac9255a0a035fe8e9afecb0e2bbd7459b0e3701da3f9394a6e" Oct 12 05:44:20 crc kubenswrapper[4930]: I1012 05:44:20.580763 4930 scope.go:117] "RemoveContainer" containerID="1c82b68a6a649f63e0730f80253c2724f8884617e3434c060b81ceaa4524983e" Oct 12 05:44:21 crc kubenswrapper[4930]: I1012 05:44:21.503092 4930 generic.go:334] "Generic (PLEG): container finished" podID="3086651e-154b-4f50-98fd-727d6ede2e3c" containerID="6bd62ee1f0dad9da77cb5e3cc43f0068b275ffa971f69182960dc8c63e6db373" exitCode=0 Oct 12 05:44:21 crc kubenswrapper[4930]: I1012 05:44:21.503140 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8fq8j" event={"ID":"3086651e-154b-4f50-98fd-727d6ede2e3c","Type":"ContainerDied","Data":"6bd62ee1f0dad9da77cb5e3cc43f0068b275ffa971f69182960dc8c63e6db373"} Oct 12 05:44:22 crc kubenswrapper[4930]: I1012 05:44:22.143443 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="054cc455-fa72-4e63-a082-59b56822991e" path="/var/lib/kubelet/pods/054cc455-fa72-4e63-a082-59b56822991e/volumes" Oct 12 05:44:22 crc kubenswrapper[4930]: I1012 05:44:22.144538 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="390607e9-0185-4b5b-a006-70ecef170a3b" path="/var/lib/kubelet/pods/390607e9-0185-4b5b-a006-70ecef170a3b/volumes" Oct 12 05:44:22 crc kubenswrapper[4930]: I1012 05:44:22.514391 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8fq8j" event={"ID":"3086651e-154b-4f50-98fd-727d6ede2e3c","Type":"ContainerStarted","Data":"342d34b802312a9117fe8f37e6fafedac9bfb368f6480e321f894d4056a0a76b"} Oct 12 05:44:22 crc kubenswrapper[4930]: I1012 05:44:22.539314 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8fq8j" podStartSLOduration=1.365806794 podStartE2EDuration="44.539283657s" podCreationTimestamp="2025-10-12 05:43:38 +0000 UTC" firstStartedPulling="2025-10-12 05:43:38.974196477 +0000 UTC m=+151.516298242" lastFinishedPulling="2025-10-12 05:44:22.14767334 +0000 UTC m=+194.689775105" 
observedRunningTime="2025-10-12 05:44:22.533502994 +0000 UTC m=+195.075604789" watchObservedRunningTime="2025-10-12 05:44:22.539283657 +0000 UTC m=+195.081385462" Oct 12 05:44:23 crc kubenswrapper[4930]: I1012 05:44:23.181197 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-95997"] Oct 12 05:44:23 crc kubenswrapper[4930]: I1012 05:44:23.181842 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-95997" podUID="e303adcd-2f97-4c4d-8bb4-77f7263ab93b" containerName="registry-server" containerID="cri-o://161b6146bb94e3a8bed24475efeaf2560392abd92ac26be249802ce3450ffb94" gracePeriod=2 Oct 12 05:44:23 crc kubenswrapper[4930]: I1012 05:44:23.521971 4930 generic.go:334] "Generic (PLEG): container finished" podID="e303adcd-2f97-4c4d-8bb4-77f7263ab93b" containerID="161b6146bb94e3a8bed24475efeaf2560392abd92ac26be249802ce3450ffb94" exitCode=0 Oct 12 05:44:23 crc kubenswrapper[4930]: I1012 05:44:23.522890 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95997" event={"ID":"e303adcd-2f97-4c4d-8bb4-77f7263ab93b","Type":"ContainerDied","Data":"161b6146bb94e3a8bed24475efeaf2560392abd92ac26be249802ce3450ffb94"} Oct 12 05:44:23 crc kubenswrapper[4930]: I1012 05:44:23.522949 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95997" event={"ID":"e303adcd-2f97-4c4d-8bb4-77f7263ab93b","Type":"ContainerDied","Data":"a8017996338fe3a6e59214fea1125050d5a1eef415769b56bd1d482f7164b1de"} Oct 12 05:44:23 crc kubenswrapper[4930]: I1012 05:44:23.522964 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8017996338fe3a6e59214fea1125050d5a1eef415769b56bd1d482f7164b1de" Oct 12 05:44:23 crc kubenswrapper[4930]: I1012 05:44:23.574580 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-95997" Oct 12 05:44:23 crc kubenswrapper[4930]: I1012 05:44:23.753097 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e303adcd-2f97-4c4d-8bb4-77f7263ab93b-catalog-content\") pod \"e303adcd-2f97-4c4d-8bb4-77f7263ab93b\" (UID: \"e303adcd-2f97-4c4d-8bb4-77f7263ab93b\") " Oct 12 05:44:23 crc kubenswrapper[4930]: I1012 05:44:23.753143 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xbnj\" (UniqueName: \"kubernetes.io/projected/e303adcd-2f97-4c4d-8bb4-77f7263ab93b-kube-api-access-7xbnj\") pod \"e303adcd-2f97-4c4d-8bb4-77f7263ab93b\" (UID: \"e303adcd-2f97-4c4d-8bb4-77f7263ab93b\") " Oct 12 05:44:23 crc kubenswrapper[4930]: I1012 05:44:23.753231 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e303adcd-2f97-4c4d-8bb4-77f7263ab93b-utilities\") pod \"e303adcd-2f97-4c4d-8bb4-77f7263ab93b\" (UID: \"e303adcd-2f97-4c4d-8bb4-77f7263ab93b\") " Oct 12 05:44:23 crc kubenswrapper[4930]: I1012 05:44:23.754071 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e303adcd-2f97-4c4d-8bb4-77f7263ab93b-utilities" (OuterVolumeSpecName: "utilities") pod "e303adcd-2f97-4c4d-8bb4-77f7263ab93b" (UID: "e303adcd-2f97-4c4d-8bb4-77f7263ab93b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:44:23 crc kubenswrapper[4930]: I1012 05:44:23.762773 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e303adcd-2f97-4c4d-8bb4-77f7263ab93b-kube-api-access-7xbnj" (OuterVolumeSpecName: "kube-api-access-7xbnj") pod "e303adcd-2f97-4c4d-8bb4-77f7263ab93b" (UID: "e303adcd-2f97-4c4d-8bb4-77f7263ab93b"). InnerVolumeSpecName "kube-api-access-7xbnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:44:23 crc kubenswrapper[4930]: I1012 05:44:23.855553 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xbnj\" (UniqueName: \"kubernetes.io/projected/e303adcd-2f97-4c4d-8bb4-77f7263ab93b-kube-api-access-7xbnj\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:23 crc kubenswrapper[4930]: I1012 05:44:23.856104 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e303adcd-2f97-4c4d-8bb4-77f7263ab93b-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:23 crc kubenswrapper[4930]: I1012 05:44:23.857221 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e303adcd-2f97-4c4d-8bb4-77f7263ab93b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e303adcd-2f97-4c4d-8bb4-77f7263ab93b" (UID: "e303adcd-2f97-4c4d-8bb4-77f7263ab93b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:44:23 crc kubenswrapper[4930]: I1012 05:44:23.957167 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e303adcd-2f97-4c4d-8bb4-77f7263ab93b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:24 crc kubenswrapper[4930]: I1012 05:44:24.527874 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-95997" Oct 12 05:44:24 crc kubenswrapper[4930]: I1012 05:44:24.545954 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-95997"] Oct 12 05:44:24 crc kubenswrapper[4930]: I1012 05:44:24.550204 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-95997"] Oct 12 05:44:25 crc kubenswrapper[4930]: I1012 05:44:25.697242 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gbqmf"] Oct 12 05:44:26 crc kubenswrapper[4930]: I1012 05:44:26.142196 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e303adcd-2f97-4c4d-8bb4-77f7263ab93b" path="/var/lib/kubelet/pods/e303adcd-2f97-4c4d-8bb4-77f7263ab93b/volumes" Oct 12 05:44:28 crc kubenswrapper[4930]: I1012 05:44:28.528680 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8fq8j" Oct 12 05:44:28 crc kubenswrapper[4930]: I1012 05:44:28.528723 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8fq8j" Oct 12 05:44:28 crc kubenswrapper[4930]: I1012 05:44:28.574709 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8fq8j" Oct 12 05:44:28 crc kubenswrapper[4930]: I1012 05:44:28.618900 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8fq8j" Oct 12 05:44:29 crc kubenswrapper[4930]: I1012 05:44:29.590650 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8fq8j"] Oct 12 05:44:30 crc kubenswrapper[4930]: I1012 05:44:30.557878 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8fq8j" podUID="3086651e-154b-4f50-98fd-727d6ede2e3c" containerName="registry-server" containerID="cri-o://342d34b802312a9117fe8f37e6fafedac9bfb368f6480e321f894d4056a0a76b" gracePeriod=2 Oct 12 05:44:30 crc kubenswrapper[4930]: I1012 05:44:30.935210 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8fq8j" Oct 12 05:44:31 crc kubenswrapper[4930]: I1012 05:44:31.070905 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3086651e-154b-4f50-98fd-727d6ede2e3c-utilities\") pod \"3086651e-154b-4f50-98fd-727d6ede2e3c\" (UID: \"3086651e-154b-4f50-98fd-727d6ede2e3c\") " Oct 12 05:44:31 crc kubenswrapper[4930]: I1012 05:44:31.071069 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4l8z\" (UniqueName: \"kubernetes.io/projected/3086651e-154b-4f50-98fd-727d6ede2e3c-kube-api-access-h4l8z\") pod \"3086651e-154b-4f50-98fd-727d6ede2e3c\" (UID: \"3086651e-154b-4f50-98fd-727d6ede2e3c\") " Oct 12 05:44:31 crc kubenswrapper[4930]: I1012 05:44:31.071112 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3086651e-154b-4f50-98fd-727d6ede2e3c-catalog-content\") pod \"3086651e-154b-4f50-98fd-727d6ede2e3c\" (UID: \"3086651e-154b-4f50-98fd-727d6ede2e3c\") " Oct 12 05:44:31 crc kubenswrapper[4930]: I1012 05:44:31.072829 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3086651e-154b-4f50-98fd-727d6ede2e3c-utilities" (OuterVolumeSpecName: "utilities") pod "3086651e-154b-4f50-98fd-727d6ede2e3c" (UID: "3086651e-154b-4f50-98fd-727d6ede2e3c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:44:31 crc kubenswrapper[4930]: I1012 05:44:31.076399 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3086651e-154b-4f50-98fd-727d6ede2e3c-kube-api-access-h4l8z" (OuterVolumeSpecName: "kube-api-access-h4l8z") pod "3086651e-154b-4f50-98fd-727d6ede2e3c" (UID: "3086651e-154b-4f50-98fd-727d6ede2e3c"). InnerVolumeSpecName "kube-api-access-h4l8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:44:31 crc kubenswrapper[4930]: I1012 05:44:31.084410 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3086651e-154b-4f50-98fd-727d6ede2e3c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3086651e-154b-4f50-98fd-727d6ede2e3c" (UID: "3086651e-154b-4f50-98fd-727d6ede2e3c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:44:31 crc kubenswrapper[4930]: I1012 05:44:31.172677 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4l8z\" (UniqueName: \"kubernetes.io/projected/3086651e-154b-4f50-98fd-727d6ede2e3c-kube-api-access-h4l8z\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:31 crc kubenswrapper[4930]: I1012 05:44:31.172705 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3086651e-154b-4f50-98fd-727d6ede2e3c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:31 crc kubenswrapper[4930]: I1012 05:44:31.172714 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3086651e-154b-4f50-98fd-727d6ede2e3c-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:31 crc kubenswrapper[4930]: I1012 05:44:31.564818 4930 generic.go:334] "Generic (PLEG): container finished" podID="3086651e-154b-4f50-98fd-727d6ede2e3c" containerID="342d34b802312a9117fe8f37e6fafedac9bfb368f6480e321f894d4056a0a76b" exitCode=0 Oct 12 05:44:31 crc kubenswrapper[4930]: I1012 05:44:31.564875 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8fq8j" Oct 12 05:44:31 crc kubenswrapper[4930]: I1012 05:44:31.564915 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8fq8j" event={"ID":"3086651e-154b-4f50-98fd-727d6ede2e3c","Type":"ContainerDied","Data":"342d34b802312a9117fe8f37e6fafedac9bfb368f6480e321f894d4056a0a76b"} Oct 12 05:44:31 crc kubenswrapper[4930]: I1012 05:44:31.565363 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8fq8j" event={"ID":"3086651e-154b-4f50-98fd-727d6ede2e3c","Type":"ContainerDied","Data":"d97144446470ae2ff9acf3bcd946c435152c4ddd2978f788b235b2696176d321"} Oct 12 05:44:31 crc kubenswrapper[4930]: I1012 05:44:31.565394 4930 scope.go:117] "RemoveContainer" containerID="342d34b802312a9117fe8f37e6fafedac9bfb368f6480e321f894d4056a0a76b" Oct 12 05:44:31 crc kubenswrapper[4930]: I1012 05:44:31.595842 4930 scope.go:117] "RemoveContainer" containerID="6bd62ee1f0dad9da77cb5e3cc43f0068b275ffa971f69182960dc8c63e6db373" Oct 12 05:44:31 crc kubenswrapper[4930]: I1012 05:44:31.601673 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8fq8j"] Oct 12 05:44:31 crc kubenswrapper[4930]: I1012 05:44:31.601765 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8fq8j"] Oct 12 05:44:31 crc kubenswrapper[4930]: I1012 05:44:31.622120 4930 scope.go:117] "RemoveContainer" containerID="a08834a16cc8ae033cfda2c844e730f4322851a7113cd6512e6b538c7acbf78f" Oct 12 05:44:31 crc kubenswrapper[4930]: I1012 05:44:31.638662 4930 scope.go:117] "RemoveContainer" containerID="342d34b802312a9117fe8f37e6fafedac9bfb368f6480e321f894d4056a0a76b" Oct 12 05:44:31 crc kubenswrapper[4930]: E1012 05:44:31.639055 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"342d34b802312a9117fe8f37e6fafedac9bfb368f6480e321f894d4056a0a76b\": container with ID starting with 342d34b802312a9117fe8f37e6fafedac9bfb368f6480e321f894d4056a0a76b not found: ID does not exist" containerID="342d34b802312a9117fe8f37e6fafedac9bfb368f6480e321f894d4056a0a76b" Oct 12 05:44:31 crc kubenswrapper[4930]: I1012 05:44:31.639085 4930 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"342d34b802312a9117fe8f37e6fafedac9bfb368f6480e321f894d4056a0a76b"} err="failed to get container status \"342d34b802312a9117fe8f37e6fafedac9bfb368f6480e321f894d4056a0a76b\": rpc error: code = NotFound desc = could not find container \"342d34b802312a9117fe8f37e6fafedac9bfb368f6480e321f894d4056a0a76b\": container with ID starting with 342d34b802312a9117fe8f37e6fafedac9bfb368f6480e321f894d4056a0a76b not found: ID does not exist" Oct 12 05:44:31 crc kubenswrapper[4930]: I1012 05:44:31.639138 4930 scope.go:117] "RemoveContainer" containerID="6bd62ee1f0dad9da77cb5e3cc43f0068b275ffa971f69182960dc8c63e6db373" Oct 12 05:44:31 crc kubenswrapper[4930]: E1012 05:44:31.639414 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bd62ee1f0dad9da77cb5e3cc43f0068b275ffa971f69182960dc8c63e6db373\": container with ID starting with 6bd62ee1f0dad9da77cb5e3cc43f0068b275ffa971f69182960dc8c63e6db373 not found: ID does not exist" containerID="6bd62ee1f0dad9da77cb5e3cc43f0068b275ffa971f69182960dc8c63e6db373" Oct 12 05:44:31 crc kubenswrapper[4930]: I1012 05:44:31.639433 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bd62ee1f0dad9da77cb5e3cc43f0068b275ffa971f69182960dc8c63e6db373"} err="failed to get container status \"6bd62ee1f0dad9da77cb5e3cc43f0068b275ffa971f69182960dc8c63e6db373\": rpc error: code = NotFound desc = could not find container \"6bd62ee1f0dad9da77cb5e3cc43f0068b275ffa971f69182960dc8c63e6db373\": container with ID starting with 6bd62ee1f0dad9da77cb5e3cc43f0068b275ffa971f69182960dc8c63e6db373 not found: ID does not exist" Oct 12 05:44:31 crc kubenswrapper[4930]: I1012 05:44:31.639445 4930 scope.go:117] "RemoveContainer" containerID="a08834a16cc8ae033cfda2c844e730f4322851a7113cd6512e6b538c7acbf78f" Oct 12 05:44:31 crc kubenswrapper[4930]: E1012 05:44:31.639703 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a08834a16cc8ae033cfda2c844e730f4322851a7113cd6512e6b538c7acbf78f\": container with ID starting with a08834a16cc8ae033cfda2c844e730f4322851a7113cd6512e6b538c7acbf78f not found: ID does not exist" containerID="a08834a16cc8ae033cfda2c844e730f4322851a7113cd6512e6b538c7acbf78f" Oct 12 05:44:31 crc kubenswrapper[4930]: I1012 05:44:31.639770 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a08834a16cc8ae033cfda2c844e730f4322851a7113cd6512e6b538c7acbf78f"} err="failed to get container status \"a08834a16cc8ae033cfda2c844e730f4322851a7113cd6512e6b538c7acbf78f\": rpc error: code = NotFound desc = could not find container \"a08834a16cc8ae033cfda2c844e730f4322851a7113cd6512e6b538c7acbf78f\": container with ID starting with a08834a16cc8ae033cfda2c844e730f4322851a7113cd6512e6b538c7acbf78f not found: ID does not exist" Oct 12 05:44:32 crc kubenswrapper[4930]: I1012 05:44:32.158269 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3086651e-154b-4f50-98fd-727d6ede2e3c" path="/var/lib/kubelet/pods/3086651e-154b-4f50-98fd-727d6ede2e3c/volumes" Oct 12 05:44:33 crc kubenswrapper[4930]: I1012 05:44:33.669234 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 05:44:33 crc kubenswrapper[4930]: I1012 05:44:33.669530 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 05:44:33 crc kubenswrapper[4930]: I1012 05:44:33.669577 4930 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 05:44:33 crc kubenswrapper[4930]: I1012 05:44:33.670146 4930 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701"} pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 05:44:33 crc kubenswrapper[4930]: I1012 05:44:33.670198 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" containerID="cri-o://2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701" gracePeriod=600 Oct 12 05:44:34 crc kubenswrapper[4930]: I1012 05:44:34.583095 4930 generic.go:334] "Generic (PLEG): container finished" podID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerID="2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701" exitCode=0 Oct 12 05:44:34 crc kubenswrapper[4930]: I1012 05:44:34.583187 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerDied","Data":"2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701"} Oct 12 05:44:34 crc kubenswrapper[4930]: I1012 05:44:34.583776 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerStarted","Data":"7434a5664387f355e60869203e8ba53d5bb3922f3950f2fa26ad54baf1b367ce"} Oct 12 05:44:50 crc kubenswrapper[4930]: I1012 05:44:50.731509 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" podUID="ef886351-d776-447e-ac43-1c16568cac4f" containerName="oauth-openshift" containerID="cri-o://4fc010d9104846bcfb5c6059ec440b8a132b5fed8d48728e8c2e5d14d744923b" gracePeriod=15 Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.197182 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.247497 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-76766fc778-8kmq5"] Oct 12 05:44:51 crc kubenswrapper[4930]: E1012 05:44:51.247825 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="390607e9-0185-4b5b-a006-70ecef170a3b" containerName="extract-utilities" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.247846 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="390607e9-0185-4b5b-a006-70ecef170a3b" containerName="extract-utilities" Oct 12 05:44:51 crc kubenswrapper[4930]: E1012 05:44:51.247861 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3086651e-154b-4f50-98fd-727d6ede2e3c" containerName="extract-content" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.247874 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="3086651e-154b-4f50-98fd-727d6ede2e3c" containerName="extract-content" Oct 12 05:44:51 crc kubenswrapper[4930]: E1012 05:44:51.247894 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed" containerName="pruner" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.247909 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed" containerName="pruner" Oct 12 05:44:51 crc kubenswrapper[4930]: E1012 05:44:51.247924 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="390607e9-0185-4b5b-a006-70ecef170a3b" containerName="extract-content" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.247935 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="390607e9-0185-4b5b-a006-70ecef170a3b" containerName="extract-content" Oct 12 05:44:51 crc kubenswrapper[4930]: E1012 05:44:51.247953 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e303adcd-2f97-4c4d-8bb4-77f7263ab93b" containerName="extract-content" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.247965 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="e303adcd-2f97-4c4d-8bb4-77f7263ab93b" containerName="extract-content" Oct 12 05:44:51 crc kubenswrapper[4930]: E1012 05:44:51.247989 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef886351-d776-447e-ac43-1c16568cac4f" containerName="oauth-openshift" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.248001 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef886351-d776-447e-ac43-1c16568cac4f" containerName="oauth-openshift" Oct 12 05:44:51 crc kubenswrapper[4930]: E1012 05:44:51.248018 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e303adcd-2f97-4c4d-8bb4-77f7263ab93b" containerName="extract-utilities" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.248030 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="e303adcd-2f97-4c4d-8bb4-77f7263ab93b" containerName="extract-utilities" Oct 12 05:44:51 crc kubenswrapper[4930]: E1012 05:44:51.248047 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="997f2f95-2109-480c-833a-f9d751ff059b" containerName="pruner" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.248058 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="997f2f95-2109-480c-833a-f9d751ff059b" containerName="pruner" Oct 12 05:44:51 crc kubenswrapper[4930]: E1012 05:44:51.248074 4930 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="054cc455-fa72-4e63-a082-59b56822991e" containerName="extract-content" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.248085 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="054cc455-fa72-4e63-a082-59b56822991e" containerName="extract-content" Oct 12 05:44:51 crc kubenswrapper[4930]: E1012 05:44:51.248099 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="054cc455-fa72-4e63-a082-59b56822991e" containerName="registry-server" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.248111 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="054cc455-fa72-4e63-a082-59b56822991e" containerName="registry-server" Oct 12 05:44:51 crc kubenswrapper[4930]: E1012 05:44:51.248129 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3086651e-154b-4f50-98fd-727d6ede2e3c" containerName="registry-server" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.248141 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="3086651e-154b-4f50-98fd-727d6ede2e3c" containerName="registry-server" Oct 12 05:44:51 crc kubenswrapper[4930]: E1012 05:44:51.248159 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3086651e-154b-4f50-98fd-727d6ede2e3c" containerName="extract-utilities" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.248170 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="3086651e-154b-4f50-98fd-727d6ede2e3c" containerName="extract-utilities" Oct 12 05:44:51 crc kubenswrapper[4930]: E1012 05:44:51.248185 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="390607e9-0185-4b5b-a006-70ecef170a3b" containerName="registry-server" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.248196 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="390607e9-0185-4b5b-a006-70ecef170a3b" containerName="registry-server" Oct 12 05:44:51 crc kubenswrapper[4930]: E1012 05:44:51.248210 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="054cc455-fa72-4e63-a082-59b56822991e" containerName="extract-utilities" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.248223 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="054cc455-fa72-4e63-a082-59b56822991e" containerName="extract-utilities" Oct 12 05:44:51 crc kubenswrapper[4930]: E1012 05:44:51.248247 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e303adcd-2f97-4c4d-8bb4-77f7263ab93b" containerName="registry-server" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.248260 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="e303adcd-2f97-4c4d-8bb4-77f7263ab93b" containerName="registry-server" Oct 12 05:44:51 crc kubenswrapper[4930]: E1012 05:44:51.248278 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74be97dd-4d16-40ed-87e4-b707eccf422e" containerName="collect-profiles" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.248293 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="74be97dd-4d16-40ed-87e4-b707eccf422e" containerName="collect-profiles" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.248448 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="e303adcd-2f97-4c4d-8bb4-77f7263ab93b" containerName="registry-server" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.248467 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdeaac2c-b6ce-4e8f-8cf2-07623b7f48ed" containerName="pruner" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.248483 4930 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="997f2f95-2109-480c-833a-f9d751ff059b" containerName="pruner" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.248503 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="74be97dd-4d16-40ed-87e4-b707eccf422e" containerName="collect-profiles" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.248522 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="054cc455-fa72-4e63-a082-59b56822991e" containerName="registry-server" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.248543 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="3086651e-154b-4f50-98fd-727d6ede2e3c" containerName="registry-server" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.248559 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef886351-d776-447e-ac43-1c16568cac4f" containerName="oauth-openshift" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.248574 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="390607e9-0185-4b5b-a006-70ecef170a3b" containerName="registry-server" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.250169 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.263240 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-76766fc778-8kmq5"] Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.293324 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-user-template-provider-selection\") pod \"ef886351-d776-447e-ac43-1c16568cac4f\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.293395 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-router-certs\") pod \"ef886351-d776-447e-ac43-1c16568cac4f\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.293440 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-session\") pod \"ef886351-d776-447e-ac43-1c16568cac4f\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.293477 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-trusted-ca-bundle\") pod \"ef886351-d776-447e-ac43-1c16568cac4f\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.293532 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-user-template-login\") pod \"ef886351-d776-447e-ac43-1c16568cac4f\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.293570 4930 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-service-ca\") pod \"ef886351-d776-447e-ac43-1c16568cac4f\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.293631 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-user-idp-0-file-data\") pod \"ef886351-d776-447e-ac43-1c16568cac4f\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.293675 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-user-template-error\") pod \"ef886351-d776-447e-ac43-1c16568cac4f\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.293766 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-cliconfig\") pod \"ef886351-d776-447e-ac43-1c16568cac4f\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.293812 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef886351-d776-447e-ac43-1c16568cac4f-audit-policies\") pod \"ef886351-d776-447e-ac43-1c16568cac4f\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.293846 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef886351-d776-447e-ac43-1c16568cac4f-audit-dir\") pod \"ef886351-d776-447e-ac43-1c16568cac4f\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.293875 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-serving-cert\") pod \"ef886351-d776-447e-ac43-1c16568cac4f\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.293915 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-ocp-branding-template\") pod \"ef886351-d776-447e-ac43-1c16568cac4f\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.293947 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htwmq\" (UniqueName: \"kubernetes.io/projected/ef886351-d776-447e-ac43-1c16568cac4f-kube-api-access-htwmq\") pod \"ef886351-d776-447e-ac43-1c16568cac4f\" (UID: \"ef886351-d776-447e-ac43-1c16568cac4f\") " Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.296333 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef886351-d776-447e-ac43-1c16568cac4f-audit-dir" (OuterVolumeSpecName: "audit-dir") pod 
"ef886351-d776-447e-ac43-1c16568cac4f" (UID: "ef886351-d776-447e-ac43-1c16568cac4f"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.297457 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "ef886351-d776-447e-ac43-1c16568cac4f" (UID: "ef886351-d776-447e-ac43-1c16568cac4f"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.298423 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "ef886351-d776-447e-ac43-1c16568cac4f" (UID: "ef886351-d776-447e-ac43-1c16568cac4f"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.305637 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "ef886351-d776-447e-ac43-1c16568cac4f" (UID: "ef886351-d776-447e-ac43-1c16568cac4f"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.306092 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef886351-d776-447e-ac43-1c16568cac4f-kube-api-access-htwmq" (OuterVolumeSpecName: "kube-api-access-htwmq") pod "ef886351-d776-447e-ac43-1c16568cac4f" (UID: "ef886351-d776-447e-ac43-1c16568cac4f"). InnerVolumeSpecName "kube-api-access-htwmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.306290 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "ef886351-d776-447e-ac43-1c16568cac4f" (UID: "ef886351-d776-447e-ac43-1c16568cac4f"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.306981 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef886351-d776-447e-ac43-1c16568cac4f-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "ef886351-d776-447e-ac43-1c16568cac4f" (UID: "ef886351-d776-447e-ac43-1c16568cac4f"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.307688 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "ef886351-d776-447e-ac43-1c16568cac4f" (UID: "ef886351-d776-447e-ac43-1c16568cac4f"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.307879 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "ef886351-d776-447e-ac43-1c16568cac4f" (UID: "ef886351-d776-447e-ac43-1c16568cac4f"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.309473 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "ef886351-d776-447e-ac43-1c16568cac4f" (UID: "ef886351-d776-447e-ac43-1c16568cac4f"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.310295 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "ef886351-d776-447e-ac43-1c16568cac4f" (UID: "ef886351-d776-447e-ac43-1c16568cac4f"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.311847 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "ef886351-d776-447e-ac43-1c16568cac4f" (UID: "ef886351-d776-447e-ac43-1c16568cac4f"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.312057 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "ef886351-d776-447e-ac43-1c16568cac4f" (UID: "ef886351-d776-447e-ac43-1c16568cac4f"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.314028 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "ef886351-d776-447e-ac43-1c16568cac4f" (UID: "ef886351-d776-447e-ac43-1c16568cac4f"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.395967 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps6js\" (UniqueName: \"kubernetes.io/projected/eee9aa02-17a2-4214-814c-a7efa7f57954-kube-api-access-ps6js\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.396050 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.396115 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-user-template-login\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.396163 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-system-serving-cert\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.396225 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-system-service-ca\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.396280 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-system-session\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.396363 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-system-router-certs\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.396425 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.396555 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.396616 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eee9aa02-17a2-4214-814c-a7efa7f57954-audit-dir\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.396668 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-system-cliconfig\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.396723 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eee9aa02-17a2-4214-814c-a7efa7f57954-audit-policies\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.396831 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.396909 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-user-template-error\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.397055 4930 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef886351-d776-447e-ac43-1c16568cac4f-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.397091 4930 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef886351-d776-447e-ac43-1c16568cac4f-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:51 crc 
kubenswrapper[4930]: I1012 05:44:51.397112 4930 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.397133 4930 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.397154 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htwmq\" (UniqueName: \"kubernetes.io/projected/ef886351-d776-447e-ac43-1c16568cac4f-kube-api-access-htwmq\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.397173 4930 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.397194 4930 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.397214 4930 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.397233 4930 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.397252 4930 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.397272 4930 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.397292 4930 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.397311 4930 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.397335 4930 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/ef886351-d776-447e-ac43-1c16568cac4f-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.498224 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-user-template-login\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.498305 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-system-serving-cert\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.498349 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-system-service-ca\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.498394 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-system-session\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.498465 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-system-router-certs\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.498510 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.498544 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.498582 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eee9aa02-17a2-4214-814c-a7efa7f57954-audit-dir\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: 
\"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.498618 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-system-cliconfig\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.498656 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eee9aa02-17a2-4214-814c-a7efa7f57954-audit-policies\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.498697 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.498760 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-user-template-error\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.498767 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eee9aa02-17a2-4214-814c-a7efa7f57954-audit-dir\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.500274 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-system-cliconfig\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.500443 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps6js\" (UniqueName: \"kubernetes.io/projected/eee9aa02-17a2-4214-814c-a7efa7f57954-kube-api-access-ps6js\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.500483 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " 
pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.500652 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eee9aa02-17a2-4214-814c-a7efa7f57954-audit-policies\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.500927 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-system-service-ca\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.501463 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.504831 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.505263 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-user-template-login\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.505374 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.505400 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-system-serving-cert\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.506012 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-system-router-certs\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " 
pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.506447 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-system-session\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.507795 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.509271 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eee9aa02-17a2-4214-814c-a7efa7f57954-v4-0-config-user-template-error\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.529714 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps6js\" (UniqueName: \"kubernetes.io/projected/eee9aa02-17a2-4214-814c-a7efa7f57954-kube-api-access-ps6js\") pod \"oauth-openshift-76766fc778-8kmq5\" (UID: \"eee9aa02-17a2-4214-814c-a7efa7f57954\") " pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.584279 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.694518 4930 generic.go:334] "Generic (PLEG): container finished" podID="ef886351-d776-447e-ac43-1c16568cac4f" containerID="4fc010d9104846bcfb5c6059ec440b8a132b5fed8d48728e8c2e5d14d744923b" exitCode=0 Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.694576 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" event={"ID":"ef886351-d776-447e-ac43-1c16568cac4f","Type":"ContainerDied","Data":"4fc010d9104846bcfb5c6059ec440b8a132b5fed8d48728e8c2e5d14d744923b"} Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.694615 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" event={"ID":"ef886351-d776-447e-ac43-1c16568cac4f","Type":"ContainerDied","Data":"645112c8fb2436feee6f09d29307a3461b8966f9fa20747297fb4af43a79c3ef"} Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.694642 4930 scope.go:117] "RemoveContainer" containerID="4fc010d9104846bcfb5c6059ec440b8a132b5fed8d48728e8c2e5d14d744923b" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.694832 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gbqmf" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.752862 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gbqmf"] Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.755078 4930 scope.go:117] "RemoveContainer" containerID="4fc010d9104846bcfb5c6059ec440b8a132b5fed8d48728e8c2e5d14d744923b" Oct 12 05:44:51 crc kubenswrapper[4930]: E1012 05:44:51.755651 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fc010d9104846bcfb5c6059ec440b8a132b5fed8d48728e8c2e5d14d744923b\": container with ID starting with 4fc010d9104846bcfb5c6059ec440b8a132b5fed8d48728e8c2e5d14d744923b not found: ID does not exist" containerID="4fc010d9104846bcfb5c6059ec440b8a132b5fed8d48728e8c2e5d14d744923b" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.755712 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fc010d9104846bcfb5c6059ec440b8a132b5fed8d48728e8c2e5d14d744923b"} err="failed to get container status \"4fc010d9104846bcfb5c6059ec440b8a132b5fed8d48728e8c2e5d14d744923b\": rpc error: code = NotFound desc = could not find container \"4fc010d9104846bcfb5c6059ec440b8a132b5fed8d48728e8c2e5d14d744923b\": container with ID starting with 4fc010d9104846bcfb5c6059ec440b8a132b5fed8d48728e8c2e5d14d744923b not found: ID does not exist" Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.757292 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gbqmf"] Oct 12 05:44:51 crc kubenswrapper[4930]: I1012 05:44:51.880475 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-76766fc778-8kmq5"] Oct 12 05:44:51 crc kubenswrapper[4930]: W1012 05:44:51.890085 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeee9aa02_17a2_4214_814c_a7efa7f57954.slice/crio-9e49469b29efe7796e274337871f9927cf3670a98d78c25e9ee5521242b1f2bf WatchSource:0}: Error finding container 9e49469b29efe7796e274337871f9927cf3670a98d78c25e9ee5521242b1f2bf: Status 404 returned error can't find the container with id 9e49469b29efe7796e274337871f9927cf3670a98d78c25e9ee5521242b1f2bf Oct 12 05:44:52 crc kubenswrapper[4930]: I1012 05:44:52.145775 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef886351-d776-447e-ac43-1c16568cac4f" path="/var/lib/kubelet/pods/ef886351-d776-447e-ac43-1c16568cac4f/volumes" Oct 12 05:44:52 crc kubenswrapper[4930]: I1012 05:44:52.704050 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" event={"ID":"eee9aa02-17a2-4214-814c-a7efa7f57954","Type":"ContainerStarted","Data":"ca38b07e9d484c43a3c874f3aa893d4b8324126ba5b8f3a87a438cce1e342331"} Oct 12 05:44:52 crc kubenswrapper[4930]: I1012 05:44:52.704110 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" event={"ID":"eee9aa02-17a2-4214-814c-a7efa7f57954","Type":"ContainerStarted","Data":"9e49469b29efe7796e274337871f9927cf3670a98d78c25e9ee5521242b1f2bf"} Oct 12 05:44:52 crc kubenswrapper[4930]: I1012 05:44:52.704479 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" Oct 12 05:44:52 crc 
Oct 12 05:44:52 crc kubenswrapper[4930]: I1012 05:44:52.716182 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5"
Oct 12 05:44:52 crc kubenswrapper[4930]: I1012 05:44:52.737017 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-76766fc778-8kmq5" podStartSLOduration=27.736993009 podStartE2EDuration="27.736993009s" podCreationTimestamp="2025-10-12 05:44:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:44:52.734957092 +0000 UTC m=+225.277058937" watchObservedRunningTime="2025-10-12 05:44:52.736993009 +0000 UTC m=+225.279094804"
Oct 12 05:45:00 crc kubenswrapper[4930]: I1012 05:45:00.150214 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337465-t9m4c"]
Oct 12 05:45:00 crc kubenswrapper[4930]: I1012 05:45:00.153685 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337465-t9m4c"
Oct 12 05:45:00 crc kubenswrapper[4930]: I1012 05:45:00.156419 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 12 05:45:00 crc kubenswrapper[4930]: I1012 05:45:00.157899 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 12 05:45:00 crc kubenswrapper[4930]: I1012 05:45:00.159699 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337465-t9m4c"]
Oct 12 05:45:00 crc kubenswrapper[4930]: I1012 05:45:00.333828 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/78dfc24e-649b-4183-b547-83ffda66a897-secret-volume\") pod \"collect-profiles-29337465-t9m4c\" (UID: \"78dfc24e-649b-4183-b547-83ffda66a897\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337465-t9m4c"
Oct 12 05:45:00 crc kubenswrapper[4930]: I1012 05:45:00.333895 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8krt\" (UniqueName: \"kubernetes.io/projected/78dfc24e-649b-4183-b547-83ffda66a897-kube-api-access-s8krt\") pod \"collect-profiles-29337465-t9m4c\" (UID: \"78dfc24e-649b-4183-b547-83ffda66a897\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337465-t9m4c"
Oct 12 05:45:00 crc kubenswrapper[4930]: I1012 05:45:00.333955 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/78dfc24e-649b-4183-b547-83ffda66a897-config-volume\") pod \"collect-profiles-29337465-t9m4c\" (UID: \"78dfc24e-649b-4183-b547-83ffda66a897\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337465-t9m4c"
Oct 12 05:45:00 crc kubenswrapper[4930]: I1012 05:45:00.435376 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/78dfc24e-649b-4183-b547-83ffda66a897-secret-volume\") pod \"collect-profiles-29337465-t9m4c\" (UID: \"78dfc24e-649b-4183-b547-83ffda66a897\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337465-t9m4c"
Oct 12 05:45:00 crc kubenswrapper[4930]: I1012 05:45:00.435797 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8krt\" (UniqueName: \"kubernetes.io/projected/78dfc24e-649b-4183-b547-83ffda66a897-kube-api-access-s8krt\") pod \"collect-profiles-29337465-t9m4c\" (UID: \"78dfc24e-649b-4183-b547-83ffda66a897\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337465-t9m4c"
Oct 12 05:45:00 crc kubenswrapper[4930]: I1012 05:45:00.435962 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/78dfc24e-649b-4183-b547-83ffda66a897-config-volume\") pod \"collect-profiles-29337465-t9m4c\" (UID: \"78dfc24e-649b-4183-b547-83ffda66a897\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337465-t9m4c"
Oct 12 05:45:00 crc kubenswrapper[4930]: I1012 05:45:00.437021 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/78dfc24e-649b-4183-b547-83ffda66a897-config-volume\") pod \"collect-profiles-29337465-t9m4c\" (UID: \"78dfc24e-649b-4183-b547-83ffda66a897\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337465-t9m4c"
Oct 12 05:45:00 crc kubenswrapper[4930]: I1012 05:45:00.454407 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/78dfc24e-649b-4183-b547-83ffda66a897-secret-volume\") pod \"collect-profiles-29337465-t9m4c\" (UID: \"78dfc24e-649b-4183-b547-83ffda66a897\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337465-t9m4c"
Oct 12 05:45:00 crc kubenswrapper[4930]: I1012 05:45:00.458098 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8krt\" (UniqueName: \"kubernetes.io/projected/78dfc24e-649b-4183-b547-83ffda66a897-kube-api-access-s8krt\") pod \"collect-profiles-29337465-t9m4c\" (UID: \"78dfc24e-649b-4183-b547-83ffda66a897\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337465-t9m4c"
Oct 12 05:45:00 crc kubenswrapper[4930]: I1012 05:45:00.484939 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337465-t9m4c"
Oct 12 05:45:00 crc kubenswrapper[4930]: I1012 05:45:00.960228 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337465-t9m4c"]
Oct 12 05:45:01 crc kubenswrapper[4930]: I1012 05:45:01.768112 4930 generic.go:334] "Generic (PLEG): container finished" podID="78dfc24e-649b-4183-b547-83ffda66a897" containerID="50cd0190e3ea9244e0bce44eaecb6bd9ceefed270d2ceb61875f3cc8cccb804a" exitCode=0
Oct 12 05:45:01 crc kubenswrapper[4930]: I1012 05:45:01.768467 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337465-t9m4c" event={"ID":"78dfc24e-649b-4183-b547-83ffda66a897","Type":"ContainerDied","Data":"50cd0190e3ea9244e0bce44eaecb6bd9ceefed270d2ceb61875f3cc8cccb804a"}
Oct 12 05:45:01 crc kubenswrapper[4930]: I1012 05:45:01.768496 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337465-t9m4c" event={"ID":"78dfc24e-649b-4183-b547-83ffda66a897","Type":"ContainerStarted","Data":"8052b989190d16515787316ddda517ab3720b6a52485476880ebb8ec3dc13dad"}
Oct 12 05:45:03 crc kubenswrapper[4930]: I1012 05:45:03.038343 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337465-t9m4c"
Oct 12 05:45:03 crc kubenswrapper[4930]: I1012 05:45:03.179233 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/78dfc24e-649b-4183-b547-83ffda66a897-config-volume\") pod \"78dfc24e-649b-4183-b547-83ffda66a897\" (UID: \"78dfc24e-649b-4183-b547-83ffda66a897\") "
Oct 12 05:45:03 crc kubenswrapper[4930]: I1012 05:45:03.179312 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/78dfc24e-649b-4183-b547-83ffda66a897-secret-volume\") pod \"78dfc24e-649b-4183-b547-83ffda66a897\" (UID: \"78dfc24e-649b-4183-b547-83ffda66a897\") "
Oct 12 05:45:03 crc kubenswrapper[4930]: I1012 05:45:03.179404 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8krt\" (UniqueName: \"kubernetes.io/projected/78dfc24e-649b-4183-b547-83ffda66a897-kube-api-access-s8krt\") pod \"78dfc24e-649b-4183-b547-83ffda66a897\" (UID: \"78dfc24e-649b-4183-b547-83ffda66a897\") "
Oct 12 05:45:03 crc kubenswrapper[4930]: I1012 05:45:03.180256 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78dfc24e-649b-4183-b547-83ffda66a897-config-volume" (OuterVolumeSpecName: "config-volume") pod "78dfc24e-649b-4183-b547-83ffda66a897" (UID: "78dfc24e-649b-4183-b547-83ffda66a897"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 05:45:03 crc kubenswrapper[4930]: I1012 05:45:03.184904 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78dfc24e-649b-4183-b547-83ffda66a897-kube-api-access-s8krt" (OuterVolumeSpecName: "kube-api-access-s8krt") pod "78dfc24e-649b-4183-b547-83ffda66a897" (UID: "78dfc24e-649b-4183-b547-83ffda66a897"). InnerVolumeSpecName "kube-api-access-s8krt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 05:45:03 crc kubenswrapper[4930]: I1012 05:45:03.185263 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78dfc24e-649b-4183-b547-83ffda66a897-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "78dfc24e-649b-4183-b547-83ffda66a897" (UID: "78dfc24e-649b-4183-b547-83ffda66a897"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 05:45:03 crc kubenswrapper[4930]: I1012 05:45:03.280913 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8krt\" (UniqueName: \"kubernetes.io/projected/78dfc24e-649b-4183-b547-83ffda66a897-kube-api-access-s8krt\") on node \"crc\" DevicePath \"\""
Oct 12 05:45:03 crc kubenswrapper[4930]: I1012 05:45:03.281025 4930 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/78dfc24e-649b-4183-b547-83ffda66a897-config-volume\") on node \"crc\" DevicePath \"\""
Oct 12 05:45:03 crc kubenswrapper[4930]: I1012 05:45:03.281044 4930 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/78dfc24e-649b-4183-b547-83ffda66a897-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 12 05:45:03 crc kubenswrapper[4930]: I1012 05:45:03.780751 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337465-t9m4c" event={"ID":"78dfc24e-649b-4183-b547-83ffda66a897","Type":"ContainerDied","Data":"8052b989190d16515787316ddda517ab3720b6a52485476880ebb8ec3dc13dad"}
Oct 12 05:45:03 crc kubenswrapper[4930]: I1012 05:45:03.780789 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8052b989190d16515787316ddda517ab3720b6a52485476880ebb8ec3dc13dad"
Oct 12 05:45:03 crc kubenswrapper[4930]: I1012 05:45:03.780863 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337465-t9m4c"
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337465-t9m4c" Oct 12 05:45:05 crc kubenswrapper[4930]: I1012 05:45:05.737645 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8wdhk"] Oct 12 05:45:05 crc kubenswrapper[4930]: I1012 05:45:05.738454 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8wdhk" podUID="54acd3fa-9208-450c-9a6a-4bae6962c325" containerName="registry-server" containerID="cri-o://145b675bc067d35b38e5cd864995c6a9a845ff73bbcb5eb405b22713b3f5d4e4" gracePeriod=30 Oct 12 05:45:05 crc kubenswrapper[4930]: I1012 05:45:05.757138 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xvjv6"] Oct 12 05:45:05 crc kubenswrapper[4930]: I1012 05:45:05.757512 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xvjv6" podUID="deece869-6cf9-4922-b75a-294c828c6e9e" containerName="registry-server" containerID="cri-o://0bd46b9577918758a6ed15658c75081322b397febd13e5338a664eca151808e0" gracePeriod=30 Oct 12 05:45:05 crc kubenswrapper[4930]: I1012 05:45:05.762805 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bs29k"] Oct 12 05:45:05 crc kubenswrapper[4930]: I1012 05:45:05.763010 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-bs29k" podUID="a51e2785-1c3d-4354-a071-dadb05075c68" containerName="marketplace-operator" containerID="cri-o://8efdde2ce6389e72ad82f1eb5cc2b576bbf55fbaf5f135dbc92a4179d946f76e" gracePeriod=30 Oct 12 05:45:05 crc kubenswrapper[4930]: I1012 05:45:05.776347 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vqgr"] Oct 12 05:45:05 crc kubenswrapper[4930]: I1012 05:45:05.776662 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9vqgr" podUID="e085d670-8038-4828-acb6-dddc74a33655" containerName="registry-server" containerID="cri-o://52f5333b79ef9ba518babc5712e2c9e74bc8edb1e5ad1d1c2ed44e2b78d9605e" gracePeriod=30 Oct 12 05:45:05 crc kubenswrapper[4930]: I1012 05:45:05.785492 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xxb6v"] Oct 12 05:45:05 crc kubenswrapper[4930]: I1012 05:45:05.785803 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xxb6v" podUID="abe3804c-7b56-43f4-be75-206e80471232" containerName="registry-server" containerID="cri-o://637133fadd4a9e2255f70b2c6bd88f6a4042364db51ff9a32e870a57f3d67666" gracePeriod=30 Oct 12 05:45:05 crc kubenswrapper[4930]: I1012 05:45:05.857231 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-c9wvx"] Oct 12 05:45:05 crc kubenswrapper[4930]: E1012 05:45:05.857437 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78dfc24e-649b-4183-b547-83ffda66a897" containerName="collect-profiles" Oct 12 05:45:05 crc kubenswrapper[4930]: I1012 05:45:05.857448 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="78dfc24e-649b-4183-b547-83ffda66a897" containerName="collect-profiles" Oct 12 05:45:05 crc kubenswrapper[4930]: I1012 05:45:05.857543 4930 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="78dfc24e-649b-4183-b547-83ffda66a897" containerName="collect-profiles" Oct 12 05:45:05 crc kubenswrapper[4930]: I1012 05:45:05.857934 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-c9wvx" Oct 12 05:45:05 crc kubenswrapper[4930]: I1012 05:45:05.913807 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-c9wvx"] Oct 12 05:45:05 crc kubenswrapper[4930]: I1012 05:45:05.948556 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cttst\" (UniqueName: \"kubernetes.io/projected/00ab86fc-c038-4dc6-aaf5-6eac7c953d24-kube-api-access-cttst\") pod \"marketplace-operator-79b997595-c9wvx\" (UID: \"00ab86fc-c038-4dc6-aaf5-6eac7c953d24\") " pod="openshift-marketplace/marketplace-operator-79b997595-c9wvx" Oct 12 05:45:05 crc kubenswrapper[4930]: I1012 05:45:05.948639 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00ab86fc-c038-4dc6-aaf5-6eac7c953d24-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-c9wvx\" (UID: \"00ab86fc-c038-4dc6-aaf5-6eac7c953d24\") " pod="openshift-marketplace/marketplace-operator-79b997595-c9wvx" Oct 12 05:45:05 crc kubenswrapper[4930]: I1012 05:45:05.948661 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/00ab86fc-c038-4dc6-aaf5-6eac7c953d24-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-c9wvx\" (UID: \"00ab86fc-c038-4dc6-aaf5-6eac7c953d24\") " pod="openshift-marketplace/marketplace-operator-79b997595-c9wvx" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.050937 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00ab86fc-c038-4dc6-aaf5-6eac7c953d24-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-c9wvx\" (UID: \"00ab86fc-c038-4dc6-aaf5-6eac7c953d24\") " pod="openshift-marketplace/marketplace-operator-79b997595-c9wvx" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.050982 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/00ab86fc-c038-4dc6-aaf5-6eac7c953d24-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-c9wvx\" (UID: \"00ab86fc-c038-4dc6-aaf5-6eac7c953d24\") " pod="openshift-marketplace/marketplace-operator-79b997595-c9wvx" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.051051 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cttst\" (UniqueName: \"kubernetes.io/projected/00ab86fc-c038-4dc6-aaf5-6eac7c953d24-kube-api-access-cttst\") pod \"marketplace-operator-79b997595-c9wvx\" (UID: \"00ab86fc-c038-4dc6-aaf5-6eac7c953d24\") " pod="openshift-marketplace/marketplace-operator-79b997595-c9wvx" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.052782 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00ab86fc-c038-4dc6-aaf5-6eac7c953d24-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-c9wvx\" (UID: \"00ab86fc-c038-4dc6-aaf5-6eac7c953d24\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-c9wvx" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.058869 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/00ab86fc-c038-4dc6-aaf5-6eac7c953d24-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-c9wvx\" (UID: \"00ab86fc-c038-4dc6-aaf5-6eac7c953d24\") " pod="openshift-marketplace/marketplace-operator-79b997595-c9wvx" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.078037 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cttst\" (UniqueName: \"kubernetes.io/projected/00ab86fc-c038-4dc6-aaf5-6eac7c953d24-kube-api-access-cttst\") pod \"marketplace-operator-79b997595-c9wvx\" (UID: \"00ab86fc-c038-4dc6-aaf5-6eac7c953d24\") " pod="openshift-marketplace/marketplace-operator-79b997595-c9wvx" Oct 12 05:45:06 crc kubenswrapper[4930]: E1012 05:45:06.149256 4930 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0bd46b9577918758a6ed15658c75081322b397febd13e5338a664eca151808e0 is running failed: container process not found" containerID="0bd46b9577918758a6ed15658c75081322b397febd13e5338a664eca151808e0" cmd=["grpc_health_probe","-addr=:50051"] Oct 12 05:45:06 crc kubenswrapper[4930]: E1012 05:45:06.149537 4930 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0bd46b9577918758a6ed15658c75081322b397febd13e5338a664eca151808e0 is running failed: container process not found" containerID="0bd46b9577918758a6ed15658c75081322b397febd13e5338a664eca151808e0" cmd=["grpc_health_probe","-addr=:50051"] Oct 12 05:45:06 crc kubenswrapper[4930]: E1012 05:45:06.149758 4930 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0bd46b9577918758a6ed15658c75081322b397febd13e5338a664eca151808e0 is running failed: container process not found" containerID="0bd46b9577918758a6ed15658c75081322b397febd13e5338a664eca151808e0" cmd=["grpc_health_probe","-addr=:50051"] Oct 12 05:45:06 crc kubenswrapper[4930]: E1012 05:45:06.149783 4930 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0bd46b9577918758a6ed15658c75081322b397febd13e5338a664eca151808e0 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-xvjv6" podUID="deece869-6cf9-4922-b75a-294c828c6e9e" containerName="registry-server" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.242176 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-c9wvx" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.264998 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bs29k" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.292195 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xvjv6" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.316034 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xxb6v" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.316420 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8wdhk" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.320760 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vqgr" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.374310 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e085d670-8038-4828-acb6-dddc74a33655-utilities\") pod \"e085d670-8038-4828-acb6-dddc74a33655\" (UID: \"e085d670-8038-4828-acb6-dddc74a33655\") " Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.374377 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abe3804c-7b56-43f4-be75-206e80471232-catalog-content\") pod \"abe3804c-7b56-43f4-be75-206e80471232\" (UID: \"abe3804c-7b56-43f4-be75-206e80471232\") " Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.374417 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlfh7\" (UniqueName: \"kubernetes.io/projected/abe3804c-7b56-43f4-be75-206e80471232-kube-api-access-wlfh7\") pod \"abe3804c-7b56-43f4-be75-206e80471232\" (UID: \"abe3804c-7b56-43f4-be75-206e80471232\") " Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.374447 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abe3804c-7b56-43f4-be75-206e80471232-utilities\") pod \"abe3804c-7b56-43f4-be75-206e80471232\" (UID: \"abe3804c-7b56-43f4-be75-206e80471232\") " Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.374482 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nbf2\" (UniqueName: \"kubernetes.io/projected/a51e2785-1c3d-4354-a071-dadb05075c68-kube-api-access-6nbf2\") pod \"a51e2785-1c3d-4354-a071-dadb05075c68\" (UID: \"a51e2785-1c3d-4354-a071-dadb05075c68\") " Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.374500 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deece869-6cf9-4922-b75a-294c828c6e9e-utilities\") pod \"deece869-6cf9-4922-b75a-294c828c6e9e\" (UID: \"deece869-6cf9-4922-b75a-294c828c6e9e\") " Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.374542 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a51e2785-1c3d-4354-a071-dadb05075c68-marketplace-operator-metrics\") pod \"a51e2785-1c3d-4354-a071-dadb05075c68\" (UID: \"a51e2785-1c3d-4354-a071-dadb05075c68\") " Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.374566 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deece869-6cf9-4922-b75a-294c828c6e9e-catalog-content\") pod \"deece869-6cf9-4922-b75a-294c828c6e9e\" (UID: \"deece869-6cf9-4922-b75a-294c828c6e9e\") " Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.374583 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/54acd3fa-9208-450c-9a6a-4bae6962c325-catalog-content\") pod \"54acd3fa-9208-450c-9a6a-4bae6962c325\" (UID: \"54acd3fa-9208-450c-9a6a-4bae6962c325\") " Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.374609 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a51e2785-1c3d-4354-a071-dadb05075c68-marketplace-trusted-ca\") pod \"a51e2785-1c3d-4354-a071-dadb05075c68\" (UID: \"a51e2785-1c3d-4354-a071-dadb05075c68\") " Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.374631 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e085d670-8038-4828-acb6-dddc74a33655-catalog-content\") pod \"e085d670-8038-4828-acb6-dddc74a33655\" (UID: \"e085d670-8038-4828-acb6-dddc74a33655\") " Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.374655 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlw4t\" (UniqueName: \"kubernetes.io/projected/54acd3fa-9208-450c-9a6a-4bae6962c325-kube-api-access-qlw4t\") pod \"54acd3fa-9208-450c-9a6a-4bae6962c325\" (UID: \"54acd3fa-9208-450c-9a6a-4bae6962c325\") " Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.374672 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq7rv\" (UniqueName: \"kubernetes.io/projected/e085d670-8038-4828-acb6-dddc74a33655-kube-api-access-kq7rv\") pod \"e085d670-8038-4828-acb6-dddc74a33655\" (UID: \"e085d670-8038-4828-acb6-dddc74a33655\") " Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.374695 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvkvn\" (UniqueName: \"kubernetes.io/projected/deece869-6cf9-4922-b75a-294c828c6e9e-kube-api-access-vvkvn\") pod \"deece869-6cf9-4922-b75a-294c828c6e9e\" (UID: \"deece869-6cf9-4922-b75a-294c828c6e9e\") " Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.374719 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54acd3fa-9208-450c-9a6a-4bae6962c325-utilities\") pod \"54acd3fa-9208-450c-9a6a-4bae6962c325\" (UID: \"54acd3fa-9208-450c-9a6a-4bae6962c325\") " Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.375148 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e085d670-8038-4828-acb6-dddc74a33655-utilities" (OuterVolumeSpecName: "utilities") pod "e085d670-8038-4828-acb6-dddc74a33655" (UID: "e085d670-8038-4828-acb6-dddc74a33655"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.375844 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54acd3fa-9208-450c-9a6a-4bae6962c325-utilities" (OuterVolumeSpecName: "utilities") pod "54acd3fa-9208-450c-9a6a-4bae6962c325" (UID: "54acd3fa-9208-450c-9a6a-4bae6962c325"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.377453 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a51e2785-1c3d-4354-a071-dadb05075c68-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "a51e2785-1c3d-4354-a071-dadb05075c68" (UID: "a51e2785-1c3d-4354-a071-dadb05075c68"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.382257 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54acd3fa-9208-450c-9a6a-4bae6962c325-kube-api-access-qlw4t" (OuterVolumeSpecName: "kube-api-access-qlw4t") pod "54acd3fa-9208-450c-9a6a-4bae6962c325" (UID: "54acd3fa-9208-450c-9a6a-4bae6962c325"). InnerVolumeSpecName "kube-api-access-qlw4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.384708 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abe3804c-7b56-43f4-be75-206e80471232-kube-api-access-wlfh7" (OuterVolumeSpecName: "kube-api-access-wlfh7") pod "abe3804c-7b56-43f4-be75-206e80471232" (UID: "abe3804c-7b56-43f4-be75-206e80471232"). InnerVolumeSpecName "kube-api-access-wlfh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.385545 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deece869-6cf9-4922-b75a-294c828c6e9e-utilities" (OuterVolumeSpecName: "utilities") pod "deece869-6cf9-4922-b75a-294c828c6e9e" (UID: "deece869-6cf9-4922-b75a-294c828c6e9e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.386254 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abe3804c-7b56-43f4-be75-206e80471232-utilities" (OuterVolumeSpecName: "utilities") pod "abe3804c-7b56-43f4-be75-206e80471232" (UID: "abe3804c-7b56-43f4-be75-206e80471232"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.390243 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a51e2785-1c3d-4354-a071-dadb05075c68-kube-api-access-6nbf2" (OuterVolumeSpecName: "kube-api-access-6nbf2") pod "a51e2785-1c3d-4354-a071-dadb05075c68" (UID: "a51e2785-1c3d-4354-a071-dadb05075c68"). InnerVolumeSpecName "kube-api-access-6nbf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.391873 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deece869-6cf9-4922-b75a-294c828c6e9e-kube-api-access-vvkvn" (OuterVolumeSpecName: "kube-api-access-vvkvn") pod "deece869-6cf9-4922-b75a-294c828c6e9e" (UID: "deece869-6cf9-4922-b75a-294c828c6e9e"). InnerVolumeSpecName "kube-api-access-vvkvn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.391929 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e085d670-8038-4828-acb6-dddc74a33655-kube-api-access-kq7rv" (OuterVolumeSpecName: "kube-api-access-kq7rv") pod "e085d670-8038-4828-acb6-dddc74a33655" (UID: "e085d670-8038-4828-acb6-dddc74a33655"). InnerVolumeSpecName "kube-api-access-kq7rv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.392293 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a51e2785-1c3d-4354-a071-dadb05075c68-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "a51e2785-1c3d-4354-a071-dadb05075c68" (UID: "a51e2785-1c3d-4354-a071-dadb05075c68"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.397087 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e085d670-8038-4828-acb6-dddc74a33655-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e085d670-8038-4828-acb6-dddc74a33655" (UID: "e085d670-8038-4828-acb6-dddc74a33655"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.425199 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54acd3fa-9208-450c-9a6a-4bae6962c325-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54acd3fa-9208-450c-9a6a-4bae6962c325" (UID: "54acd3fa-9208-450c-9a6a-4bae6962c325"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.438630 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deece869-6cf9-4922-b75a-294c828c6e9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "deece869-6cf9-4922-b75a-294c828c6e9e" (UID: "deece869-6cf9-4922-b75a-294c828c6e9e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.475660 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54acd3fa-9208-450c-9a6a-4bae6962c325-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.475690 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e085d670-8038-4828-acb6-dddc74a33655-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.475699 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlfh7\" (UniqueName: \"kubernetes.io/projected/abe3804c-7b56-43f4-be75-206e80471232-kube-api-access-wlfh7\") on node \"crc\" DevicePath \"\"" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.475709 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abe3804c-7b56-43f4-be75-206e80471232-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.475718 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nbf2\" (UniqueName: \"kubernetes.io/projected/a51e2785-1c3d-4354-a071-dadb05075c68-kube-api-access-6nbf2\") on node \"crc\" DevicePath \"\"" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.475726 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deece869-6cf9-4922-b75a-294c828c6e9e-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.475748 4930 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a51e2785-1c3d-4354-a071-dadb05075c68-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.475759 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deece869-6cf9-4922-b75a-294c828c6e9e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.475767 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54acd3fa-9208-450c-9a6a-4bae6962c325-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.475775 4930 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a51e2785-1c3d-4354-a071-dadb05075c68-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.475784 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e085d670-8038-4828-acb6-dddc74a33655-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.475792 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlw4t\" (UniqueName: \"kubernetes.io/projected/54acd3fa-9208-450c-9a6a-4bae6962c325-kube-api-access-qlw4t\") on node \"crc\" DevicePath \"\"" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.475801 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq7rv\" (UniqueName: 
\"kubernetes.io/projected/e085d670-8038-4828-acb6-dddc74a33655-kube-api-access-kq7rv\") on node \"crc\" DevicePath \"\"" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.475812 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvkvn\" (UniqueName: \"kubernetes.io/projected/deece869-6cf9-4922-b75a-294c828c6e9e-kube-api-access-vvkvn\") on node \"crc\" DevicePath \"\"" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.476492 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-c9wvx"] Oct 12 05:45:06 crc kubenswrapper[4930]: W1012 05:45:06.486326 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00ab86fc_c038_4dc6_aaf5_6eac7c953d24.slice/crio-0709eb986dbbbd87ab542c4c13085b7960e9e2bac98715bbf3c19e6b04cd32d6 WatchSource:0}: Error finding container 0709eb986dbbbd87ab542c4c13085b7960e9e2bac98715bbf3c19e6b04cd32d6: Status 404 returned error can't find the container with id 0709eb986dbbbd87ab542c4c13085b7960e9e2bac98715bbf3c19e6b04cd32d6 Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.508766 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abe3804c-7b56-43f4-be75-206e80471232-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "abe3804c-7b56-43f4-be75-206e80471232" (UID: "abe3804c-7b56-43f4-be75-206e80471232"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.576467 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abe3804c-7b56-43f4-be75-206e80471232-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.841301 4930 generic.go:334] "Generic (PLEG): container finished" podID="deece869-6cf9-4922-b75a-294c828c6e9e" containerID="0bd46b9577918758a6ed15658c75081322b397febd13e5338a664eca151808e0" exitCode=0 Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.841373 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvjv6" event={"ID":"deece869-6cf9-4922-b75a-294c828c6e9e","Type":"ContainerDied","Data":"0bd46b9577918758a6ed15658c75081322b397febd13e5338a664eca151808e0"} Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.841402 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvjv6" event={"ID":"deece869-6cf9-4922-b75a-294c828c6e9e","Type":"ContainerDied","Data":"33c422288a82d81935f37b170503a392d369f96e1973511e6b38e0dbc97c9b56"} Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.841421 4930 scope.go:117] "RemoveContainer" containerID="0bd46b9577918758a6ed15658c75081322b397febd13e5338a664eca151808e0" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.841429 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xvjv6" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.844848 4930 generic.go:334] "Generic (PLEG): container finished" podID="e085d670-8038-4828-acb6-dddc74a33655" containerID="52f5333b79ef9ba518babc5712e2c9e74bc8edb1e5ad1d1c2ed44e2b78d9605e" exitCode=0 Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.844954 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vqgr" event={"ID":"e085d670-8038-4828-acb6-dddc74a33655","Type":"ContainerDied","Data":"52f5333b79ef9ba518babc5712e2c9e74bc8edb1e5ad1d1c2ed44e2b78d9605e"} Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.844993 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vqgr" event={"ID":"e085d670-8038-4828-acb6-dddc74a33655","Type":"ContainerDied","Data":"b851fac227366e817d3ca6132d22cf2d8c6722bcee4dbe33753141d59f1c645c"} Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.845012 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vqgr" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.847343 4930 generic.go:334] "Generic (PLEG): container finished" podID="a51e2785-1c3d-4354-a071-dadb05075c68" containerID="8efdde2ce6389e72ad82f1eb5cc2b576bbf55fbaf5f135dbc92a4179d946f76e" exitCode=0 Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.847381 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bs29k" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.847394 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bs29k" event={"ID":"a51e2785-1c3d-4354-a071-dadb05075c68","Type":"ContainerDied","Data":"8efdde2ce6389e72ad82f1eb5cc2b576bbf55fbaf5f135dbc92a4179d946f76e"} Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.847409 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bs29k" event={"ID":"a51e2785-1c3d-4354-a071-dadb05075c68","Type":"ContainerDied","Data":"d26c532eedf22d2fb6d8aa58bc15884c5b780b34d4bd50a10f4744b5ace41db0"} Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.854391 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-c9wvx" event={"ID":"00ab86fc-c038-4dc6-aaf5-6eac7c953d24","Type":"ContainerStarted","Data":"f7132e747f2147a5a91ab457fc9f9151cbfbd5aa13fb0565bbcdf5d37e3efbcc"} Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.854431 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-c9wvx" event={"ID":"00ab86fc-c038-4dc6-aaf5-6eac7c953d24","Type":"ContainerStarted","Data":"0709eb986dbbbd87ab542c4c13085b7960e9e2bac98715bbf3c19e6b04cd32d6"} Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.855355 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-c9wvx" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.855416 4930 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-c9wvx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.56:8080/healthz\": dial tcp 10.217.0.56:8080: connect: connection refused" start-of-body= Oct 12 05:45:06 crc 
kubenswrapper[4930]: I1012 05:45:06.855442 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-c9wvx" podUID="00ab86fc-c038-4dc6-aaf5-6eac7c953d24" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.56:8080/healthz\": dial tcp 10.217.0.56:8080: connect: connection refused" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.856573 4930 generic.go:334] "Generic (PLEG): container finished" podID="abe3804c-7b56-43f4-be75-206e80471232" containerID="637133fadd4a9e2255f70b2c6bd88f6a4042364db51ff9a32e870a57f3d67666" exitCode=0 Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.856624 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xxb6v" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.856630 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxb6v" event={"ID":"abe3804c-7b56-43f4-be75-206e80471232","Type":"ContainerDied","Data":"637133fadd4a9e2255f70b2c6bd88f6a4042364db51ff9a32e870a57f3d67666"} Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.856727 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxb6v" event={"ID":"abe3804c-7b56-43f4-be75-206e80471232","Type":"ContainerDied","Data":"ae1c8b94b1a8bb350f33a0166cdd82162b5af09667ef055e07b09d45ea10485a"} Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.859206 4930 generic.go:334] "Generic (PLEG): container finished" podID="54acd3fa-9208-450c-9a6a-4bae6962c325" containerID="145b675bc067d35b38e5cd864995c6a9a845ff73bbcb5eb405b22713b3f5d4e4" exitCode=0 Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.859240 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8wdhk" event={"ID":"54acd3fa-9208-450c-9a6a-4bae6962c325","Type":"ContainerDied","Data":"145b675bc067d35b38e5cd864995c6a9a845ff73bbcb5eb405b22713b3f5d4e4"} Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.859263 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8wdhk" event={"ID":"54acd3fa-9208-450c-9a6a-4bae6962c325","Type":"ContainerDied","Data":"5b728964c5334400aea56ec7a0b4d636e7d35f0d1c9cd1e881637bc40f301de7"} Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.859550 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8wdhk" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.869358 4930 scope.go:117] "RemoveContainer" containerID="ac8ca8ec2ddc7771f1010bdc0ef9016553538d0b9d4b08d3fcf05aed6b3055a4" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.880703 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-c9wvx" podStartSLOduration=1.880689396 podStartE2EDuration="1.880689396s" podCreationTimestamp="2025-10-12 05:45:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:45:06.875924531 +0000 UTC m=+239.418026296" watchObservedRunningTime="2025-10-12 05:45:06.880689396 +0000 UTC m=+239.422791161" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.941808 4930 scope.go:117] "RemoveContainer" containerID="4f47e304d8de582c36f6ade26a50f3e06bafa72cab73632b8733254447f8d097" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.965685 4930 scope.go:117] "RemoveContainer" containerID="0bd46b9577918758a6ed15658c75081322b397febd13e5338a664eca151808e0" Oct 12 05:45:06 crc kubenswrapper[4930]: E1012 05:45:06.966275 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bd46b9577918758a6ed15658c75081322b397febd13e5338a664eca151808e0\": container with ID starting with 0bd46b9577918758a6ed15658c75081322b397febd13e5338a664eca151808e0 not found: ID does not exist" containerID="0bd46b9577918758a6ed15658c75081322b397febd13e5338a664eca151808e0" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.966308 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bd46b9577918758a6ed15658c75081322b397febd13e5338a664eca151808e0"} err="failed to get container status \"0bd46b9577918758a6ed15658c75081322b397febd13e5338a664eca151808e0\": rpc error: code = NotFound desc = could not find container \"0bd46b9577918758a6ed15658c75081322b397febd13e5338a664eca151808e0\": container with ID starting with 0bd46b9577918758a6ed15658c75081322b397febd13e5338a664eca151808e0 not found: ID does not exist" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.966330 4930 scope.go:117] "RemoveContainer" containerID="ac8ca8ec2ddc7771f1010bdc0ef9016553538d0b9d4b08d3fcf05aed6b3055a4" Oct 12 05:45:06 crc kubenswrapper[4930]: E1012 05:45:06.967208 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac8ca8ec2ddc7771f1010bdc0ef9016553538d0b9d4b08d3fcf05aed6b3055a4\": container with ID starting with ac8ca8ec2ddc7771f1010bdc0ef9016553538d0b9d4b08d3fcf05aed6b3055a4 not found: ID does not exist" containerID="ac8ca8ec2ddc7771f1010bdc0ef9016553538d0b9d4b08d3fcf05aed6b3055a4" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.967230 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac8ca8ec2ddc7771f1010bdc0ef9016553538d0b9d4b08d3fcf05aed6b3055a4"} err="failed to get container status \"ac8ca8ec2ddc7771f1010bdc0ef9016553538d0b9d4b08d3fcf05aed6b3055a4\": rpc error: code = NotFound desc = could not find container \"ac8ca8ec2ddc7771f1010bdc0ef9016553538d0b9d4b08d3fcf05aed6b3055a4\": container with ID starting with ac8ca8ec2ddc7771f1010bdc0ef9016553538d0b9d4b08d3fcf05aed6b3055a4 not found: ID does not exist" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 
05:45:06.967244 4930 scope.go:117] "RemoveContainer" containerID="4f47e304d8de582c36f6ade26a50f3e06bafa72cab73632b8733254447f8d097" Oct 12 05:45:06 crc kubenswrapper[4930]: E1012 05:45:06.967499 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f47e304d8de582c36f6ade26a50f3e06bafa72cab73632b8733254447f8d097\": container with ID starting with 4f47e304d8de582c36f6ade26a50f3e06bafa72cab73632b8733254447f8d097 not found: ID does not exist" containerID="4f47e304d8de582c36f6ade26a50f3e06bafa72cab73632b8733254447f8d097" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.967525 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f47e304d8de582c36f6ade26a50f3e06bafa72cab73632b8733254447f8d097"} err="failed to get container status \"4f47e304d8de582c36f6ade26a50f3e06bafa72cab73632b8733254447f8d097\": rpc error: code = NotFound desc = could not find container \"4f47e304d8de582c36f6ade26a50f3e06bafa72cab73632b8733254447f8d097\": container with ID starting with 4f47e304d8de582c36f6ade26a50f3e06bafa72cab73632b8733254447f8d097 not found: ID does not exist" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.967541 4930 scope.go:117] "RemoveContainer" containerID="52f5333b79ef9ba518babc5712e2c9e74bc8edb1e5ad1d1c2ed44e2b78d9605e" Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.989205 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vqgr"] Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.991965 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vqgr"] Oct 12 05:45:06 crc kubenswrapper[4930]: I1012 05:45:06.993392 4930 scope.go:117] "RemoveContainer" containerID="0c09a956fc89f0ffd2c23fdb762c2e7357a8fc91882b5078d63f673c6eb723e5" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.011636 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8wdhk"] Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.022034 4930 scope.go:117] "RemoveContainer" containerID="72b49ebebe7e590ce849e00cf7f6ea8691173ec5465245f6479fe0a8cfbec7fb" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.026770 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8wdhk"] Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.048079 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xvjv6"] Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.057125 4930 scope.go:117] "RemoveContainer" containerID="52f5333b79ef9ba518babc5712e2c9e74bc8edb1e5ad1d1c2ed44e2b78d9605e" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.057266 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xvjv6"] Oct 12 05:45:07 crc kubenswrapper[4930]: E1012 05:45:07.058805 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52f5333b79ef9ba518babc5712e2c9e74bc8edb1e5ad1d1c2ed44e2b78d9605e\": container with ID starting with 52f5333b79ef9ba518babc5712e2c9e74bc8edb1e5ad1d1c2ed44e2b78d9605e not found: ID does not exist" containerID="52f5333b79ef9ba518babc5712e2c9e74bc8edb1e5ad1d1c2ed44e2b78d9605e" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.058839 4930 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"52f5333b79ef9ba518babc5712e2c9e74bc8edb1e5ad1d1c2ed44e2b78d9605e"} err="failed to get container status \"52f5333b79ef9ba518babc5712e2c9e74bc8edb1e5ad1d1c2ed44e2b78d9605e\": rpc error: code = NotFound desc = could not find container \"52f5333b79ef9ba518babc5712e2c9e74bc8edb1e5ad1d1c2ed44e2b78d9605e\": container with ID starting with 52f5333b79ef9ba518babc5712e2c9e74bc8edb1e5ad1d1c2ed44e2b78d9605e not found: ID does not exist" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.058865 4930 scope.go:117] "RemoveContainer" containerID="0c09a956fc89f0ffd2c23fdb762c2e7357a8fc91882b5078d63f673c6eb723e5" Oct 12 05:45:07 crc kubenswrapper[4930]: E1012 05:45:07.059999 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c09a956fc89f0ffd2c23fdb762c2e7357a8fc91882b5078d63f673c6eb723e5\": container with ID starting with 0c09a956fc89f0ffd2c23fdb762c2e7357a8fc91882b5078d63f673c6eb723e5 not found: ID does not exist" containerID="0c09a956fc89f0ffd2c23fdb762c2e7357a8fc91882b5078d63f673c6eb723e5" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.060023 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c09a956fc89f0ffd2c23fdb762c2e7357a8fc91882b5078d63f673c6eb723e5"} err="failed to get container status \"0c09a956fc89f0ffd2c23fdb762c2e7357a8fc91882b5078d63f673c6eb723e5\": rpc error: code = NotFound desc = could not find container \"0c09a956fc89f0ffd2c23fdb762c2e7357a8fc91882b5078d63f673c6eb723e5\": container with ID starting with 0c09a956fc89f0ffd2c23fdb762c2e7357a8fc91882b5078d63f673c6eb723e5 not found: ID does not exist" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.060039 4930 scope.go:117] "RemoveContainer" containerID="72b49ebebe7e590ce849e00cf7f6ea8691173ec5465245f6479fe0a8cfbec7fb" Oct 12 05:45:07 crc kubenswrapper[4930]: E1012 05:45:07.060933 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72b49ebebe7e590ce849e00cf7f6ea8691173ec5465245f6479fe0a8cfbec7fb\": container with ID starting with 72b49ebebe7e590ce849e00cf7f6ea8691173ec5465245f6479fe0a8cfbec7fb not found: ID does not exist" containerID="72b49ebebe7e590ce849e00cf7f6ea8691173ec5465245f6479fe0a8cfbec7fb" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.060983 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72b49ebebe7e590ce849e00cf7f6ea8691173ec5465245f6479fe0a8cfbec7fb"} err="failed to get container status \"72b49ebebe7e590ce849e00cf7f6ea8691173ec5465245f6479fe0a8cfbec7fb\": rpc error: code = NotFound desc = could not find container \"72b49ebebe7e590ce849e00cf7f6ea8691173ec5465245f6479fe0a8cfbec7fb\": container with ID starting with 72b49ebebe7e590ce849e00cf7f6ea8691173ec5465245f6479fe0a8cfbec7fb not found: ID does not exist" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.061009 4930 scope.go:117] "RemoveContainer" containerID="8efdde2ce6389e72ad82f1eb5cc2b576bbf55fbaf5f135dbc92a4179d946f76e" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.068173 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bs29k"] Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.071166 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bs29k"] Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.073947 4930 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xxb6v"] Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.075627 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xxb6v"] Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.075978 4930 scope.go:117] "RemoveContainer" containerID="8efdde2ce6389e72ad82f1eb5cc2b576bbf55fbaf5f135dbc92a4179d946f76e" Oct 12 05:45:07 crc kubenswrapper[4930]: E1012 05:45:07.076562 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8efdde2ce6389e72ad82f1eb5cc2b576bbf55fbaf5f135dbc92a4179d946f76e\": container with ID starting with 8efdde2ce6389e72ad82f1eb5cc2b576bbf55fbaf5f135dbc92a4179d946f76e not found: ID does not exist" containerID="8efdde2ce6389e72ad82f1eb5cc2b576bbf55fbaf5f135dbc92a4179d946f76e" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.076604 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8efdde2ce6389e72ad82f1eb5cc2b576bbf55fbaf5f135dbc92a4179d946f76e"} err="failed to get container status \"8efdde2ce6389e72ad82f1eb5cc2b576bbf55fbaf5f135dbc92a4179d946f76e\": rpc error: code = NotFound desc = could not find container \"8efdde2ce6389e72ad82f1eb5cc2b576bbf55fbaf5f135dbc92a4179d946f76e\": container with ID starting with 8efdde2ce6389e72ad82f1eb5cc2b576bbf55fbaf5f135dbc92a4179d946f76e not found: ID does not exist" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.076631 4930 scope.go:117] "RemoveContainer" containerID="637133fadd4a9e2255f70b2c6bd88f6a4042364db51ff9a32e870a57f3d67666" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.089923 4930 scope.go:117] "RemoveContainer" containerID="0be7df8747cfd61f5d45ee0aa2d0c130474d39a6eafa719052735a324056df5f" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.111093 4930 scope.go:117] "RemoveContainer" containerID="a63e26652077fa08b432f6cf4b30dde293ecdb2a2f7f48859e8134568510ac13" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.127258 4930 scope.go:117] "RemoveContainer" containerID="637133fadd4a9e2255f70b2c6bd88f6a4042364db51ff9a32e870a57f3d67666" Oct 12 05:45:07 crc kubenswrapper[4930]: E1012 05:45:07.127668 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"637133fadd4a9e2255f70b2c6bd88f6a4042364db51ff9a32e870a57f3d67666\": container with ID starting with 637133fadd4a9e2255f70b2c6bd88f6a4042364db51ff9a32e870a57f3d67666 not found: ID does not exist" containerID="637133fadd4a9e2255f70b2c6bd88f6a4042364db51ff9a32e870a57f3d67666" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.127708 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"637133fadd4a9e2255f70b2c6bd88f6a4042364db51ff9a32e870a57f3d67666"} err="failed to get container status \"637133fadd4a9e2255f70b2c6bd88f6a4042364db51ff9a32e870a57f3d67666\": rpc error: code = NotFound desc = could not find container \"637133fadd4a9e2255f70b2c6bd88f6a4042364db51ff9a32e870a57f3d67666\": container with ID starting with 637133fadd4a9e2255f70b2c6bd88f6a4042364db51ff9a32e870a57f3d67666 not found: ID does not exist" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.127750 4930 scope.go:117] "RemoveContainer" containerID="0be7df8747cfd61f5d45ee0aa2d0c130474d39a6eafa719052735a324056df5f" Oct 12 05:45:07 crc kubenswrapper[4930]: E1012 05:45:07.128136 4930 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0be7df8747cfd61f5d45ee0aa2d0c130474d39a6eafa719052735a324056df5f\": container with ID starting with 0be7df8747cfd61f5d45ee0aa2d0c130474d39a6eafa719052735a324056df5f not found: ID does not exist" containerID="0be7df8747cfd61f5d45ee0aa2d0c130474d39a6eafa719052735a324056df5f" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.128175 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0be7df8747cfd61f5d45ee0aa2d0c130474d39a6eafa719052735a324056df5f"} err="failed to get container status \"0be7df8747cfd61f5d45ee0aa2d0c130474d39a6eafa719052735a324056df5f\": rpc error: code = NotFound desc = could not find container \"0be7df8747cfd61f5d45ee0aa2d0c130474d39a6eafa719052735a324056df5f\": container with ID starting with 0be7df8747cfd61f5d45ee0aa2d0c130474d39a6eafa719052735a324056df5f not found: ID does not exist" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.128204 4930 scope.go:117] "RemoveContainer" containerID="a63e26652077fa08b432f6cf4b30dde293ecdb2a2f7f48859e8134568510ac13" Oct 12 05:45:07 crc kubenswrapper[4930]: E1012 05:45:07.128504 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a63e26652077fa08b432f6cf4b30dde293ecdb2a2f7f48859e8134568510ac13\": container with ID starting with a63e26652077fa08b432f6cf4b30dde293ecdb2a2f7f48859e8134568510ac13 not found: ID does not exist" containerID="a63e26652077fa08b432f6cf4b30dde293ecdb2a2f7f48859e8134568510ac13" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.128527 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a63e26652077fa08b432f6cf4b30dde293ecdb2a2f7f48859e8134568510ac13"} err="failed to get container status \"a63e26652077fa08b432f6cf4b30dde293ecdb2a2f7f48859e8134568510ac13\": rpc error: code = NotFound desc = could not find container \"a63e26652077fa08b432f6cf4b30dde293ecdb2a2f7f48859e8134568510ac13\": container with ID starting with a63e26652077fa08b432f6cf4b30dde293ecdb2a2f7f48859e8134568510ac13 not found: ID does not exist" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.128542 4930 scope.go:117] "RemoveContainer" containerID="145b675bc067d35b38e5cd864995c6a9a845ff73bbcb5eb405b22713b3f5d4e4" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.143416 4930 scope.go:117] "RemoveContainer" containerID="04f0e0b6485d6dcf08e8f87350c0ed2e0fde93f32018c42f28b70bf314c28bf1" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.158481 4930 scope.go:117] "RemoveContainer" containerID="1b864609006f936003f95f2bb11e3b4761f05c3be5574089377f3429b3e36369" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.175730 4930 scope.go:117] "RemoveContainer" containerID="145b675bc067d35b38e5cd864995c6a9a845ff73bbcb5eb405b22713b3f5d4e4" Oct 12 05:45:07 crc kubenswrapper[4930]: E1012 05:45:07.176130 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"145b675bc067d35b38e5cd864995c6a9a845ff73bbcb5eb405b22713b3f5d4e4\": container with ID starting with 145b675bc067d35b38e5cd864995c6a9a845ff73bbcb5eb405b22713b3f5d4e4 not found: ID does not exist" containerID="145b675bc067d35b38e5cd864995c6a9a845ff73bbcb5eb405b22713b3f5d4e4" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.176178 4930 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"145b675bc067d35b38e5cd864995c6a9a845ff73bbcb5eb405b22713b3f5d4e4"} err="failed to get container status \"145b675bc067d35b38e5cd864995c6a9a845ff73bbcb5eb405b22713b3f5d4e4\": rpc error: code = NotFound desc = could not find container \"145b675bc067d35b38e5cd864995c6a9a845ff73bbcb5eb405b22713b3f5d4e4\": container with ID starting with 145b675bc067d35b38e5cd864995c6a9a845ff73bbcb5eb405b22713b3f5d4e4 not found: ID does not exist" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.176215 4930 scope.go:117] "RemoveContainer" containerID="04f0e0b6485d6dcf08e8f87350c0ed2e0fde93f32018c42f28b70bf314c28bf1" Oct 12 05:45:07 crc kubenswrapper[4930]: E1012 05:45:07.176558 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04f0e0b6485d6dcf08e8f87350c0ed2e0fde93f32018c42f28b70bf314c28bf1\": container with ID starting with 04f0e0b6485d6dcf08e8f87350c0ed2e0fde93f32018c42f28b70bf314c28bf1 not found: ID does not exist" containerID="04f0e0b6485d6dcf08e8f87350c0ed2e0fde93f32018c42f28b70bf314c28bf1" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.176609 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04f0e0b6485d6dcf08e8f87350c0ed2e0fde93f32018c42f28b70bf314c28bf1"} err="failed to get container status \"04f0e0b6485d6dcf08e8f87350c0ed2e0fde93f32018c42f28b70bf314c28bf1\": rpc error: code = NotFound desc = could not find container \"04f0e0b6485d6dcf08e8f87350c0ed2e0fde93f32018c42f28b70bf314c28bf1\": container with ID starting with 04f0e0b6485d6dcf08e8f87350c0ed2e0fde93f32018c42f28b70bf314c28bf1 not found: ID does not exist" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.176637 4930 scope.go:117] "RemoveContainer" containerID="1b864609006f936003f95f2bb11e3b4761f05c3be5574089377f3429b3e36369" Oct 12 05:45:07 crc kubenswrapper[4930]: E1012 05:45:07.177114 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b864609006f936003f95f2bb11e3b4761f05c3be5574089377f3429b3e36369\": container with ID starting with 1b864609006f936003f95f2bb11e3b4761f05c3be5574089377f3429b3e36369 not found: ID does not exist" containerID="1b864609006f936003f95f2bb11e3b4761f05c3be5574089377f3429b3e36369" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.177138 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b864609006f936003f95f2bb11e3b4761f05c3be5574089377f3429b3e36369"} err="failed to get container status \"1b864609006f936003f95f2bb11e3b4761f05c3be5574089377f3429b3e36369\": rpc error: code = NotFound desc = could not find container \"1b864609006f936003f95f2bb11e3b4761f05c3be5574089377f3429b3e36369\": container with ID starting with 1b864609006f936003f95f2bb11e3b4761f05c3be5574089377f3429b3e36369 not found: ID does not exist" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.875874 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-c9wvx" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.955523 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m4lsn"] Oct 12 05:45:07 crc kubenswrapper[4930]: E1012 05:45:07.955722 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54acd3fa-9208-450c-9a6a-4bae6962c325" containerName="registry-server" Oct 12 05:45:07 crc kubenswrapper[4930]: 
I1012 05:45:07.955736 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="54acd3fa-9208-450c-9a6a-4bae6962c325" containerName="registry-server" Oct 12 05:45:07 crc kubenswrapper[4930]: E1012 05:45:07.955760 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e085d670-8038-4828-acb6-dddc74a33655" containerName="extract-content" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.955766 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="e085d670-8038-4828-acb6-dddc74a33655" containerName="extract-content" Oct 12 05:45:07 crc kubenswrapper[4930]: E1012 05:45:07.955773 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e085d670-8038-4828-acb6-dddc74a33655" containerName="extract-utilities" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.955781 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="e085d670-8038-4828-acb6-dddc74a33655" containerName="extract-utilities" Oct 12 05:45:07 crc kubenswrapper[4930]: E1012 05:45:07.955791 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abe3804c-7b56-43f4-be75-206e80471232" containerName="registry-server" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.955797 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe3804c-7b56-43f4-be75-206e80471232" containerName="registry-server" Oct 12 05:45:07 crc kubenswrapper[4930]: E1012 05:45:07.955809 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deece869-6cf9-4922-b75a-294c828c6e9e" containerName="extract-content" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.955814 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="deece869-6cf9-4922-b75a-294c828c6e9e" containerName="extract-content" Oct 12 05:45:07 crc kubenswrapper[4930]: E1012 05:45:07.955821 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a51e2785-1c3d-4354-a071-dadb05075c68" containerName="marketplace-operator" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.955827 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="a51e2785-1c3d-4354-a071-dadb05075c68" containerName="marketplace-operator" Oct 12 05:45:07 crc kubenswrapper[4930]: E1012 05:45:07.955835 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54acd3fa-9208-450c-9a6a-4bae6962c325" containerName="extract-utilities" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.955840 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="54acd3fa-9208-450c-9a6a-4bae6962c325" containerName="extract-utilities" Oct 12 05:45:07 crc kubenswrapper[4930]: E1012 05:45:07.955849 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deece869-6cf9-4922-b75a-294c828c6e9e" containerName="registry-server" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.955855 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="deece869-6cf9-4922-b75a-294c828c6e9e" containerName="registry-server" Oct 12 05:45:07 crc kubenswrapper[4930]: E1012 05:45:07.955861 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abe3804c-7b56-43f4-be75-206e80471232" containerName="extract-content" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.955867 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe3804c-7b56-43f4-be75-206e80471232" containerName="extract-content" Oct 12 05:45:07 crc kubenswrapper[4930]: E1012 05:45:07.955874 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deece869-6cf9-4922-b75a-294c828c6e9e" containerName="extract-utilities" 
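
The run of cpu_manager and memory_manager records above and below is the kubelet clearing stale per-container resource-manager state: each "RemoveStaleState: removing container" / "Deleted CPUSet assignment" pair drops the CPU (and then memory) bookkeeping for a container whose pod UID has been deleted, so the replacement marketplace pods admitted in the following "SyncLoop ADD" start from clean state. Below is a minimal sketch of that purge pattern, in Go since that is the kubelet's language; the types and names are simplified stand-ins chosen for illustration, not the actual pkg/kubelet/cm implementation.

package main

import "fmt"

// assignments maps podUID -> containerName -> pinned CPU set (kept as a
// string here purely for illustration).
type assignments map[string]map[string]string

// removeStaleState drops every per-container assignment whose pod UID is no
// longer in the set of active pods, mirroring the paired
// "RemoveStaleState: removing container" / "Deleted CPUSet assignment"
// records in the log above.
func removeStaleState(state assignments, activePods map[string]bool) {
	for podUID, containers := range state {
		if activePods[podUID] {
			continue // pod still active; keep its assignments
		}
		for containerName := range containers {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n",
				podUID, containerName)
		}
		delete(state, podUID) // "Deleted CPUSet assignment"
	}
}

func main() {
	// Hypothetical leftover state for one of the deleted catalog pods.
	state := assignments{
		"54acd3fa-9208-450c-9a6a-4bae6962c325": {
			"registry-server":   "0-1",
			"extract-utilities": "2",
		},
	}
	// After the five marketplace pods are removed, none of their UIDs are
	// active, so every entry above is purged.
	removeStaleState(state, map[string]bool{})
}

Run with an empty active-pod set, the sketch emits one removal line per stale container and then deletes the pod's entry, which is the shape of the record sequence seen here before the new redhat-marketplace-m4lsn pod is admitted.
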
Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.955879 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="deece869-6cf9-4922-b75a-294c828c6e9e" containerName="extract-utilities" Oct 12 05:45:07 crc kubenswrapper[4930]: E1012 05:45:07.955886 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abe3804c-7b56-43f4-be75-206e80471232" containerName="extract-utilities" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.955892 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe3804c-7b56-43f4-be75-206e80471232" containerName="extract-utilities" Oct 12 05:45:07 crc kubenswrapper[4930]: E1012 05:45:07.955902 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e085d670-8038-4828-acb6-dddc74a33655" containerName="registry-server" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.955908 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="e085d670-8038-4828-acb6-dddc74a33655" containerName="registry-server" Oct 12 05:45:07 crc kubenswrapper[4930]: E1012 05:45:07.955916 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54acd3fa-9208-450c-9a6a-4bae6962c325" containerName="extract-content" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.955922 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="54acd3fa-9208-450c-9a6a-4bae6962c325" containerName="extract-content" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.955998 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="deece869-6cf9-4922-b75a-294c828c6e9e" containerName="registry-server" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.956013 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="abe3804c-7b56-43f4-be75-206e80471232" containerName="registry-server" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.956019 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="54acd3fa-9208-450c-9a6a-4bae6962c325" containerName="registry-server" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.956030 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="a51e2785-1c3d-4354-a071-dadb05075c68" containerName="marketplace-operator" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.956037 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="e085d670-8038-4828-acb6-dddc74a33655" containerName="registry-server" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.956671 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4lsn" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.962297 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.962544 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4lsn"] Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.998255 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwswk\" (UniqueName: \"kubernetes.io/projected/0b85acba-bed0-4e85-8252-3972368d611c-kube-api-access-cwswk\") pod \"redhat-marketplace-m4lsn\" (UID: \"0b85acba-bed0-4e85-8252-3972368d611c\") " pod="openshift-marketplace/redhat-marketplace-m4lsn" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.998317 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b85acba-bed0-4e85-8252-3972368d611c-catalog-content\") pod \"redhat-marketplace-m4lsn\" (UID: \"0b85acba-bed0-4e85-8252-3972368d611c\") " pod="openshift-marketplace/redhat-marketplace-m4lsn" Oct 12 05:45:07 crc kubenswrapper[4930]: I1012 05:45:07.998407 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b85acba-bed0-4e85-8252-3972368d611c-utilities\") pod \"redhat-marketplace-m4lsn\" (UID: \"0b85acba-bed0-4e85-8252-3972368d611c\") " pod="openshift-marketplace/redhat-marketplace-m4lsn" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.099810 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b85acba-bed0-4e85-8252-3972368d611c-catalog-content\") pod \"redhat-marketplace-m4lsn\" (UID: \"0b85acba-bed0-4e85-8252-3972368d611c\") " pod="openshift-marketplace/redhat-marketplace-m4lsn" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.099891 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b85acba-bed0-4e85-8252-3972368d611c-utilities\") pod \"redhat-marketplace-m4lsn\" (UID: \"0b85acba-bed0-4e85-8252-3972368d611c\") " pod="openshift-marketplace/redhat-marketplace-m4lsn" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.099957 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwswk\" (UniqueName: \"kubernetes.io/projected/0b85acba-bed0-4e85-8252-3972368d611c-kube-api-access-cwswk\") pod \"redhat-marketplace-m4lsn\" (UID: \"0b85acba-bed0-4e85-8252-3972368d611c\") " pod="openshift-marketplace/redhat-marketplace-m4lsn" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.100622 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b85acba-bed0-4e85-8252-3972368d611c-catalog-content\") pod \"redhat-marketplace-m4lsn\" (UID: \"0b85acba-bed0-4e85-8252-3972368d611c\") " pod="openshift-marketplace/redhat-marketplace-m4lsn" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.101165 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b85acba-bed0-4e85-8252-3972368d611c-utilities\") pod \"redhat-marketplace-m4lsn\" (UID: 
\"0b85acba-bed0-4e85-8252-3972368d611c\") " pod="openshift-marketplace/redhat-marketplace-m4lsn" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.120501 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwswk\" (UniqueName: \"kubernetes.io/projected/0b85acba-bed0-4e85-8252-3972368d611c-kube-api-access-cwswk\") pod \"redhat-marketplace-m4lsn\" (UID: \"0b85acba-bed0-4e85-8252-3972368d611c\") " pod="openshift-marketplace/redhat-marketplace-m4lsn" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.143694 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54acd3fa-9208-450c-9a6a-4bae6962c325" path="/var/lib/kubelet/pods/54acd3fa-9208-450c-9a6a-4bae6962c325/volumes" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.145469 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a51e2785-1c3d-4354-a071-dadb05075c68" path="/var/lib/kubelet/pods/a51e2785-1c3d-4354-a071-dadb05075c68/volumes" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.146019 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abe3804c-7b56-43f4-be75-206e80471232" path="/var/lib/kubelet/pods/abe3804c-7b56-43f4-be75-206e80471232/volumes" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.148165 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deece869-6cf9-4922-b75a-294c828c6e9e" path="/var/lib/kubelet/pods/deece869-6cf9-4922-b75a-294c828c6e9e/volumes" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.149143 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e085d670-8038-4828-acb6-dddc74a33655" path="/var/lib/kubelet/pods/e085d670-8038-4828-acb6-dddc74a33655/volumes" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.155712 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tzfd6"] Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.156872 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tzfd6" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.159733 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.163889 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tzfd6"] Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.200723 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d98b1ef4-3cf1-4432-8b03-0c9d7118248d-catalog-content\") pod \"redhat-operators-tzfd6\" (UID: \"d98b1ef4-3cf1-4432-8b03-0c9d7118248d\") " pod="openshift-marketplace/redhat-operators-tzfd6" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.200883 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d98b1ef4-3cf1-4432-8b03-0c9d7118248d-utilities\") pod \"redhat-operators-tzfd6\" (UID: \"d98b1ef4-3cf1-4432-8b03-0c9d7118248d\") " pod="openshift-marketplace/redhat-operators-tzfd6" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.200920 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5mcm\" (UniqueName: \"kubernetes.io/projected/d98b1ef4-3cf1-4432-8b03-0c9d7118248d-kube-api-access-z5mcm\") pod \"redhat-operators-tzfd6\" (UID: \"d98b1ef4-3cf1-4432-8b03-0c9d7118248d\") " pod="openshift-marketplace/redhat-operators-tzfd6" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.280892 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.289913 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m4lsn" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.302551 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5mcm\" (UniqueName: \"kubernetes.io/projected/d98b1ef4-3cf1-4432-8b03-0c9d7118248d-kube-api-access-z5mcm\") pod \"redhat-operators-tzfd6\" (UID: \"d98b1ef4-3cf1-4432-8b03-0c9d7118248d\") " pod="openshift-marketplace/redhat-operators-tzfd6" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.302612 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d98b1ef4-3cf1-4432-8b03-0c9d7118248d-catalog-content\") pod \"redhat-operators-tzfd6\" (UID: \"d98b1ef4-3cf1-4432-8b03-0c9d7118248d\") " pod="openshift-marketplace/redhat-operators-tzfd6" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.302698 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d98b1ef4-3cf1-4432-8b03-0c9d7118248d-utilities\") pod \"redhat-operators-tzfd6\" (UID: \"d98b1ef4-3cf1-4432-8b03-0c9d7118248d\") " pod="openshift-marketplace/redhat-operators-tzfd6" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.303108 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d98b1ef4-3cf1-4432-8b03-0c9d7118248d-catalog-content\") pod \"redhat-operators-tzfd6\" (UID: \"d98b1ef4-3cf1-4432-8b03-0c9d7118248d\") " pod="openshift-marketplace/redhat-operators-tzfd6" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.303643 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d98b1ef4-3cf1-4432-8b03-0c9d7118248d-utilities\") pod \"redhat-operators-tzfd6\" (UID: \"d98b1ef4-3cf1-4432-8b03-0c9d7118248d\") " pod="openshift-marketplace/redhat-operators-tzfd6" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.324000 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5mcm\" (UniqueName: \"kubernetes.io/projected/d98b1ef4-3cf1-4432-8b03-0c9d7118248d-kube-api-access-z5mcm\") pod \"redhat-operators-tzfd6\" (UID: \"d98b1ef4-3cf1-4432-8b03-0c9d7118248d\") " pod="openshift-marketplace/redhat-operators-tzfd6" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.507099 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tzfd6" Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.674041 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tzfd6"] Oct 12 05:45:08 crc kubenswrapper[4930]: W1012 05:45:08.674543 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd98b1ef4_3cf1_4432_8b03_0c9d7118248d.slice/crio-7fceace998663eb2f67b8d26250a0a5e80f7b06fa2455c1d3b51baf7e4e34df3 WatchSource:0}: Error finding container 7fceace998663eb2f67b8d26250a0a5e80f7b06fa2455c1d3b51baf7e4e34df3: Status 404 returned error can't find the container with id 7fceace998663eb2f67b8d26250a0a5e80f7b06fa2455c1d3b51baf7e4e34df3 Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.680370 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m4lsn"] Oct 12 05:45:08 crc kubenswrapper[4930]: W1012 05:45:08.683375 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b85acba_bed0_4e85_8252_3972368d611c.slice/crio-0694f45fc176e63194b1490d796ee65816a1ce83eed0d38b3789af7366a9b842 WatchSource:0}: Error finding container 0694f45fc176e63194b1490d796ee65816a1ce83eed0d38b3789af7366a9b842: Status 404 returned error can't find the container with id 0694f45fc176e63194b1490d796ee65816a1ce83eed0d38b3789af7366a9b842 Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.894610 4930 generic.go:334] "Generic (PLEG): container finished" podID="d98b1ef4-3cf1-4432-8b03-0c9d7118248d" containerID="4d79bbbd24ca617759911a288e244e99f3f736fb2d617b303cac0c137cf7ae46" exitCode=0 Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.894697 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tzfd6" event={"ID":"d98b1ef4-3cf1-4432-8b03-0c9d7118248d","Type":"ContainerDied","Data":"4d79bbbd24ca617759911a288e244e99f3f736fb2d617b303cac0c137cf7ae46"} Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.894729 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tzfd6" event={"ID":"d98b1ef4-3cf1-4432-8b03-0c9d7118248d","Type":"ContainerStarted","Data":"7fceace998663eb2f67b8d26250a0a5e80f7b06fa2455c1d3b51baf7e4e34df3"} Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.896527 4930 generic.go:334] "Generic (PLEG): container finished" podID="0b85acba-bed0-4e85-8252-3972368d611c" containerID="f050002ecb5ec9a3780dd5610f936b538745d0d1ec982211179f917e6f21518c" exitCode=0 Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.896606 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4lsn" event={"ID":"0b85acba-bed0-4e85-8252-3972368d611c","Type":"ContainerDied","Data":"f050002ecb5ec9a3780dd5610f936b538745d0d1ec982211179f917e6f21518c"} Oct 12 05:45:08 crc kubenswrapper[4930]: I1012 05:45:08.896672 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4lsn" event={"ID":"0b85acba-bed0-4e85-8252-3972368d611c","Type":"ContainerStarted","Data":"0694f45fc176e63194b1490d796ee65816a1ce83eed0d38b3789af7366a9b842"} Oct 12 05:45:09 crc kubenswrapper[4930]: I1012 05:45:09.906277 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tzfd6" 
event={"ID":"d98b1ef4-3cf1-4432-8b03-0c9d7118248d","Type":"ContainerStarted","Data":"95d9d0ae466c47ac2617cb1daf543fb76454208c4d4518beef2496ca32aa49d3"} Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.359047 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6d7zn"] Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.359961 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6d7zn" Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.362094 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.370187 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6d7zn"] Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.432891 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/732507c6-d79c-4b3e-a0fa-23111ec6b02b-catalog-content\") pod \"community-operators-6d7zn\" (UID: \"732507c6-d79c-4b3e-a0fa-23111ec6b02b\") " pod="openshift-marketplace/community-operators-6d7zn" Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.433178 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/732507c6-d79c-4b3e-a0fa-23111ec6b02b-utilities\") pod \"community-operators-6d7zn\" (UID: \"732507c6-d79c-4b3e-a0fa-23111ec6b02b\") " pod="openshift-marketplace/community-operators-6d7zn" Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.433251 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfnxq\" (UniqueName: \"kubernetes.io/projected/732507c6-d79c-4b3e-a0fa-23111ec6b02b-kube-api-access-qfnxq\") pod \"community-operators-6d7zn\" (UID: \"732507c6-d79c-4b3e-a0fa-23111ec6b02b\") " pod="openshift-marketplace/community-operators-6d7zn" Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.533800 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/732507c6-d79c-4b3e-a0fa-23111ec6b02b-utilities\") pod \"community-operators-6d7zn\" (UID: \"732507c6-d79c-4b3e-a0fa-23111ec6b02b\") " pod="openshift-marketplace/community-operators-6d7zn" Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.533939 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfnxq\" (UniqueName: \"kubernetes.io/projected/732507c6-d79c-4b3e-a0fa-23111ec6b02b-kube-api-access-qfnxq\") pod \"community-operators-6d7zn\" (UID: \"732507c6-d79c-4b3e-a0fa-23111ec6b02b\") " pod="openshift-marketplace/community-operators-6d7zn" Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.534023 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/732507c6-d79c-4b3e-a0fa-23111ec6b02b-catalog-content\") pod \"community-operators-6d7zn\" (UID: \"732507c6-d79c-4b3e-a0fa-23111ec6b02b\") " pod="openshift-marketplace/community-operators-6d7zn" Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.534356 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/732507c6-d79c-4b3e-a0fa-23111ec6b02b-utilities\") pod \"community-operators-6d7zn\" (UID: \"732507c6-d79c-4b3e-a0fa-23111ec6b02b\") " pod="openshift-marketplace/community-operators-6d7zn" Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.534964 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/732507c6-d79c-4b3e-a0fa-23111ec6b02b-catalog-content\") pod \"community-operators-6d7zn\" (UID: \"732507c6-d79c-4b3e-a0fa-23111ec6b02b\") " pod="openshift-marketplace/community-operators-6d7zn" Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.557957 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfnxq\" (UniqueName: \"kubernetes.io/projected/732507c6-d79c-4b3e-a0fa-23111ec6b02b-kube-api-access-qfnxq\") pod \"community-operators-6d7zn\" (UID: \"732507c6-d79c-4b3e-a0fa-23111ec6b02b\") " pod="openshift-marketplace/community-operators-6d7zn" Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.559575 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jwl4b"] Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.560773 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jwl4b" Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.564307 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.566121 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jwl4b"] Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.634892 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/051544d2-1a7f-4cb4-8b3e-c6ffeb23732f-utilities\") pod \"certified-operators-jwl4b\" (UID: \"051544d2-1a7f-4cb4-8b3e-c6ffeb23732f\") " pod="openshift-marketplace/certified-operators-jwl4b" Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.635134 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/051544d2-1a7f-4cb4-8b3e-c6ffeb23732f-catalog-content\") pod \"certified-operators-jwl4b\" (UID: \"051544d2-1a7f-4cb4-8b3e-c6ffeb23732f\") " pod="openshift-marketplace/certified-operators-jwl4b" Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.635320 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp2mb\" (UniqueName: \"kubernetes.io/projected/051544d2-1a7f-4cb4-8b3e-c6ffeb23732f-kube-api-access-jp2mb\") pod \"certified-operators-jwl4b\" (UID: \"051544d2-1a7f-4cb4-8b3e-c6ffeb23732f\") " pod="openshift-marketplace/certified-operators-jwl4b" Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.695338 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6d7zn" Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.736859 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/051544d2-1a7f-4cb4-8b3e-c6ffeb23732f-utilities\") pod \"certified-operators-jwl4b\" (UID: \"051544d2-1a7f-4cb4-8b3e-c6ffeb23732f\") " pod="openshift-marketplace/certified-operators-jwl4b" Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.736910 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/051544d2-1a7f-4cb4-8b3e-c6ffeb23732f-catalog-content\") pod \"certified-operators-jwl4b\" (UID: \"051544d2-1a7f-4cb4-8b3e-c6ffeb23732f\") " pod="openshift-marketplace/certified-operators-jwl4b" Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.736979 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp2mb\" (UniqueName: \"kubernetes.io/projected/051544d2-1a7f-4cb4-8b3e-c6ffeb23732f-kube-api-access-jp2mb\") pod \"certified-operators-jwl4b\" (UID: \"051544d2-1a7f-4cb4-8b3e-c6ffeb23732f\") " pod="openshift-marketplace/certified-operators-jwl4b" Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.737445 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/051544d2-1a7f-4cb4-8b3e-c6ffeb23732f-catalog-content\") pod \"certified-operators-jwl4b\" (UID: \"051544d2-1a7f-4cb4-8b3e-c6ffeb23732f\") " pod="openshift-marketplace/certified-operators-jwl4b" Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.738041 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/051544d2-1a7f-4cb4-8b3e-c6ffeb23732f-utilities\") pod \"certified-operators-jwl4b\" (UID: \"051544d2-1a7f-4cb4-8b3e-c6ffeb23732f\") " pod="openshift-marketplace/certified-operators-jwl4b" Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.754278 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp2mb\" (UniqueName: \"kubernetes.io/projected/051544d2-1a7f-4cb4-8b3e-c6ffeb23732f-kube-api-access-jp2mb\") pod \"certified-operators-jwl4b\" (UID: \"051544d2-1a7f-4cb4-8b3e-c6ffeb23732f\") " pod="openshift-marketplace/certified-operators-jwl4b" Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.894728 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jwl4b" Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.913277 4930 generic.go:334] "Generic (PLEG): container finished" podID="d98b1ef4-3cf1-4432-8b03-0c9d7118248d" containerID="95d9d0ae466c47ac2617cb1daf543fb76454208c4d4518beef2496ca32aa49d3" exitCode=0 Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.913351 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tzfd6" event={"ID":"d98b1ef4-3cf1-4432-8b03-0c9d7118248d","Type":"ContainerDied","Data":"95d9d0ae466c47ac2617cb1daf543fb76454208c4d4518beef2496ca32aa49d3"} Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.916624 4930 generic.go:334] "Generic (PLEG): container finished" podID="0b85acba-bed0-4e85-8252-3972368d611c" containerID="19850323571fa202975653809dd3ec3ac0ce24ea5170fa6d80b9b5ea63e93fd9" exitCode=0 Oct 12 05:45:10 crc kubenswrapper[4930]: I1012 05:45:10.916660 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4lsn" event={"ID":"0b85acba-bed0-4e85-8252-3972368d611c","Type":"ContainerDied","Data":"19850323571fa202975653809dd3ec3ac0ce24ea5170fa6d80b9b5ea63e93fd9"} Oct 12 05:45:11 crc kubenswrapper[4930]: I1012 05:45:11.150227 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6d7zn"] Oct 12 05:45:11 crc kubenswrapper[4930]: I1012 05:45:11.319589 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jwl4b"] Oct 12 05:45:11 crc kubenswrapper[4930]: W1012 05:45:11.380629 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod051544d2_1a7f_4cb4_8b3e_c6ffeb23732f.slice/crio-3fd57668efbc284941c57bcf385fc26aa765c26b197e689788c4781c174d9370 WatchSource:0}: Error finding container 3fd57668efbc284941c57bcf385fc26aa765c26b197e689788c4781c174d9370: Status 404 returned error can't find the container with id 3fd57668efbc284941c57bcf385fc26aa765c26b197e689788c4781c174d9370 Oct 12 05:45:11 crc kubenswrapper[4930]: I1012 05:45:11.924435 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tzfd6" event={"ID":"d98b1ef4-3cf1-4432-8b03-0c9d7118248d","Type":"ContainerStarted","Data":"6b2a07ef4b20d84599b501f0f34a7df3e973156780bd1ded699fe16f518811df"} Oct 12 05:45:11 crc kubenswrapper[4930]: I1012 05:45:11.928530 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m4lsn" event={"ID":"0b85acba-bed0-4e85-8252-3972368d611c","Type":"ContainerStarted","Data":"777dceb1cbdfcc11222ef76d7fd696dee80b5ff183521262c0a40aeced659240"} Oct 12 05:45:11 crc kubenswrapper[4930]: I1012 05:45:11.930175 4930 generic.go:334] "Generic (PLEG): container finished" podID="051544d2-1a7f-4cb4-8b3e-c6ffeb23732f" containerID="9627a669c451de507b82a2451ea888854c794413612ba71c6b708413a38cecb5" exitCode=0 Oct 12 05:45:11 crc kubenswrapper[4930]: I1012 05:45:11.930265 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwl4b" event={"ID":"051544d2-1a7f-4cb4-8b3e-c6ffeb23732f","Type":"ContainerDied","Data":"9627a669c451de507b82a2451ea888854c794413612ba71c6b708413a38cecb5"} Oct 12 05:45:11 crc kubenswrapper[4930]: I1012 05:45:11.930313 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwl4b" 
event={"ID":"051544d2-1a7f-4cb4-8b3e-c6ffeb23732f","Type":"ContainerStarted","Data":"3fd57668efbc284941c57bcf385fc26aa765c26b197e689788c4781c174d9370"} Oct 12 05:45:11 crc kubenswrapper[4930]: I1012 05:45:11.932980 4930 generic.go:334] "Generic (PLEG): container finished" podID="732507c6-d79c-4b3e-a0fa-23111ec6b02b" containerID="dea2f24bbe336f0aecdea5b07b165b7b83ca6f845730b8e8ddaa83e8091672d1" exitCode=0 Oct 12 05:45:11 crc kubenswrapper[4930]: I1012 05:45:11.933012 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6d7zn" event={"ID":"732507c6-d79c-4b3e-a0fa-23111ec6b02b","Type":"ContainerDied","Data":"dea2f24bbe336f0aecdea5b07b165b7b83ca6f845730b8e8ddaa83e8091672d1"} Oct 12 05:45:11 crc kubenswrapper[4930]: I1012 05:45:11.933324 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6d7zn" event={"ID":"732507c6-d79c-4b3e-a0fa-23111ec6b02b","Type":"ContainerStarted","Data":"3bb9fd485fc09077f0f6eb2c4b69bc4575a9e125053f8e78efaa40108d98446a"} Oct 12 05:45:11 crc kubenswrapper[4930]: I1012 05:45:11.950349 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tzfd6" podStartSLOduration=1.420118711 podStartE2EDuration="3.950332605s" podCreationTimestamp="2025-10-12 05:45:08 +0000 UTC" firstStartedPulling="2025-10-12 05:45:08.89650951 +0000 UTC m=+241.438611285" lastFinishedPulling="2025-10-12 05:45:11.426723414 +0000 UTC m=+243.968825179" observedRunningTime="2025-10-12 05:45:11.949520832 +0000 UTC m=+244.491622597" watchObservedRunningTime="2025-10-12 05:45:11.950332605 +0000 UTC m=+244.492434370" Oct 12 05:45:12 crc kubenswrapper[4930]: I1012 05:45:12.018452 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m4lsn" podStartSLOduration=2.555560291 podStartE2EDuration="5.018434366s" podCreationTimestamp="2025-10-12 05:45:07 +0000 UTC" firstStartedPulling="2025-10-12 05:45:08.898566828 +0000 UTC m=+241.440668593" lastFinishedPulling="2025-10-12 05:45:11.361440893 +0000 UTC m=+243.903542668" observedRunningTime="2025-10-12 05:45:12.017702905 +0000 UTC m=+244.559804670" watchObservedRunningTime="2025-10-12 05:45:12.018434366 +0000 UTC m=+244.560536131" Oct 12 05:45:14 crc kubenswrapper[4930]: I1012 05:45:14.952176 4930 generic.go:334] "Generic (PLEG): container finished" podID="051544d2-1a7f-4cb4-8b3e-c6ffeb23732f" containerID="fddbe0fc73e49ad8890284e9333ffd0b1290efc1eb7490a2224dce26685643b2" exitCode=0 Oct 12 05:45:14 crc kubenswrapper[4930]: I1012 05:45:14.952233 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwl4b" event={"ID":"051544d2-1a7f-4cb4-8b3e-c6ffeb23732f","Type":"ContainerDied","Data":"fddbe0fc73e49ad8890284e9333ffd0b1290efc1eb7490a2224dce26685643b2"} Oct 12 05:45:16 crc kubenswrapper[4930]: I1012 05:45:16.967707 4930 generic.go:334] "Generic (PLEG): container finished" podID="732507c6-d79c-4b3e-a0fa-23111ec6b02b" containerID="0b5f90317bbd927685b53a1b78b5fe637659bc6666af6e1db7675249c51a3831" exitCode=0 Oct 12 05:45:16 crc kubenswrapper[4930]: I1012 05:45:16.968068 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6d7zn" event={"ID":"732507c6-d79c-4b3e-a0fa-23111ec6b02b","Type":"ContainerDied","Data":"0b5f90317bbd927685b53a1b78b5fe637659bc6666af6e1db7675249c51a3831"} Oct 12 05:45:16 crc kubenswrapper[4930]: I1012 05:45:16.973639 4930 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwl4b" event={"ID":"051544d2-1a7f-4cb4-8b3e-c6ffeb23732f","Type":"ContainerStarted","Data":"09707e01ea96d8717b936ff9f83684d8e7a147d6490d7a4d48d49e3399ffcb45"} Oct 12 05:45:17 crc kubenswrapper[4930]: I1012 05:45:17.033147 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jwl4b" podStartSLOduration=2.581657073 podStartE2EDuration="7.033121084s" podCreationTimestamp="2025-10-12 05:45:10 +0000 UTC" firstStartedPulling="2025-10-12 05:45:11.931236446 +0000 UTC m=+244.473338221" lastFinishedPulling="2025-10-12 05:45:16.382700447 +0000 UTC m=+248.924802232" observedRunningTime="2025-10-12 05:45:17.028083272 +0000 UTC m=+249.570185057" watchObservedRunningTime="2025-10-12 05:45:17.033121084 +0000 UTC m=+249.575222859" Oct 12 05:45:17 crc kubenswrapper[4930]: I1012 05:45:17.982564 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6d7zn" event={"ID":"732507c6-d79c-4b3e-a0fa-23111ec6b02b","Type":"ContainerStarted","Data":"00eb150dd04b10876f74f827150194a541c9dd5b610a3ab26634070c5e9582d4"} Oct 12 05:45:18 crc kubenswrapper[4930]: I1012 05:45:18.001337 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6d7zn" podStartSLOduration=2.342472827 podStartE2EDuration="8.001320886s" podCreationTimestamp="2025-10-12 05:45:10 +0000 UTC" firstStartedPulling="2025-10-12 05:45:11.934267882 +0000 UTC m=+244.476369647" lastFinishedPulling="2025-10-12 05:45:17.593115921 +0000 UTC m=+250.135217706" observedRunningTime="2025-10-12 05:45:18.000594976 +0000 UTC m=+250.542696751" watchObservedRunningTime="2025-10-12 05:45:18.001320886 +0000 UTC m=+250.543422651" Oct 12 05:45:18 crc kubenswrapper[4930]: I1012 05:45:18.290192 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m4lsn" Oct 12 05:45:18 crc kubenswrapper[4930]: I1012 05:45:18.290258 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m4lsn" Oct 12 05:45:18 crc kubenswrapper[4930]: I1012 05:45:18.338277 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m4lsn" Oct 12 05:45:18 crc kubenswrapper[4930]: I1012 05:45:18.507813 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tzfd6" Oct 12 05:45:18 crc kubenswrapper[4930]: I1012 05:45:18.508103 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tzfd6" Oct 12 05:45:18 crc kubenswrapper[4930]: I1012 05:45:18.550361 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tzfd6" Oct 12 05:45:19 crc kubenswrapper[4930]: I1012 05:45:19.042484 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tzfd6" Oct 12 05:45:19 crc kubenswrapper[4930]: I1012 05:45:19.058257 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m4lsn" Oct 12 05:45:20 crc kubenswrapper[4930]: I1012 05:45:20.695996 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6d7zn" Oct 12 05:45:20 crc 
kubenswrapper[4930]: I1012 05:45:20.696065 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6d7zn" Oct 12 05:45:20 crc kubenswrapper[4930]: I1012 05:45:20.762201 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6d7zn" Oct 12 05:45:20 crc kubenswrapper[4930]: I1012 05:45:20.895619 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jwl4b" Oct 12 05:45:20 crc kubenswrapper[4930]: I1012 05:45:20.895823 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jwl4b" Oct 12 05:45:20 crc kubenswrapper[4930]: I1012 05:45:20.954051 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jwl4b" Oct 12 05:45:22 crc kubenswrapper[4930]: I1012 05:45:22.056619 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jwl4b" Oct 12 05:45:30 crc kubenswrapper[4930]: I1012 05:45:30.761273 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6d7zn" Oct 12 05:46:33 crc kubenswrapper[4930]: I1012 05:46:33.669730 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 05:46:33 crc kubenswrapper[4930]: I1012 05:46:33.670366 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 05:47:03 crc kubenswrapper[4930]: I1012 05:47:03.669704 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 05:47:03 crc kubenswrapper[4930]: I1012 05:47:03.670484 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 05:47:33 crc kubenswrapper[4930]: I1012 05:47:33.669955 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 05:47:33 crc kubenswrapper[4930]: I1012 05:47:33.670589 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 12 05:47:33 crc kubenswrapper[4930]: I1012 05:47:33.670655 4930 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 05:47:33 crc kubenswrapper[4930]: I1012 05:47:33.671419 4930 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7434a5664387f355e60869203e8ba53d5bb3922f3950f2fa26ad54baf1b367ce"} pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 05:47:33 crc kubenswrapper[4930]: I1012 05:47:33.671514 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" containerID="cri-o://7434a5664387f355e60869203e8ba53d5bb3922f3950f2fa26ad54baf1b367ce" gracePeriod=600 Oct 12 05:47:33 crc kubenswrapper[4930]: I1012 05:47:33.886900 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerDied","Data":"7434a5664387f355e60869203e8ba53d5bb3922f3950f2fa26ad54baf1b367ce"} Oct 12 05:47:33 crc kubenswrapper[4930]: I1012 05:47:33.886908 4930 generic.go:334] "Generic (PLEG): container finished" podID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerID="7434a5664387f355e60869203e8ba53d5bb3922f3950f2fa26ad54baf1b367ce" exitCode=0 Oct 12 05:47:33 crc kubenswrapper[4930]: I1012 05:47:33.887002 4930 scope.go:117] "RemoveContainer" containerID="2193886cf3e24fbf30c3e6f91ab52d9b7e68968c8b42a4d0deafab5668555701" Oct 12 05:47:34 crc kubenswrapper[4930]: I1012 05:47:34.894958 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerStarted","Data":"f9b4436f5dc6b5a5cc61a2d31420352c424bca4ed9a68e98bbb43f4e86026438"} Oct 12 05:48:48 crc kubenswrapper[4930]: I1012 05:48:48.899575 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xxh24"] Oct 12 05:48:48 crc kubenswrapper[4930]: I1012 05:48:48.903845 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:48 crc kubenswrapper[4930]: I1012 05:48:48.910308 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xxh24"] Oct 12 05:48:49 crc kubenswrapper[4930]: I1012 05:48:49.064366 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c0a551e-b6cd-4e8d-b5b6-13a08b180df1-trusted-ca\") pod \"image-registry-66df7c8f76-xxh24\" (UID: \"1c0a551e-b6cd-4e8d-b5b6-13a08b180df1\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:49 crc kubenswrapper[4930]: I1012 05:48:49.064461 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1c0a551e-b6cd-4e8d-b5b6-13a08b180df1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xxh24\" (UID: \"1c0a551e-b6cd-4e8d-b5b6-13a08b180df1\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:49 crc kubenswrapper[4930]: I1012 05:48:49.064532 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1c0a551e-b6cd-4e8d-b5b6-13a08b180df1-registry-tls\") pod \"image-registry-66df7c8f76-xxh24\" (UID: \"1c0a551e-b6cd-4e8d-b5b6-13a08b180df1\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:49 crc kubenswrapper[4930]: I1012 05:48:49.064587 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6s5x\" (UniqueName: \"kubernetes.io/projected/1c0a551e-b6cd-4e8d-b5b6-13a08b180df1-kube-api-access-c6s5x\") pod \"image-registry-66df7c8f76-xxh24\" (UID: \"1c0a551e-b6cd-4e8d-b5b6-13a08b180df1\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:49 crc kubenswrapper[4930]: I1012 05:48:49.064836 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c0a551e-b6cd-4e8d-b5b6-13a08b180df1-bound-sa-token\") pod \"image-registry-66df7c8f76-xxh24\" (UID: \"1c0a551e-b6cd-4e8d-b5b6-13a08b180df1\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:49 crc kubenswrapper[4930]: I1012 05:48:49.064901 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1c0a551e-b6cd-4e8d-b5b6-13a08b180df1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xxh24\" (UID: \"1c0a551e-b6cd-4e8d-b5b6-13a08b180df1\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:49 crc kubenswrapper[4930]: I1012 05:48:49.064933 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1c0a551e-b6cd-4e8d-b5b6-13a08b180df1-registry-certificates\") pod \"image-registry-66df7c8f76-xxh24\" (UID: \"1c0a551e-b6cd-4e8d-b5b6-13a08b180df1\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:49 crc kubenswrapper[4930]: I1012 05:48:49.065033 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xxh24\" (UID: \"1c0a551e-b6cd-4e8d-b5b6-13a08b180df1\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:49 crc kubenswrapper[4930]: I1012 05:48:49.101977 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xxh24\" (UID: \"1c0a551e-b6cd-4e8d-b5b6-13a08b180df1\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:49 crc kubenswrapper[4930]: I1012 05:48:49.166304 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c0a551e-b6cd-4e8d-b5b6-13a08b180df1-trusted-ca\") pod \"image-registry-66df7c8f76-xxh24\" (UID: \"1c0a551e-b6cd-4e8d-b5b6-13a08b180df1\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:49 crc kubenswrapper[4930]: I1012 05:48:49.166389 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1c0a551e-b6cd-4e8d-b5b6-13a08b180df1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xxh24\" (UID: \"1c0a551e-b6cd-4e8d-b5b6-13a08b180df1\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:49 crc kubenswrapper[4930]: I1012 05:48:49.166450 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1c0a551e-b6cd-4e8d-b5b6-13a08b180df1-registry-tls\") pod \"image-registry-66df7c8f76-xxh24\" (UID: \"1c0a551e-b6cd-4e8d-b5b6-13a08b180df1\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:49 crc kubenswrapper[4930]: I1012 05:48:49.166503 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6s5x\" (UniqueName: \"kubernetes.io/projected/1c0a551e-b6cd-4e8d-b5b6-13a08b180df1-kube-api-access-c6s5x\") pod \"image-registry-66df7c8f76-xxh24\" (UID: \"1c0a551e-b6cd-4e8d-b5b6-13a08b180df1\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:49 crc kubenswrapper[4930]: I1012 05:48:49.166593 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c0a551e-b6cd-4e8d-b5b6-13a08b180df1-bound-sa-token\") pod \"image-registry-66df7c8f76-xxh24\" (UID: \"1c0a551e-b6cd-4e8d-b5b6-13a08b180df1\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:49 crc kubenswrapper[4930]: I1012 05:48:49.166640 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1c0a551e-b6cd-4e8d-b5b6-13a08b180df1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xxh24\" (UID: \"1c0a551e-b6cd-4e8d-b5b6-13a08b180df1\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:49 crc kubenswrapper[4930]: I1012 05:48:49.166675 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1c0a551e-b6cd-4e8d-b5b6-13a08b180df1-registry-certificates\") pod \"image-registry-66df7c8f76-xxh24\" (UID: \"1c0a551e-b6cd-4e8d-b5b6-13a08b180df1\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:49 crc kubenswrapper[4930]: I1012 05:48:49.167970 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1c0a551e-b6cd-4e8d-b5b6-13a08b180df1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xxh24\" (UID: \"1c0a551e-b6cd-4e8d-b5b6-13a08b180df1\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:49 crc kubenswrapper[4930]: I1012 05:48:49.170393 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c0a551e-b6cd-4e8d-b5b6-13a08b180df1-trusted-ca\") pod \"image-registry-66df7c8f76-xxh24\" (UID: \"1c0a551e-b6cd-4e8d-b5b6-13a08b180df1\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:49 crc kubenswrapper[4930]: I1012 05:48:49.170955 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1c0a551e-b6cd-4e8d-b5b6-13a08b180df1-registry-certificates\") pod \"image-registry-66df7c8f76-xxh24\" (UID: \"1c0a551e-b6cd-4e8d-b5b6-13a08b180df1\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:49 crc kubenswrapper[4930]: I1012 05:48:49.177193 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1c0a551e-b6cd-4e8d-b5b6-13a08b180df1-registry-tls\") pod \"image-registry-66df7c8f76-xxh24\" (UID: \"1c0a551e-b6cd-4e8d-b5b6-13a08b180df1\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:49 crc kubenswrapper[4930]: I1012 05:48:49.178644 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1c0a551e-b6cd-4e8d-b5b6-13a08b180df1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xxh24\" (UID: \"1c0a551e-b6cd-4e8d-b5b6-13a08b180df1\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:49 crc kubenswrapper[4930]: I1012 05:48:49.190641 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c0a551e-b6cd-4e8d-b5b6-13a08b180df1-bound-sa-token\") pod \"image-registry-66df7c8f76-xxh24\" (UID: \"1c0a551e-b6cd-4e8d-b5b6-13a08b180df1\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:49 crc kubenswrapper[4930]: I1012 05:48:49.195239 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6s5x\" (UniqueName: \"kubernetes.io/projected/1c0a551e-b6cd-4e8d-b5b6-13a08b180df1-kube-api-access-c6s5x\") pod \"image-registry-66df7c8f76-xxh24\" (UID: \"1c0a551e-b6cd-4e8d-b5b6-13a08b180df1\") " pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:49 crc kubenswrapper[4930]: I1012 05:48:49.275763 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:49 crc kubenswrapper[4930]: I1012 05:48:49.559082 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xxh24"] Oct 12 05:48:50 crc kubenswrapper[4930]: I1012 05:48:50.416986 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" event={"ID":"1c0a551e-b6cd-4e8d-b5b6-13a08b180df1","Type":"ContainerStarted","Data":"b0b6aa48e935e7dc2b183b2f251c09a804404ee8401690103eb407828adea50d"} Oct 12 05:48:50 crc kubenswrapper[4930]: I1012 05:48:50.417521 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" event={"ID":"1c0a551e-b6cd-4e8d-b5b6-13a08b180df1","Type":"ContainerStarted","Data":"d954d6f013569d953636d8bb961500f0f4ff36f6859def281291f30e68558183"} Oct 12 05:48:50 crc kubenswrapper[4930]: I1012 05:48:50.417556 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:48:50 crc kubenswrapper[4930]: I1012 05:48:50.451036 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" podStartSLOduration=2.451007883 podStartE2EDuration="2.451007883s" podCreationTimestamp="2025-10-12 05:48:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:48:50.44516703 +0000 UTC m=+462.987268845" watchObservedRunningTime="2025-10-12 05:48:50.451007883 +0000 UTC m=+462.993109688" Oct 12 05:49:09 crc kubenswrapper[4930]: I1012 05:49:09.283479 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-xxh24" Oct 12 05:49:09 crc kubenswrapper[4930]: I1012 05:49:09.394161 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bzqf2"] Oct 12 05:49:33 crc kubenswrapper[4930]: I1012 05:49:33.670005 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 05:49:33 crc kubenswrapper[4930]: I1012 05:49:33.670635 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.444111 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" podUID="51d2d15f-f2ac-4939-b11d-f41e5891323d" containerName="registry" containerID="cri-o://4f28a2f8f2ff69d7a0202b3453ede968740ec9493f520531fbe412d499c2b185" gracePeriod=30 Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.743602 4930 generic.go:334] "Generic (PLEG): container finished" podID="51d2d15f-f2ac-4939-b11d-f41e5891323d" containerID="4f28a2f8f2ff69d7a0202b3453ede968740ec9493f520531fbe412d499c2b185" exitCode=0 Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.743849 4930 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" event={"ID":"51d2d15f-f2ac-4939-b11d-f41e5891323d","Type":"ContainerDied","Data":"4f28a2f8f2ff69d7a0202b3453ede968740ec9493f520531fbe412d499c2b185"} Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.864634 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.874461 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51d2d15f-f2ac-4939-b11d-f41e5891323d-installation-pull-secrets\") pod \"51d2d15f-f2ac-4939-b11d-f41e5891323d\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.874602 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51d2d15f-f2ac-4939-b11d-f41e5891323d-ca-trust-extracted\") pod \"51d2d15f-f2ac-4939-b11d-f41e5891323d\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.874803 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"51d2d15f-f2ac-4939-b11d-f41e5891323d\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.874876 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51d2d15f-f2ac-4939-b11d-f41e5891323d-bound-sa-token\") pod \"51d2d15f-f2ac-4939-b11d-f41e5891323d\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.874947 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51d2d15f-f2ac-4939-b11d-f41e5891323d-registry-tls\") pod \"51d2d15f-f2ac-4939-b11d-f41e5891323d\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.874994 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7sw2\" (UniqueName: \"kubernetes.io/projected/51d2d15f-f2ac-4939-b11d-f41e5891323d-kube-api-access-s7sw2\") pod \"51d2d15f-f2ac-4939-b11d-f41e5891323d\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.875050 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51d2d15f-f2ac-4939-b11d-f41e5891323d-registry-certificates\") pod \"51d2d15f-f2ac-4939-b11d-f41e5891323d\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.875083 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51d2d15f-f2ac-4939-b11d-f41e5891323d-trusted-ca\") pod \"51d2d15f-f2ac-4939-b11d-f41e5891323d\" (UID: \"51d2d15f-f2ac-4939-b11d-f41e5891323d\") " Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.876067 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/51d2d15f-f2ac-4939-b11d-f41e5891323d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "51d2d15f-f2ac-4939-b11d-f41e5891323d" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.876099 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51d2d15f-f2ac-4939-b11d-f41e5891323d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "51d2d15f-f2ac-4939-b11d-f41e5891323d" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.881125 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d2d15f-f2ac-4939-b11d-f41e5891323d-kube-api-access-s7sw2" (OuterVolumeSpecName: "kube-api-access-s7sw2") pod "51d2d15f-f2ac-4939-b11d-f41e5891323d" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d"). InnerVolumeSpecName "kube-api-access-s7sw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.881399 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d2d15f-f2ac-4939-b11d-f41e5891323d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "51d2d15f-f2ac-4939-b11d-f41e5891323d" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.881435 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d2d15f-f2ac-4939-b11d-f41e5891323d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "51d2d15f-f2ac-4939-b11d-f41e5891323d" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.883359 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d2d15f-f2ac-4939-b11d-f41e5891323d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "51d2d15f-f2ac-4939-b11d-f41e5891323d" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.885062 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "51d2d15f-f2ac-4939-b11d-f41e5891323d" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.902530 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51d2d15f-f2ac-4939-b11d-f41e5891323d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "51d2d15f-f2ac-4939-b11d-f41e5891323d" (UID: "51d2d15f-f2ac-4939-b11d-f41e5891323d"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.976286 4930 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51d2d15f-f2ac-4939-b11d-f41e5891323d-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.976314 4930 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51d2d15f-f2ac-4939-b11d-f41e5891323d-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.976323 4930 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51d2d15f-f2ac-4939-b11d-f41e5891323d-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.976331 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7sw2\" (UniqueName: \"kubernetes.io/projected/51d2d15f-f2ac-4939-b11d-f41e5891323d-kube-api-access-s7sw2\") on node \"crc\" DevicePath \"\"" Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.976341 4930 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51d2d15f-f2ac-4939-b11d-f41e5891323d-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.976350 4930 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51d2d15f-f2ac-4939-b11d-f41e5891323d-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 12 05:49:34 crc kubenswrapper[4930]: I1012 05:49:34.976357 4930 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51d2d15f-f2ac-4939-b11d-f41e5891323d-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 12 05:49:35 crc kubenswrapper[4930]: I1012 05:49:35.754219 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" event={"ID":"51d2d15f-f2ac-4939-b11d-f41e5891323d","Type":"ContainerDied","Data":"b3a22c74397ec2a625417b3a4b08711d8bac8474461f5096ca2c969fee5f15c8"} Oct 12 05:49:35 crc kubenswrapper[4930]: I1012 05:49:35.754307 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bzqf2" Oct 12 05:49:35 crc kubenswrapper[4930]: I1012 05:49:35.754326 4930 scope.go:117] "RemoveContainer" containerID="4f28a2f8f2ff69d7a0202b3453ede968740ec9493f520531fbe412d499c2b185" Oct 12 05:49:35 crc kubenswrapper[4930]: I1012 05:49:35.801415 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bzqf2"] Oct 12 05:49:35 crc kubenswrapper[4930]: I1012 05:49:35.807248 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bzqf2"] Oct 12 05:49:36 crc kubenswrapper[4930]: I1012 05:49:36.148122 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51d2d15f-f2ac-4939-b11d-f41e5891323d" path="/var/lib/kubelet/pods/51d2d15f-f2ac-4939-b11d-f41e5891323d/volumes" Oct 12 05:50:03 crc kubenswrapper[4930]: I1012 05:50:03.669071 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 05:50:03 crc kubenswrapper[4930]: I1012 05:50:03.669688 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 05:50:05 crc kubenswrapper[4930]: I1012 05:50:05.986013 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-v95v2"] Oct 12 05:50:05 crc kubenswrapper[4930]: E1012 05:50:05.986374 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d2d15f-f2ac-4939-b11d-f41e5891323d" containerName="registry" Oct 12 05:50:05 crc kubenswrapper[4930]: I1012 05:50:05.986397 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d2d15f-f2ac-4939-b11d-f41e5891323d" containerName="registry" Oct 12 05:50:05 crc kubenswrapper[4930]: I1012 05:50:05.986640 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="51d2d15f-f2ac-4939-b11d-f41e5891323d" containerName="registry" Oct 12 05:50:05 crc kubenswrapper[4930]: I1012 05:50:05.987609 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-v95v2" Oct 12 05:50:05 crc kubenswrapper[4930]: I1012 05:50:05.991720 4930 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-lqnj2" Oct 12 05:50:05 crc kubenswrapper[4930]: I1012 05:50:05.991988 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 12 05:50:05 crc kubenswrapper[4930]: I1012 05:50:05.993387 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-v95v2"] Oct 12 05:50:05 crc kubenswrapper[4930]: I1012 05:50:05.996862 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.022130 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-wvrgv"] Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.023232 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-wvrgv" Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.026728 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-6ptlt"] Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.027496 4930 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-qvngf" Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.027559 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-6ptlt" Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.040000 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6smn8\" (UniqueName: \"kubernetes.io/projected/b488879d-5b6c-438f-8b60-842f84b05028-kube-api-access-6smn8\") pod \"cert-manager-cainjector-7f985d654d-v95v2\" (UID: \"b488879d-5b6c-438f-8b60-842f84b05028\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-v95v2" Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.040394 4930 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-gxj96" Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.042068 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-6ptlt"] Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.047902 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-wvrgv"] Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.141152 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6smn8\" (UniqueName: \"kubernetes.io/projected/b488879d-5b6c-438f-8b60-842f84b05028-kube-api-access-6smn8\") pod \"cert-manager-cainjector-7f985d654d-v95v2\" (UID: \"b488879d-5b6c-438f-8b60-842f84b05028\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-v95v2" Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.141552 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkthg\" (UniqueName: \"kubernetes.io/projected/e7b9505d-6b6f-4460-9f40-119155e4ba33-kube-api-access-wkthg\") pod \"cert-manager-5b446d88c5-wvrgv\" (UID: \"e7b9505d-6b6f-4460-9f40-119155e4ba33\") " pod="cert-manager/cert-manager-5b446d88c5-wvrgv" Oct 12 05:50:06 crc 
Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.141998 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hnbm\" (UniqueName: \"kubernetes.io/projected/a45f8116-ddf1-4733-bfb2-9fd498a04620-kube-api-access-9hnbm\") pod \"cert-manager-webhook-5655c58dd6-6ptlt\" (UID: \"a45f8116-ddf1-4733-bfb2-9fd498a04620\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-6ptlt"
Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.173495 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6smn8\" (UniqueName: \"kubernetes.io/projected/b488879d-5b6c-438f-8b60-842f84b05028-kube-api-access-6smn8\") pod \"cert-manager-cainjector-7f985d654d-v95v2\" (UID: \"b488879d-5b6c-438f-8b60-842f84b05028\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-v95v2"
Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.243133 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkthg\" (UniqueName: \"kubernetes.io/projected/e7b9505d-6b6f-4460-9f40-119155e4ba33-kube-api-access-wkthg\") pod \"cert-manager-5b446d88c5-wvrgv\" (UID: \"e7b9505d-6b6f-4460-9f40-119155e4ba33\") " pod="cert-manager/cert-manager-5b446d88c5-wvrgv"
Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.243229 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hnbm\" (UniqueName: \"kubernetes.io/projected/a45f8116-ddf1-4733-bfb2-9fd498a04620-kube-api-access-9hnbm\") pod \"cert-manager-webhook-5655c58dd6-6ptlt\" (UID: \"a45f8116-ddf1-4733-bfb2-9fd498a04620\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-6ptlt"
Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.273994 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hnbm\" (UniqueName: \"kubernetes.io/projected/a45f8116-ddf1-4733-bfb2-9fd498a04620-kube-api-access-9hnbm\") pod \"cert-manager-webhook-5655c58dd6-6ptlt\" (UID: \"a45f8116-ddf1-4733-bfb2-9fd498a04620\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-6ptlt"
Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.276006 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkthg\" (UniqueName: \"kubernetes.io/projected/e7b9505d-6b6f-4460-9f40-119155e4ba33-kube-api-access-wkthg\") pod \"cert-manager-5b446d88c5-wvrgv\" (UID: \"e7b9505d-6b6f-4460-9f40-119155e4ba33\") " pod="cert-manager/cert-manager-5b446d88c5-wvrgv"
Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.310227 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-v95v2"
Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.342275 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-wvrgv"
Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.358586 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-6ptlt"
Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.561780 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-v95v2"]
Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.576400 4930 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.833970 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-6ptlt"]
Oct 12 05:50:06 crc kubenswrapper[4930]: W1012 05:50:06.844408 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda45f8116_ddf1_4733_bfb2_9fd498a04620.slice/crio-24ed5209a7755e6ce59f351e63da73632caabb5bfd07c359e09c7ed7853bd884 WatchSource:0}: Error finding container 24ed5209a7755e6ce59f351e63da73632caabb5bfd07c359e09c7ed7853bd884: Status 404 returned error can't find the container with id 24ed5209a7755e6ce59f351e63da73632caabb5bfd07c359e09c7ed7853bd884
Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.860636 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-wvrgv"]
Oct 12 05:50:06 crc kubenswrapper[4930]: W1012 05:50:06.873575 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7b9505d_6b6f_4460_9f40_119155e4ba33.slice/crio-a7c04c6f6095880af41e11ddbb16ccff4102c6f4c2e4e9c5fc8dffc76d48b53a WatchSource:0}: Error finding container a7c04c6f6095880af41e11ddbb16ccff4102c6f4c2e4e9c5fc8dffc76d48b53a: Status 404 returned error can't find the container with id a7c04c6f6095880af41e11ddbb16ccff4102c6f4c2e4e9c5fc8dffc76d48b53a
Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.988575 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-wvrgv" event={"ID":"e7b9505d-6b6f-4460-9f40-119155e4ba33","Type":"ContainerStarted","Data":"a7c04c6f6095880af41e11ddbb16ccff4102c6f4c2e4e9c5fc8dffc76d48b53a"}
Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.990799 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-v95v2" event={"ID":"b488879d-5b6c-438f-8b60-842f84b05028","Type":"ContainerStarted","Data":"386b3012bc923295b0a1f6ae1843d1bc76d1a1eb407ed59f040b6b0d73284c69"}
Oct 12 05:50:06 crc kubenswrapper[4930]: I1012 05:50:06.991790 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-6ptlt" event={"ID":"a45f8116-ddf1-4733-bfb2-9fd498a04620","Type":"ContainerStarted","Data":"24ed5209a7755e6ce59f351e63da73632caabb5bfd07c359e09c7ed7853bd884"}
Oct 12 05:50:08 crc kubenswrapper[4930]: I1012 05:50:08.304523 4930 scope.go:117] "RemoveContainer" containerID="161b6146bb94e3a8bed24475efeaf2560392abd92ac26be249802ce3450ffb94"
Oct 12 05:50:08 crc kubenswrapper[4930]: I1012 05:50:08.390718 4930 scope.go:117] "RemoveContainer" containerID="03484064af533ba8e6f6d19a0745fdb506c14203d412f6ca31e821eb26537be5"
Oct 12 05:50:08 crc kubenswrapper[4930]: I1012 05:50:08.459457 4930 scope.go:117] "RemoveContainer" containerID="34fc364554c0d74dde77a04f4cfee532bed8c4f4274526976a70678733a3efa1"
Oct 12 05:50:09 crc kubenswrapper[4930]: I1012 05:50:09.011206 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-v95v2" event={"ID":"b488879d-5b6c-438f-8b60-842f84b05028","Type":"ContainerStarted","Data":"1c7f10cc69935bda6136f18155f0d9878dbc8bd53a10e17e581c3d16ba528338"}
Oct 12 05:50:09 crc kubenswrapper[4930]: I1012 05:50:09.029100 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-v95v2" podStartSLOduration=2.045386772 podStartE2EDuration="4.029085384s" podCreationTimestamp="2025-10-12 05:50:05 +0000 UTC" firstStartedPulling="2025-10-12 05:50:06.576210677 +0000 UTC m=+539.118312442" lastFinishedPulling="2025-10-12 05:50:08.559909289 +0000 UTC m=+541.102011054" observedRunningTime="2025-10-12 05:50:09.026290743 +0000 UTC m=+541.568392508" watchObservedRunningTime="2025-10-12 05:50:09.029085384 +0000 UTC m=+541.571187149"
Oct 12 05:50:11 crc kubenswrapper[4930]: I1012 05:50:11.036763 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-6ptlt" event={"ID":"a45f8116-ddf1-4733-bfb2-9fd498a04620","Type":"ContainerStarted","Data":"4e4e35a7d4a7f4e35786c39cbf98702dd70cb78200d434aa07d005aa9ecc34f5"}
Oct 12 05:50:11 crc kubenswrapper[4930]: I1012 05:50:11.037515 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-6ptlt"
Oct 12 05:50:11 crc kubenswrapper[4930]: I1012 05:50:11.039413 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-wvrgv" event={"ID":"e7b9505d-6b6f-4460-9f40-119155e4ba33","Type":"ContainerStarted","Data":"5559879773ccdd7f88178768b99a0792d925184e7d782169063f6dae7fdd2436"}
Oct 12 05:50:11 crc kubenswrapper[4930]: I1012 05:50:11.064614 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-6ptlt" podStartSLOduration=2.488026514 podStartE2EDuration="6.064582108s" podCreationTimestamp="2025-10-12 05:50:05 +0000 UTC" firstStartedPulling="2025-10-12 05:50:06.853700222 +0000 UTC m=+539.395802017" lastFinishedPulling="2025-10-12 05:50:10.430255846 +0000 UTC m=+542.972357611" observedRunningTime="2025-10-12 05:50:11.058000241 +0000 UTC m=+543.600102006" watchObservedRunningTime="2025-10-12 05:50:11.064582108 +0000 UTC m=+543.606683913"
Oct 12 05:50:11 crc kubenswrapper[4930]: I1012 05:50:11.079572 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-wvrgv" podStartSLOduration=2.525725079 podStartE2EDuration="6.079545257s" podCreationTimestamp="2025-10-12 05:50:05 +0000 UTC" firstStartedPulling="2025-10-12 05:50:06.879377583 +0000 UTC m=+539.421479358" lastFinishedPulling="2025-10-12 05:50:10.433197771 +0000 UTC m=+542.975299536" observedRunningTime="2025-10-12 05:50:11.073425732 +0000 UTC m=+543.615527527" watchObservedRunningTime="2025-10-12 05:50:11.079545257 +0000 UTC m=+543.621647062"
Oct 12 05:50:16 crc kubenswrapper[4930]: I1012 05:50:16.362782 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-6ptlt"
Oct 12 05:50:16 crc kubenswrapper[4930]: I1012 05:50:16.683589 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mdhw6"]
Oct 12 05:50:16 crc kubenswrapper[4930]: I1012 05:50:16.684229 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="ovn-controller" containerID="cri-o://c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef" gracePeriod=30
Oct 12 05:50:16 crc kubenswrapper[4930]: I1012 05:50:16.684283 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="sbdb" containerID="cri-o://dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666" gracePeriod=30
Oct 12 05:50:16 crc kubenswrapper[4930]: I1012 05:50:16.684341 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3" gracePeriod=30
Oct 12 05:50:16 crc kubenswrapper[4930]: I1012 05:50:16.684398 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="kube-rbac-proxy-node" containerID="cri-o://7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164" gracePeriod=30
Oct 12 05:50:16 crc kubenswrapper[4930]: I1012 05:50:16.684434 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="nbdb" containerID="cri-o://a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235" gracePeriod=30
Oct 12 05:50:16 crc kubenswrapper[4930]: I1012 05:50:16.684483 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="ovn-acl-logging" containerID="cri-o://b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a" gracePeriod=30
Oct 12 05:50:16 crc kubenswrapper[4930]: I1012 05:50:16.684490 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="northd" containerID="cri-o://1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914" gracePeriod=30
Oct 12 05:50:16 crc kubenswrapper[4930]: I1012 05:50:16.743676 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="ovnkube-controller" containerID="cri-o://63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba" gracePeriod=30
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.033325 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mdhw6_0add0fa2-092f-4dcc-8c72-82881564bf63/ovnkube-controller/3.log"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.036647 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mdhw6_0add0fa2-092f-4dcc-8c72-82881564bf63/ovn-acl-logging/0.log"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.037217 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mdhw6_0add0fa2-092f-4dcc-8c72-82881564bf63/ovn-controller/0.log"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.037813 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.105935 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mdhw6_0add0fa2-092f-4dcc-8c72-82881564bf63/ovnkube-controller/3.log"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.114864 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0add0fa2-092f-4dcc-8c72-82881564bf63-ovn-node-metrics-cert\") pod \"0add0fa2-092f-4dcc-8c72-82881564bf63\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") "
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.114949 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-run-openvswitch\") pod \"0add0fa2-092f-4dcc-8c72-82881564bf63\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") "
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.115001 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-var-lib-openvswitch\") pod \"0add0fa2-092f-4dcc-8c72-82881564bf63\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") "
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.115048 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-etc-openvswitch\") pod \"0add0fa2-092f-4dcc-8c72-82881564bf63\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") "
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.115086 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-node-log\") pod \"0add0fa2-092f-4dcc-8c72-82881564bf63\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") "
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.115129 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-kubelet\") pod \"0add0fa2-092f-4dcc-8c72-82881564bf63\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") "
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.115176 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0add0fa2-092f-4dcc-8c72-82881564bf63-env-overrides\") pod \"0add0fa2-092f-4dcc-8c72-82881564bf63\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") "
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.115206 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-var-lib-cni-networks-ovn-kubernetes\") pod \"0add0fa2-092f-4dcc-8c72-82881564bf63\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") "
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.115251 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0add0fa2-092f-4dcc-8c72-82881564bf63-ovnkube-script-lib\") pod \"0add0fa2-092f-4dcc-8c72-82881564bf63\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") "
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.115296 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-run-ovn\") pod \"0add0fa2-092f-4dcc-8c72-82881564bf63\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") "
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.115333 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-run-systemd\") pod \"0add0fa2-092f-4dcc-8c72-82881564bf63\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") "
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.115363 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-slash\") pod \"0add0fa2-092f-4dcc-8c72-82881564bf63\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") "
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.115399 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-run-netns\") pod \"0add0fa2-092f-4dcc-8c72-82881564bf63\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") "
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.115425 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-cni-netd\") pod \"0add0fa2-092f-4dcc-8c72-82881564bf63\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") "
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.115480 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-run-ovn-kubernetes\") pod \"0add0fa2-092f-4dcc-8c72-82881564bf63\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") "
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.115520 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7hpj\" (UniqueName: \"kubernetes.io/projected/0add0fa2-092f-4dcc-8c72-82881564bf63-kube-api-access-g7hpj\") pod \"0add0fa2-092f-4dcc-8c72-82881564bf63\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") "
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.115556 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-log-socket\") pod \"0add0fa2-092f-4dcc-8c72-82881564bf63\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") "
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.115583 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-cni-bin\") pod \"0add0fa2-092f-4dcc-8c72-82881564bf63\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") "
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.115620 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0add0fa2-092f-4dcc-8c72-82881564bf63-ovnkube-config\") pod \"0add0fa2-092f-4dcc-8c72-82881564bf63\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") "
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.115667 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-systemd-units\") pod \"0add0fa2-092f-4dcc-8c72-82881564bf63\" (UID: \"0add0fa2-092f-4dcc-8c72-82881564bf63\") "
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.116064 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "0add0fa2-092f-4dcc-8c72-82881564bf63" (UID: "0add0fa2-092f-4dcc-8c72-82881564bf63"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.116110 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0add0fa2-092f-4dcc-8c72-82881564bf63-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "0add0fa2-092f-4dcc-8c72-82881564bf63" (UID: "0add0fa2-092f-4dcc-8c72-82881564bf63"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.116122 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "0add0fa2-092f-4dcc-8c72-82881564bf63" (UID: "0add0fa2-092f-4dcc-8c72-82881564bf63"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.116802 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "0add0fa2-092f-4dcc-8c72-82881564bf63" (UID: "0add0fa2-092f-4dcc-8c72-82881564bf63"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.116823 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "0add0fa2-092f-4dcc-8c72-82881564bf63" (UID: "0add0fa2-092f-4dcc-8c72-82881564bf63"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.116884 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "0add0fa2-092f-4dcc-8c72-82881564bf63" (UID: "0add0fa2-092f-4dcc-8c72-82881564bf63"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.116933 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-slash" (OuterVolumeSpecName: "host-slash") pod "0add0fa2-092f-4dcc-8c72-82881564bf63" (UID: "0add0fa2-092f-4dcc-8c72-82881564bf63"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.116994 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "0add0fa2-092f-4dcc-8c72-82881564bf63" (UID: "0add0fa2-092f-4dcc-8c72-82881564bf63"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.117032 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-log-socket" (OuterVolumeSpecName: "log-socket") pod "0add0fa2-092f-4dcc-8c72-82881564bf63" (UID: "0add0fa2-092f-4dcc-8c72-82881564bf63"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.116991 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "0add0fa2-092f-4dcc-8c72-82881564bf63" (UID: "0add0fa2-092f-4dcc-8c72-82881564bf63"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.117066 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "0add0fa2-092f-4dcc-8c72-82881564bf63" (UID: "0add0fa2-092f-4dcc-8c72-82881564bf63"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.117152 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-node-log" (OuterVolumeSpecName: "node-log") pod "0add0fa2-092f-4dcc-8c72-82881564bf63" (UID: "0add0fa2-092f-4dcc-8c72-82881564bf63"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.117219 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "0add0fa2-092f-4dcc-8c72-82881564bf63" (UID: "0add0fa2-092f-4dcc-8c72-82881564bf63"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.117275 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "0add0fa2-092f-4dcc-8c72-82881564bf63" (UID: "0add0fa2-092f-4dcc-8c72-82881564bf63"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.117657 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mdhw6_0add0fa2-092f-4dcc-8c72-82881564bf63/ovn-acl-logging/0.log"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.118101 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0add0fa2-092f-4dcc-8c72-82881564bf63-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "0add0fa2-092f-4dcc-8c72-82881564bf63" (UID: "0add0fa2-092f-4dcc-8c72-82881564bf63"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.118165 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0add0fa2-092f-4dcc-8c72-82881564bf63-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "0add0fa2-092f-4dcc-8c72-82881564bf63" (UID: "0add0fa2-092f-4dcc-8c72-82881564bf63"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.118192 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "0add0fa2-092f-4dcc-8c72-82881564bf63" (UID: "0add0fa2-092f-4dcc-8c72-82881564bf63"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.121564 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mdhw6_0add0fa2-092f-4dcc-8c72-82881564bf63/ovn-controller/0.log"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122041 4930 generic.go:334] "Generic (PLEG): container finished" podID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerID="63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba" exitCode=0
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122078 4930 generic.go:334] "Generic (PLEG): container finished" podID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerID="dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666" exitCode=0
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122093 4930 generic.go:334] "Generic (PLEG): container finished" podID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerID="a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235" exitCode=0
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122109 4930 generic.go:334] "Generic (PLEG): container finished" podID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerID="1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914" exitCode=0
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122122 4930 generic.go:334] "Generic (PLEG): container finished" podID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerID="48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3" exitCode=0
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122135 4930 generic.go:334] "Generic (PLEG): container finished" podID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerID="7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164" exitCode=0
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122147 4930 generic.go:334] "Generic (PLEG): container finished" podID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerID="b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a" exitCode=143
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122160 4930 generic.go:334] "Generic (PLEG): container finished" podID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerID="c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef" exitCode=143
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122167 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0add0fa2-092f-4dcc-8c72-82881564bf63-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "0add0fa2-092f-4dcc-8c72-82881564bf63" (UID: "0add0fa2-092f-4dcc-8c72-82881564bf63"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122200 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122199 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerDied","Data":"63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122273 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerDied","Data":"dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122301 4930 scope.go:117] "RemoveContainer" containerID="63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122346 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerDied","Data":"a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122485 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerDied","Data":"1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122515 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerDied","Data":"48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122572 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerDied","Data":"7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122592 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122611 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122623 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122645 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122656 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122667 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122677 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122690 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122701 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122715 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerDied","Data":"b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122731 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122767 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122780 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122790 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122801 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122811 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122822 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122835 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122846 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122862 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122864 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0add0fa2-092f-4dcc-8c72-82881564bf63-kube-api-access-g7hpj" (OuterVolumeSpecName: "kube-api-access-g7hpj") pod "0add0fa2-092f-4dcc-8c72-82881564bf63" (UID: "0add0fa2-092f-4dcc-8c72-82881564bf63"). InnerVolumeSpecName "kube-api-access-g7hpj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122878 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerDied","Data":"c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122900 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122913 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122925 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122937 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122948 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122960 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122972 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.122988 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.123004 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.123020 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.123040 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdhw6" event={"ID":"0add0fa2-092f-4dcc-8c72-82881564bf63","Type":"ContainerDied","Data":"52edd9d2d5d32cd79f2107a4cbd70e8cfbbf8f5b8fe5c2fc1e883106efee0254"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.123059 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.123073 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.123084 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.123094 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.123106 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.123116 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.123127 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.123138 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.123149 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.123159 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.124339 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tq29s_c1c3ae9e-26ae-418f-b261-eabc4302b332/kube-multus/2.log"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.130390 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tq29s_c1c3ae9e-26ae-418f-b261-eabc4302b332/kube-multus/1.log"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.130568 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tq29s" event={"ID":"c1c3ae9e-26ae-418f-b261-eabc4302b332","Type":"ContainerDied","Data":"b8fe6fb418f70bbc6c0032da29f42a0654018e861808de8b636b2a9170c51464"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.130635 4930 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6697bf685228a837d46450ea695e9579b3bda51686b6b5bcd8338e66757476cd"}
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.131440 4930 scope.go:117] "RemoveContainer" containerID="b8fe6fb418f70bbc6c0032da29f42a0654018e861808de8b636b2a9170c51464"
Oct 12 05:50:17 crc kubenswrapper[4930]: E1012 05:50:17.131897 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-tq29s_openshift-multus(c1c3ae9e-26ae-418f-b261-eabc4302b332)\"" pod="openshift-multus/multus-tq29s" podUID="c1c3ae9e-26ae-418f-b261-eabc4302b332"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.130482 4930 generic.go:334] "Generic (PLEG): container finished" podID="c1c3ae9e-26ae-418f-b261-eabc4302b332" containerID="b8fe6fb418f70bbc6c0032da29f42a0654018e861808de8b636b2a9170c51464" exitCode=2
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.132870 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-npzmj"]
Oct 12 05:50:17 crc kubenswrapper[4930]: E1012 05:50:17.133210 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="ovn-acl-logging"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.133239 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="ovn-acl-logging"
Oct 12 05:50:17 crc kubenswrapper[4930]: E1012 05:50:17.133258 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="ovnkube-controller"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.133276 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="ovnkube-controller"
Oct 12 05:50:17 crc kubenswrapper[4930]: E1012 05:50:17.133294 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="kube-rbac-proxy-ovn-metrics"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.133306 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="kube-rbac-proxy-ovn-metrics"
Oct 12 05:50:17 crc kubenswrapper[4930]: E1012 05:50:17.133319 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="ovnkube-controller"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.133331 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="ovnkube-controller"
Oct 12 05:50:17 crc kubenswrapper[4930]: E1012 05:50:17.133349 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="ovnkube-controller"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.133361 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="ovnkube-controller"
Oct 12 05:50:17 crc kubenswrapper[4930]: E1012 05:50:17.133378 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="ovnkube-controller"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.133391 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="ovnkube-controller"
Oct 12 05:50:17 crc kubenswrapper[4930]: E1012 05:50:17.133415 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="nbdb"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.133427 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="nbdb"
Oct 12 05:50:17 crc kubenswrapper[4930]: E1012 05:50:17.133450 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="sbdb"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.133463 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="sbdb"
Oct 12 05:50:17 crc kubenswrapper[4930]: E1012 05:50:17.133480 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="kube-rbac-proxy-node"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.133491 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="kube-rbac-proxy-node"
Oct 12 05:50:17 crc kubenswrapper[4930]: E1012 05:50:17.133504 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="northd"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.133516 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="northd"
Oct 12 05:50:17 crc kubenswrapper[4930]: E1012 05:50:17.133531 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="kubecfg-setup"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.133543 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="kubecfg-setup"
Oct 12 05:50:17 crc kubenswrapper[4930]: E1012 05:50:17.133557 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="ovn-controller"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.133568 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="ovn-controller"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.133764 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="ovnkube-controller"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.133782 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="northd"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.133801 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="ovnkube-controller"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.133817 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="kube-rbac-proxy-node"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.133831 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="ovnkube-controller"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.133846 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="ovnkube-controller"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.133864 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="nbdb"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.133880 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="ovn-controller"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.133898 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="kube-rbac-proxy-ovn-metrics"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.133918 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="sbdb"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.133933 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="ovn-acl-logging"
Oct 12 05:50:17 crc kubenswrapper[4930]: E1012 05:50:17.134104 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="ovnkube-controller"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.134118 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="ovnkube-controller"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.134283 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" containerName="ovnkube-controller"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.137541 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.157414 4930 scope.go:117] "RemoveContainer" containerID="3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.160014 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "0add0fa2-092f-4dcc-8c72-82881564bf63" (UID: "0add0fa2-092f-4dcc-8c72-82881564bf63"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.186795 4930 scope.go:117] "RemoveContainer" containerID="dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.217723 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1b204b3c-0774-4bdc-8b8b-5357935c2b63-ovnkube-script-lib\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.217842 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-log-socket\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.217885 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-host-slash\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.217909 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-host-run-ovn-kubernetes\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.217931 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-host-cni-netd\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.217953 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-etc-openvswitch\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.217975 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-node-log\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.218009 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-run-openvswitch\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.218054 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1b204b3c-0774-4bdc-8b8b-5357935c2b63-ovn-node-metrics-cert\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.218086 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-host-kubelet\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.218119 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-var-lib-openvswitch\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.218153 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-systemd-units\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.218186 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-host-cni-bin\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.218354 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.218394 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv4b4\" (UniqueName: \"kubernetes.io/projected/1b204b3c-0774-4bdc-8b8b-5357935c2b63-kube-api-access-nv4b4\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.218760 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-run-systemd\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.219023 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-run-ovn\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.219102 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1b204b3c-0774-4bdc-8b8b-5357935c2b63-ovnkube-config\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.219177 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1b204b3c-0774-4bdc-8b8b-5357935c2b63-env-overrides\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.219422 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-host-run-netns\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.219494 4930 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.219510 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7hpj\" (UniqueName: \"kubernetes.io/projected/0add0fa2-092f-4dcc-8c72-82881564bf63-kube-api-access-g7hpj\") on node \"crc\" DevicePath \"\""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.219522 4930 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-log-socket\") on node \"crc\" DevicePath \"\""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.219534 4930 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-cni-bin\") on node \"crc\" DevicePath \"\""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.219546 4930 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0add0fa2-092f-4dcc-8c72-82881564bf63-ovnkube-config\") on node \"crc\" DevicePath \"\""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.219560 4930 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-systemd-units\") on node \"crc\" DevicePath \"\""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.219570 4930 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0add0fa2-092f-4dcc-8c72-82881564bf63-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.219583 4930 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-run-openvswitch\") on node \"crc\" DevicePath \"\""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.219624 4930 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.219636 4930 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.219650 4930 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-node-log\") on node \"crc\" DevicePath \"\""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.219662 4930 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-kubelet\") on node \"crc\" DevicePath \"\""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.219675 4930 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.219687 4930 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0add0fa2-092f-4dcc-8c72-82881564bf63-env-overrides\") on node \"crc\" DevicePath \"\""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.219699 4930 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0add0fa2-092f-4dcc-8c72-82881564bf63-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.219710 4930 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-run-ovn\") on node \"crc\" DevicePath \"\""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.219722 4930 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-run-systemd\") on node \"crc\" DevicePath \"\""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.219733 4930 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-slash\") on node \"crc\" DevicePath \"\""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.219764 4930 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-run-netns\") on node \"crc\" DevicePath \"\""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.219774 4930 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0add0fa2-092f-4dcc-8c72-82881564bf63-host-cni-netd\") on node \"crc\" DevicePath \"\""
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.225647 4930 scope.go:117] "RemoveContainer" containerID="a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.250903 4930 scope.go:117] "RemoveContainer" containerID="1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.266237 4930 scope.go:117] "RemoveContainer" containerID="48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.283191 4930 scope.go:117] "RemoveContainer" containerID="7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.297552 4930 scope.go:117] "RemoveContainer" containerID="b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.312369 4930 scope.go:117] "RemoveContainer" containerID="c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.320488 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-host-slash\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.320531 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-host-run-ovn-kubernetes\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.320555 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-host-cni-netd\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.320571 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-etc-openvswitch\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.320586 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-node-log\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.320607 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-run-openvswitch\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.320628 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1b204b3c-0774-4bdc-8b8b-5357935c2b63-ovn-node-metrics-cert\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj"
Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.320652 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName:
\"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-host-kubelet\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.320668 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-var-lib-openvswitch\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.320686 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-systemd-units\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.320700 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-host-cni-bin\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.320715 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.320752 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv4b4\" (UniqueName: \"kubernetes.io/projected/1b204b3c-0774-4bdc-8b8b-5357935c2b63-kube-api-access-nv4b4\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.320768 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-run-systemd\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.320783 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-run-ovn\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.320800 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1b204b3c-0774-4bdc-8b8b-5357935c2b63-ovnkube-config\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.320817 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/1b204b3c-0774-4bdc-8b8b-5357935c2b63-env-overrides\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.320841 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-host-run-netns\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.320859 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1b204b3c-0774-4bdc-8b8b-5357935c2b63-ovnkube-script-lib\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.320884 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-log-socket\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.320944 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-log-socket\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.320981 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-host-slash\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.321007 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-host-run-ovn-kubernetes\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.321029 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-host-cni-netd\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.321047 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-etc-openvswitch\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.321065 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-node-log\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.321086 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-run-openvswitch\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.321459 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-systemd-units\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.321504 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-host-cni-bin\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.321534 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-host-kubelet\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.321571 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.321579 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-var-lib-openvswitch\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.321604 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-host-run-netns\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.321637 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-run-systemd\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.321661 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1b204b3c-0774-4bdc-8b8b-5357935c2b63-run-ovn\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.322255 4930 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1b204b3c-0774-4bdc-8b8b-5357935c2b63-env-overrides\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.322385 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1b204b3c-0774-4bdc-8b8b-5357935c2b63-ovnkube-config\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.322402 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1b204b3c-0774-4bdc-8b8b-5357935c2b63-ovnkube-script-lib\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.325331 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1b204b3c-0774-4bdc-8b8b-5357935c2b63-ovn-node-metrics-cert\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.329443 4930 scope.go:117] "RemoveContainer" containerID="5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.339535 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv4b4\" (UniqueName: \"kubernetes.io/projected/1b204b3c-0774-4bdc-8b8b-5357935c2b63-kube-api-access-nv4b4\") pod \"ovnkube-node-npzmj\" (UID: \"1b204b3c-0774-4bdc-8b8b-5357935c2b63\") " pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.343936 4930 scope.go:117] "RemoveContainer" containerID="63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba" Oct 12 05:50:17 crc kubenswrapper[4930]: E1012 05:50:17.344491 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba\": container with ID starting with 63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba not found: ID does not exist" containerID="63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.344556 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba"} err="failed to get container status \"63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba\": rpc error: code = NotFound desc = could not find container \"63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba\": container with ID starting with 63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.344600 4930 scope.go:117] "RemoveContainer" containerID="3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7" Oct 12 05:50:17 crc kubenswrapper[4930]: E1012 05:50:17.345206 4930 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7\": container with ID starting with 3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7 not found: ID does not exist" containerID="3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.345245 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7"} err="failed to get container status \"3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7\": rpc error: code = NotFound desc = could not find container \"3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7\": container with ID starting with 3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7 not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.345270 4930 scope.go:117] "RemoveContainer" containerID="dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666" Oct 12 05:50:17 crc kubenswrapper[4930]: E1012 05:50:17.345625 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\": container with ID starting with dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666 not found: ID does not exist" containerID="dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.345654 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666"} err="failed to get container status \"dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\": rpc error: code = NotFound desc = could not find container \"dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\": container with ID starting with dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666 not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.345673 4930 scope.go:117] "RemoveContainer" containerID="a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235" Oct 12 05:50:17 crc kubenswrapper[4930]: E1012 05:50:17.346132 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\": container with ID starting with a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235 not found: ID does not exist" containerID="a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.346160 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235"} err="failed to get container status \"a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\": rpc error: code = NotFound desc = could not find container \"a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\": container with ID starting with a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235 not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.346206 4930 scope.go:117] "RemoveContainer" 
containerID="1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914" Oct 12 05:50:17 crc kubenswrapper[4930]: E1012 05:50:17.346537 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\": container with ID starting with 1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914 not found: ID does not exist" containerID="1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.346573 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914"} err="failed to get container status \"1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\": rpc error: code = NotFound desc = could not find container \"1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\": container with ID starting with 1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914 not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.346595 4930 scope.go:117] "RemoveContainer" containerID="48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3" Oct 12 05:50:17 crc kubenswrapper[4930]: E1012 05:50:17.346962 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\": container with ID starting with 48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3 not found: ID does not exist" containerID="48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.347011 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3"} err="failed to get container status \"48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\": rpc error: code = NotFound desc = could not find container \"48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\": container with ID starting with 48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3 not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.347043 4930 scope.go:117] "RemoveContainer" containerID="7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164" Oct 12 05:50:17 crc kubenswrapper[4930]: E1012 05:50:17.347416 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\": container with ID starting with 7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164 not found: ID does not exist" containerID="7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.347447 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164"} err="failed to get container status \"7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\": rpc error: code = NotFound desc = could not find container \"7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\": container with ID starting with 
7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164 not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.347466 4930 scope.go:117] "RemoveContainer" containerID="b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a" Oct 12 05:50:17 crc kubenswrapper[4930]: E1012 05:50:17.347788 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\": container with ID starting with b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a not found: ID does not exist" containerID="b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.347833 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a"} err="failed to get container status \"b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\": rpc error: code = NotFound desc = could not find container \"b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\": container with ID starting with b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.347860 4930 scope.go:117] "RemoveContainer" containerID="c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef" Oct 12 05:50:17 crc kubenswrapper[4930]: E1012 05:50:17.348144 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\": container with ID starting with c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef not found: ID does not exist" containerID="c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.348176 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef"} err="failed to get container status \"c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\": rpc error: code = NotFound desc = could not find container \"c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\": container with ID starting with c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.348215 4930 scope.go:117] "RemoveContainer" containerID="5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb" Oct 12 05:50:17 crc kubenswrapper[4930]: E1012 05:50:17.348501 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\": container with ID starting with 5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb not found: ID does not exist" containerID="5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.348552 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb"} err="failed to get container status \"5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\": rpc 
error: code = NotFound desc = could not find container \"5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\": container with ID starting with 5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.348572 4930 scope.go:117] "RemoveContainer" containerID="63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.348853 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba"} err="failed to get container status \"63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba\": rpc error: code = NotFound desc = could not find container \"63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba\": container with ID starting with 63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.348894 4930 scope.go:117] "RemoveContainer" containerID="3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.349277 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7"} err="failed to get container status \"3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7\": rpc error: code = NotFound desc = could not find container \"3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7\": container with ID starting with 3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7 not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.349305 4930 scope.go:117] "RemoveContainer" containerID="dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.349694 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666"} err="failed to get container status \"dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\": rpc error: code = NotFound desc = could not find container \"dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\": container with ID starting with dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666 not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.349720 4930 scope.go:117] "RemoveContainer" containerID="a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.350027 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235"} err="failed to get container status \"a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\": rpc error: code = NotFound desc = could not find container \"a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\": container with ID starting with a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235 not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.350055 4930 scope.go:117] "RemoveContainer" containerID="1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914" Oct 12 05:50:17 crc 
kubenswrapper[4930]: I1012 05:50:17.350312 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914"} err="failed to get container status \"1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\": rpc error: code = NotFound desc = could not find container \"1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\": container with ID starting with 1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914 not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.350339 4930 scope.go:117] "RemoveContainer" containerID="48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.350680 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3"} err="failed to get container status \"48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\": rpc error: code = NotFound desc = could not find container \"48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\": container with ID starting with 48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3 not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.350705 4930 scope.go:117] "RemoveContainer" containerID="7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.351014 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164"} err="failed to get container status \"7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\": rpc error: code = NotFound desc = could not find container \"7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\": container with ID starting with 7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164 not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.351038 4930 scope.go:117] "RemoveContainer" containerID="b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.351337 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a"} err="failed to get container status \"b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\": rpc error: code = NotFound desc = could not find container \"b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\": container with ID starting with b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.351381 4930 scope.go:117] "RemoveContainer" containerID="c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.351782 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef"} err="failed to get container status \"c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\": rpc error: code = NotFound desc = could not find container \"c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\": container with ID 
starting with c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.351810 4930 scope.go:117] "RemoveContainer" containerID="5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.352130 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb"} err="failed to get container status \"5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\": rpc error: code = NotFound desc = could not find container \"5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\": container with ID starting with 5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.352156 4930 scope.go:117] "RemoveContainer" containerID="63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.352445 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba"} err="failed to get container status \"63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba\": rpc error: code = NotFound desc = could not find container \"63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba\": container with ID starting with 63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.352482 4930 scope.go:117] "RemoveContainer" containerID="3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.353053 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7"} err="failed to get container status \"3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7\": rpc error: code = NotFound desc = could not find container \"3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7\": container with ID starting with 3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7 not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.353082 4930 scope.go:117] "RemoveContainer" containerID="dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.353342 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666"} err="failed to get container status \"dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\": rpc error: code = NotFound desc = could not find container \"dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\": container with ID starting with dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666 not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.353372 4930 scope.go:117] "RemoveContainer" containerID="a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.353623 4930 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235"} err="failed to get container status \"a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\": rpc error: code = NotFound desc = could not find container \"a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\": container with ID starting with a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235 not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.353659 4930 scope.go:117] "RemoveContainer" containerID="1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.354017 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914"} err="failed to get container status \"1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\": rpc error: code = NotFound desc = could not find container \"1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\": container with ID starting with 1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914 not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.354068 4930 scope.go:117] "RemoveContainer" containerID="48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.354542 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3"} err="failed to get container status \"48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\": rpc error: code = NotFound desc = could not find container \"48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\": container with ID starting with 48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3 not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.354571 4930 scope.go:117] "RemoveContainer" containerID="7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.354893 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164"} err="failed to get container status \"7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\": rpc error: code = NotFound desc = could not find container \"7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\": container with ID starting with 7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164 not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.354934 4930 scope.go:117] "RemoveContainer" containerID="b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.355378 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a"} err="failed to get container status \"b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\": rpc error: code = NotFound desc = could not find container \"b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\": container with ID starting with b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a not found: ID does not exist" Oct 
12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.355422 4930 scope.go:117] "RemoveContainer" containerID="c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.355811 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef"} err="failed to get container status \"c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\": rpc error: code = NotFound desc = could not find container \"c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\": container with ID starting with c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.355925 4930 scope.go:117] "RemoveContainer" containerID="5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.356309 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb"} err="failed to get container status \"5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\": rpc error: code = NotFound desc = could not find container \"5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\": container with ID starting with 5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.356339 4930 scope.go:117] "RemoveContainer" containerID="63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.356676 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba"} err="failed to get container status \"63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba\": rpc error: code = NotFound desc = could not find container \"63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba\": container with ID starting with 63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.356700 4930 scope.go:117] "RemoveContainer" containerID="3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.357072 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7"} err="failed to get container status \"3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7\": rpc error: code = NotFound desc = could not find container \"3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7\": container with ID starting with 3d13c51f393f32e647751366764fe45c6eb3dd586ba90fc528fce23cae8b16b7 not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.357105 4930 scope.go:117] "RemoveContainer" containerID="dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.357419 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666"} err="failed to get container status 
\"dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\": rpc error: code = NotFound desc = could not find container \"dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666\": container with ID starting with dcac9953a6ca0ac812cb0d1aed7ff1179eed01831f4af4a098f3e6938b5bc666 not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.357445 4930 scope.go:117] "RemoveContainer" containerID="a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.357769 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235"} err="failed to get container status \"a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\": rpc error: code = NotFound desc = could not find container \"a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235\": container with ID starting with a2cba08ba8fd74a521bdd005e51be52f80687a1ccf0f250d11b709c894be8235 not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.357815 4930 scope.go:117] "RemoveContainer" containerID="1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.358148 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914"} err="failed to get container status \"1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\": rpc error: code = NotFound desc = could not find container \"1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914\": container with ID starting with 1006beab015e55856b70b9bcb93bb81b541e6c593fae8fd366c361990d233914 not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.358166 4930 scope.go:117] "RemoveContainer" containerID="48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.358416 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3"} err="failed to get container status \"48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\": rpc error: code = NotFound desc = could not find container \"48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3\": container with ID starting with 48a4295347cdfdf0864fbc2b102904f73e19cc33021fc5a82eed61255c4641d3 not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.358441 4930 scope.go:117] "RemoveContainer" containerID="7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.358729 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164"} err="failed to get container status \"7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\": rpc error: code = NotFound desc = could not find container \"7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164\": container with ID starting with 7495c144ce989bf10b08b199a7459ea85351b77941a3edcb0ef76a01b2e08164 not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.358796 4930 scope.go:117] "RemoveContainer" 
containerID="b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.359077 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a"} err="failed to get container status \"b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\": rpc error: code = NotFound desc = could not find container \"b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a\": container with ID starting with b17051b78f584f1199be3038a82d9e675d1b9fb4da939611577c0b05235a147a not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.359102 4930 scope.go:117] "RemoveContainer" containerID="c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.359393 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef"} err="failed to get container status \"c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\": rpc error: code = NotFound desc = could not find container \"c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef\": container with ID starting with c9898a7d7e617b5eb8eab6b6d9cb5d2ea92e8fbb0f4b1e0e60711e8f4a5e3aef not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.359432 4930 scope.go:117] "RemoveContainer" containerID="5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.359838 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb"} err="failed to get container status \"5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\": rpc error: code = NotFound desc = could not find container \"5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb\": container with ID starting with 5ea09ce63549c1b7b55111aebdfc89f37bd39e2e11cf5a4ff208168fdcf416cb not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.359864 4930 scope.go:117] "RemoveContainer" containerID="63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.360214 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba"} err="failed to get container status \"63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba\": rpc error: code = NotFound desc = could not find container \"63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba\": container with ID starting with 63db3e72cc84311fe4ad16d27280c33993ec32d15e656ea269fe9ed2f19426ba not found: ID does not exist" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.470442 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.474612 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mdhw6"] Oct 12 05:50:17 crc kubenswrapper[4930]: I1012 05:50:17.480639 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mdhw6"] Oct 12 05:50:18 crc kubenswrapper[4930]: I1012 05:50:18.144318 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0add0fa2-092f-4dcc-8c72-82881564bf63" path="/var/lib/kubelet/pods/0add0fa2-092f-4dcc-8c72-82881564bf63/volumes" Oct 12 05:50:18 crc kubenswrapper[4930]: I1012 05:50:18.147695 4930 generic.go:334] "Generic (PLEG): container finished" podID="1b204b3c-0774-4bdc-8b8b-5357935c2b63" containerID="5e1422105523df51a406d58d95b5da078e4e016981fb24810b25379a004b8dc7" exitCode=0 Oct 12 05:50:18 crc kubenswrapper[4930]: I1012 05:50:18.147768 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" event={"ID":"1b204b3c-0774-4bdc-8b8b-5357935c2b63","Type":"ContainerDied","Data":"5e1422105523df51a406d58d95b5da078e4e016981fb24810b25379a004b8dc7"} Oct 12 05:50:18 crc kubenswrapper[4930]: I1012 05:50:18.147803 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" event={"ID":"1b204b3c-0774-4bdc-8b8b-5357935c2b63","Type":"ContainerStarted","Data":"275ce5e537a15b44bd1c5cd0955e308072be1fe21db80bf8cab39768ae1c249d"} Oct 12 05:50:19 crc kubenswrapper[4930]: I1012 05:50:19.158057 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" event={"ID":"1b204b3c-0774-4bdc-8b8b-5357935c2b63","Type":"ContainerStarted","Data":"b0d65c2e27e5677006abd7c167743f3e5330767da3515cb08fd72a125fc75741"} Oct 12 05:50:19 crc kubenswrapper[4930]: I1012 05:50:19.158385 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" event={"ID":"1b204b3c-0774-4bdc-8b8b-5357935c2b63","Type":"ContainerStarted","Data":"e86f724bda1c9dca89ccecd8a2c5b9923d06d84f4f2e19fa4a84feee304be2de"} Oct 12 05:50:19 crc kubenswrapper[4930]: I1012 05:50:19.158406 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" event={"ID":"1b204b3c-0774-4bdc-8b8b-5357935c2b63","Type":"ContainerStarted","Data":"c35de48fa7fc0b8d357fd86b44c7ef3ea0b45994638992e0bb63923a64704f41"} Oct 12 05:50:19 crc kubenswrapper[4930]: I1012 05:50:19.158423 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" event={"ID":"1b204b3c-0774-4bdc-8b8b-5357935c2b63","Type":"ContainerStarted","Data":"3c5f8598d506c8f0f5d70dcf3b5699a997c455eb12d0f3acc348bb78a9570ad5"} Oct 12 05:50:19 crc kubenswrapper[4930]: I1012 05:50:19.158439 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" event={"ID":"1b204b3c-0774-4bdc-8b8b-5357935c2b63","Type":"ContainerStarted","Data":"49d37e29aa71a434c37b24f2e264455055e5e7388ab9c86360cbd9fb1d860738"} Oct 12 05:50:19 crc kubenswrapper[4930]: I1012 05:50:19.158458 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" event={"ID":"1b204b3c-0774-4bdc-8b8b-5357935c2b63","Type":"ContainerStarted","Data":"6bb5d3a8730bbe86a98bdae0393edda5eda102fa69c2222162b7ef95599d6bd4"} Oct 12 05:50:22 crc kubenswrapper[4930]: I1012 05:50:22.182557 4930 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" event={"ID":"1b204b3c-0774-4bdc-8b8b-5357935c2b63","Type":"ContainerStarted","Data":"9a38991a8cfd6bf7bfa5d6282a0c95e431fd8687bd64796a6f1f6626142b8415"} Oct 12 05:50:24 crc kubenswrapper[4930]: I1012 05:50:24.202053 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" event={"ID":"1b204b3c-0774-4bdc-8b8b-5357935c2b63","Type":"ContainerStarted","Data":"a38e1918ed9ff5f1513d57f93f342a3c0ac3ae9c4d10b29142769a55b494c34f"} Oct 12 05:50:24 crc kubenswrapper[4930]: I1012 05:50:24.202522 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:24 crc kubenswrapper[4930]: I1012 05:50:24.202691 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:24 crc kubenswrapper[4930]: I1012 05:50:24.202882 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:24 crc kubenswrapper[4930]: I1012 05:50:24.243781 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" podStartSLOduration=7.24375995 podStartE2EDuration="7.24375995s" podCreationTimestamp="2025-10-12 05:50:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:50:24.239357559 +0000 UTC m=+556.781459354" watchObservedRunningTime="2025-10-12 05:50:24.24375995 +0000 UTC m=+556.785861745" Oct 12 05:50:24 crc kubenswrapper[4930]: I1012 05:50:24.245511 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:24 crc kubenswrapper[4930]: I1012 05:50:24.246046 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:31 crc kubenswrapper[4930]: I1012 05:50:31.135905 4930 scope.go:117] "RemoveContainer" containerID="b8fe6fb418f70bbc6c0032da29f42a0654018e861808de8b636b2a9170c51464" Oct 12 05:50:31 crc kubenswrapper[4930]: E1012 05:50:31.136964 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-tq29s_openshift-multus(c1c3ae9e-26ae-418f-b261-eabc4302b332)\"" pod="openshift-multus/multus-tq29s" podUID="c1c3ae9e-26ae-418f-b261-eabc4302b332" Oct 12 05:50:33 crc kubenswrapper[4930]: I1012 05:50:33.669552 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 05:50:33 crc kubenswrapper[4930]: I1012 05:50:33.669632 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 05:50:33 crc kubenswrapper[4930]: I1012 05:50:33.669690 4930 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 05:50:33 crc kubenswrapper[4930]: I1012 05:50:33.670414 4930 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f9b4436f5dc6b5a5cc61a2d31420352c424bca4ed9a68e98bbb43f4e86026438"} pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 05:50:33 crc kubenswrapper[4930]: I1012 05:50:33.670501 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" containerID="cri-o://f9b4436f5dc6b5a5cc61a2d31420352c424bca4ed9a68e98bbb43f4e86026438" gracePeriod=600 Oct 12 05:50:34 crc kubenswrapper[4930]: I1012 05:50:34.275146 4930 generic.go:334] "Generic (PLEG): container finished" podID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerID="f9b4436f5dc6b5a5cc61a2d31420352c424bca4ed9a68e98bbb43f4e86026438" exitCode=0 Oct 12 05:50:34 crc kubenswrapper[4930]: I1012 05:50:34.275241 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerDied","Data":"f9b4436f5dc6b5a5cc61a2d31420352c424bca4ed9a68e98bbb43f4e86026438"} Oct 12 05:50:34 crc kubenswrapper[4930]: I1012 05:50:34.275620 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerStarted","Data":"8c4b30d4b3900fd24d77d876af5c72fe4eab21e88f5e73b0f78ab790ead4cf26"} Oct 12 05:50:34 crc kubenswrapper[4930]: I1012 05:50:34.275672 4930 scope.go:117] "RemoveContainer" containerID="7434a5664387f355e60869203e8ba53d5bb3922f3950f2fa26ad54baf1b367ce" Oct 12 05:50:42 crc kubenswrapper[4930]: I1012 05:50:42.136380 4930 scope.go:117] "RemoveContainer" containerID="b8fe6fb418f70bbc6c0032da29f42a0654018e861808de8b636b2a9170c51464" Oct 12 05:50:43 crc kubenswrapper[4930]: I1012 05:50:43.347694 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tq29s_c1c3ae9e-26ae-418f-b261-eabc4302b332/kube-multus/2.log" Oct 12 05:50:43 crc kubenswrapper[4930]: I1012 05:50:43.355325 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tq29s_c1c3ae9e-26ae-418f-b261-eabc4302b332/kube-multus/1.log" Oct 12 05:50:43 crc kubenswrapper[4930]: I1012 05:50:43.355400 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tq29s" event={"ID":"c1c3ae9e-26ae-418f-b261-eabc4302b332","Type":"ContainerStarted","Data":"d16425a437c49fa949b52564d9a51e4dee526f7a22db5d4e3b544f4bb2a7b86c"} Oct 12 05:50:43 crc kubenswrapper[4930]: I1012 05:50:43.664807 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq"] Oct 12 05:50:43 crc kubenswrapper[4930]: I1012 05:50:43.667019 4930 util.go:30] "No sandbox for pod can be found. 
Oct 12 05:50:43 crc kubenswrapper[4930]: I1012 05:50:43.667019 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq" Oct 12 05:50:43 crc kubenswrapper[4930]: I1012 05:50:43.669461 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 12 05:50:43 crc kubenswrapper[4930]: I1012 05:50:43.681401 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq"] Oct 12 05:50:43 crc kubenswrapper[4930]: I1012 05:50:43.813148 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg6hj\" (UniqueName: \"kubernetes.io/projected/f143a61d-685c-4ee5-b95c-f7fae137d0bc-kube-api-access-bg6hj\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq\" (UID: \"f143a61d-685c-4ee5-b95c-f7fae137d0bc\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq" Oct 12 05:50:43 crc kubenswrapper[4930]: I1012 05:50:43.813499 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f143a61d-685c-4ee5-b95c-f7fae137d0bc-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq\" (UID: \"f143a61d-685c-4ee5-b95c-f7fae137d0bc\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq" Oct 12 05:50:43 crc kubenswrapper[4930]: I1012 05:50:43.813611 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f143a61d-685c-4ee5-b95c-f7fae137d0bc-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq\" (UID: \"f143a61d-685c-4ee5-b95c-f7fae137d0bc\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq" Oct 12 05:50:43 crc kubenswrapper[4930]: I1012 05:50:43.914898 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f143a61d-685c-4ee5-b95c-f7fae137d0bc-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq\" (UID: \"f143a61d-685c-4ee5-b95c-f7fae137d0bc\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq" Oct 12 05:50:43 crc kubenswrapper[4930]: I1012 05:50:43.915048 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg6hj\" (UniqueName: \"kubernetes.io/projected/f143a61d-685c-4ee5-b95c-f7fae137d0bc-kube-api-access-bg6hj\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq\" (UID: \"f143a61d-685c-4ee5-b95c-f7fae137d0bc\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq" Oct 12 05:50:43 crc kubenswrapper[4930]: I1012 05:50:43.915173 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f143a61d-685c-4ee5-b95c-f7fae137d0bc-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq\" (UID: \"f143a61d-685c-4ee5-b95c-f7fae137d0bc\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq" Oct 12 05:50:43 crc kubenswrapper[4930]: I1012 05:50:43.915566 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/f143a61d-685c-4ee5-b95c-f7fae137d0bc-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq\" (UID: \"f143a61d-685c-4ee5-b95c-f7fae137d0bc\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq" Oct 12 05:50:43 crc kubenswrapper[4930]: I1012 05:50:43.915967 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f143a61d-685c-4ee5-b95c-f7fae137d0bc-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq\" (UID: \"f143a61d-685c-4ee5-b95c-f7fae137d0bc\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq" Oct 12 05:50:43 crc kubenswrapper[4930]: I1012 05:50:43.954237 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg6hj\" (UniqueName: \"kubernetes.io/projected/f143a61d-685c-4ee5-b95c-f7fae137d0bc-kube-api-access-bg6hj\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq\" (UID: \"f143a61d-685c-4ee5-b95c-f7fae137d0bc\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq" Oct 12 05:50:43 crc kubenswrapper[4930]: I1012 05:50:43.985683 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq" Oct 12 05:50:44 crc kubenswrapper[4930]: I1012 05:50:44.255037 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq"] Oct 12 05:50:44 crc kubenswrapper[4930]: I1012 05:50:44.365502 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq" event={"ID":"f143a61d-685c-4ee5-b95c-f7fae137d0bc","Type":"ContainerStarted","Data":"f06b5e7a5e5c1d9616537edfda20aabbb7f4d9f6d7aecd014103f7d914f35b21"} Oct 12 05:50:45 crc kubenswrapper[4930]: I1012 05:50:45.375727 4930 generic.go:334] "Generic (PLEG): container finished" podID="f143a61d-685c-4ee5-b95c-f7fae137d0bc" containerID="687a6ff8974d54f023039cfadbf74fa3a7cd48e53ffb890a91d77a22a1261938" exitCode=0 Oct 12 05:50:45 crc kubenswrapper[4930]: I1012 05:50:45.375833 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq" event={"ID":"f143a61d-685c-4ee5-b95c-f7fae137d0bc","Type":"ContainerDied","Data":"687a6ff8974d54f023039cfadbf74fa3a7cd48e53ffb890a91d77a22a1261938"} Oct 12 05:50:47 crc kubenswrapper[4930]: I1012 05:50:47.395130 4930 generic.go:334] "Generic (PLEG): container finished" podID="f143a61d-685c-4ee5-b95c-f7fae137d0bc" containerID="c79d971291c27b63de37036800cb3c1162687e2e97935589dccec284a5c738dd" exitCode=0 Oct 12 05:50:47 crc kubenswrapper[4930]: I1012 05:50:47.395219 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq" event={"ID":"f143a61d-685c-4ee5-b95c-f7fae137d0bc","Type":"ContainerDied","Data":"c79d971291c27b63de37036800cb3c1162687e2e97935589dccec284a5c738dd"} Oct 12 05:50:47 crc kubenswrapper[4930]: I1012 05:50:47.517036 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-npzmj" Oct 12 05:50:48 crc kubenswrapper[4930]: I1012 05:50:48.405786 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq" event={"ID":"f143a61d-685c-4ee5-b95c-f7fae137d0bc","Type":"ContainerStarted","Data":"4430d8d885ed4d8a68776ae51a8297362359610fed7ac202dd9a9fcf7ee3571e"} Oct 12 05:50:48 crc kubenswrapper[4930]: I1012 05:50:48.438638 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq" podStartSLOduration=4.281169205 podStartE2EDuration="5.438608748s" podCreationTimestamp="2025-10-12 05:50:43 +0000 UTC" firstStartedPulling="2025-10-12 05:50:45.378391095 +0000 UTC m=+577.920492890" lastFinishedPulling="2025-10-12 05:50:46.535830678 +0000 UTC m=+579.077932433" observedRunningTime="2025-10-12 05:50:48.433983061 +0000 UTC m=+580.976084876" watchObservedRunningTime="2025-10-12 05:50:48.438608748 +0000 UTC m=+580.980710553" Oct 12 05:50:49 crc kubenswrapper[4930]: I1012 05:50:49.415817 4930 generic.go:334] "Generic (PLEG): container finished" podID="f143a61d-685c-4ee5-b95c-f7fae137d0bc" containerID="4430d8d885ed4d8a68776ae51a8297362359610fed7ac202dd9a9fcf7ee3571e" exitCode=0 Oct 12 05:50:49 crc kubenswrapper[4930]: I1012 05:50:49.415893 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq" event={"ID":"f143a61d-685c-4ee5-b95c-f7fae137d0bc","Type":"ContainerDied","Data":"4430d8d885ed4d8a68776ae51a8297362359610fed7ac202dd9a9fcf7ee3571e"} Oct 12 05:50:50 crc kubenswrapper[4930]: I1012 05:50:50.776591 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq" Oct 12 05:50:50 crc kubenswrapper[4930]: I1012 05:50:50.917567 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f143a61d-685c-4ee5-b95c-f7fae137d0bc-util\") pod \"f143a61d-685c-4ee5-b95c-f7fae137d0bc\" (UID: \"f143a61d-685c-4ee5-b95c-f7fae137d0bc\") " Oct 12 05:50:50 crc kubenswrapper[4930]: I1012 05:50:50.917658 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg6hj\" (UniqueName: \"kubernetes.io/projected/f143a61d-685c-4ee5-b95c-f7fae137d0bc-kube-api-access-bg6hj\") pod \"f143a61d-685c-4ee5-b95c-f7fae137d0bc\" (UID: \"f143a61d-685c-4ee5-b95c-f7fae137d0bc\") " Oct 12 05:50:50 crc kubenswrapper[4930]: I1012 05:50:50.917791 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f143a61d-685c-4ee5-b95c-f7fae137d0bc-bundle\") pod \"f143a61d-685c-4ee5-b95c-f7fae137d0bc\" (UID: \"f143a61d-685c-4ee5-b95c-f7fae137d0bc\") " Oct 12 05:50:50 crc kubenswrapper[4930]: I1012 05:50:50.921860 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f143a61d-685c-4ee5-b95c-f7fae137d0bc-bundle" (OuterVolumeSpecName: "bundle") pod "f143a61d-685c-4ee5-b95c-f7fae137d0bc" (UID: "f143a61d-685c-4ee5-b95c-f7fae137d0bc"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:50:50 crc kubenswrapper[4930]: I1012 05:50:50.926487 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f143a61d-685c-4ee5-b95c-f7fae137d0bc-kube-api-access-bg6hj" (OuterVolumeSpecName: "kube-api-access-bg6hj") pod "f143a61d-685c-4ee5-b95c-f7fae137d0bc" (UID: "f143a61d-685c-4ee5-b95c-f7fae137d0bc"). InnerVolumeSpecName "kube-api-access-bg6hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:50:50 crc kubenswrapper[4930]: I1012 05:50:50.947426 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f143a61d-685c-4ee5-b95c-f7fae137d0bc-util" (OuterVolumeSpecName: "util") pod "f143a61d-685c-4ee5-b95c-f7fae137d0bc" (UID: "f143a61d-685c-4ee5-b95c-f7fae137d0bc"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:50:51 crc kubenswrapper[4930]: I1012 05:50:51.020535 4930 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f143a61d-685c-4ee5-b95c-f7fae137d0bc-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:50:51 crc kubenswrapper[4930]: I1012 05:50:51.020595 4930 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f143a61d-685c-4ee5-b95c-f7fae137d0bc-util\") on node \"crc\" DevicePath \"\"" Oct 12 05:50:51 crc kubenswrapper[4930]: I1012 05:50:51.020614 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg6hj\" (UniqueName: \"kubernetes.io/projected/f143a61d-685c-4ee5-b95c-f7fae137d0bc-kube-api-access-bg6hj\") on node \"crc\" DevicePath \"\"" Oct 12 05:50:51 crc kubenswrapper[4930]: I1012 05:50:51.434166 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq" event={"ID":"f143a61d-685c-4ee5-b95c-f7fae137d0bc","Type":"ContainerDied","Data":"f06b5e7a5e5c1d9616537edfda20aabbb7f4d9f6d7aecd014103f7d914f35b21"} Oct 12 05:50:51 crc kubenswrapper[4930]: I1012 05:50:51.434234 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f06b5e7a5e5c1d9616537edfda20aabbb7f4d9f6d7aecd014103f7d914f35b21" Oct 12 05:50:51 crc kubenswrapper[4930]: I1012 05:50:51.434288 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq" Oct 12 05:51:00 crc kubenswrapper[4930]: I1012 05:51:00.825663 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-v4gns"] Oct 12 05:51:00 crc kubenswrapper[4930]: E1012 05:51:00.826392 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f143a61d-685c-4ee5-b95c-f7fae137d0bc" containerName="extract" Oct 12 05:51:00 crc kubenswrapper[4930]: I1012 05:51:00.826409 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="f143a61d-685c-4ee5-b95c-f7fae137d0bc" containerName="extract" Oct 12 05:51:00 crc kubenswrapper[4930]: E1012 05:51:00.826423 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f143a61d-685c-4ee5-b95c-f7fae137d0bc" containerName="util" Oct 12 05:51:00 crc kubenswrapper[4930]: I1012 05:51:00.826432 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="f143a61d-685c-4ee5-b95c-f7fae137d0bc" containerName="util" Oct 12 05:51:00 crc kubenswrapper[4930]: E1012 05:51:00.826445 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f143a61d-685c-4ee5-b95c-f7fae137d0bc" containerName="pull" Oct 12 05:51:00 crc kubenswrapper[4930]: I1012 05:51:00.826454 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="f143a61d-685c-4ee5-b95c-f7fae137d0bc" containerName="pull" Oct 12 05:51:00 crc kubenswrapper[4930]: I1012 05:51:00.826578 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="f143a61d-685c-4ee5-b95c-f7fae137d0bc" containerName="extract" Oct 12 05:51:00 crc kubenswrapper[4930]: I1012 05:51:00.826943 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-v4gns" Oct 12 05:51:00 crc kubenswrapper[4930]: I1012 05:51:00.830852 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-m8kqh" Oct 12 05:51:00 crc kubenswrapper[4930]: I1012 05:51:00.834213 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 12 05:51:00 crc kubenswrapper[4930]: I1012 05:51:00.847694 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 12 05:51:00 crc kubenswrapper[4930]: I1012 05:51:00.851579 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-v4gns"] Oct 12 05:51:00 crc kubenswrapper[4930]: I1012 05:51:00.890582 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s96s\" (UniqueName: \"kubernetes.io/projected/36fc8dbb-9393-4ad2-a475-7933483eef61-kube-api-access-8s96s\") pod \"obo-prometheus-operator-7c8cf85677-v4gns\" (UID: \"36fc8dbb-9393-4ad2-a475-7933483eef61\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-v4gns" Oct 12 05:51:00 crc kubenswrapper[4930]: I1012 05:51:00.962467 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-576bf"] Oct 12 05:51:00 crc kubenswrapper[4930]: I1012 05:51:00.963072 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-576bf" Oct 12 05:51:00 crc kubenswrapper[4930]: I1012 05:51:00.965281 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 12 05:51:00 crc kubenswrapper[4930]: I1012 05:51:00.965431 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-tgj7g" Oct 12 05:51:00 crc kubenswrapper[4930]: I1012 05:51:00.991984 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s96s\" (UniqueName: \"kubernetes.io/projected/36fc8dbb-9393-4ad2-a475-7933483eef61-kube-api-access-8s96s\") pod \"obo-prometheus-operator-7c8cf85677-v4gns\" (UID: \"36fc8dbb-9393-4ad2-a475-7933483eef61\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-v4gns" Oct 12 05:51:00 crc kubenswrapper[4930]: I1012 05:51:00.995455 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-fbq82"] Oct 12 05:51:00 crc kubenswrapper[4930]: I1012 05:51:00.996624 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-fbq82" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.002563 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-576bf"] Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.017099 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-fbq82"] Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.033832 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s96s\" (UniqueName: \"kubernetes.io/projected/36fc8dbb-9393-4ad2-a475-7933483eef61-kube-api-access-8s96s\") pod \"obo-prometheus-operator-7c8cf85677-v4gns\" (UID: \"36fc8dbb-9393-4ad2-a475-7933483eef61\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-v4gns" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.093175 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1373019b-d435-40a9-8551-11fb23298b48-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-78c9d5c958-576bf\" (UID: \"1373019b-d435-40a9-8551-11fb23298b48\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-576bf" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.093233 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1373019b-d435-40a9-8551-11fb23298b48-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-78c9d5c958-576bf\" (UID: \"1373019b-d435-40a9-8551-11fb23298b48\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-576bf" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.093297 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/828b5980-9511-4284-a5f4-4197242fef19-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-78c9d5c958-fbq82\" (UID: 
\"828b5980-9511-4284-a5f4-4197242fef19\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-fbq82" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.093327 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/828b5980-9511-4284-a5f4-4197242fef19-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-78c9d5c958-fbq82\" (UID: \"828b5980-9511-4284-a5f4-4197242fef19\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-fbq82" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.142976 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-v4gns" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.169628 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-tfp9w"] Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.170282 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-tfp9w" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.174130 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.174382 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-kpqsx" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.194221 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1373019b-d435-40a9-8551-11fb23298b48-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-78c9d5c958-576bf\" (UID: \"1373019b-d435-40a9-8551-11fb23298b48\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-576bf" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.194322 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/828b5980-9511-4284-a5f4-4197242fef19-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-78c9d5c958-fbq82\" (UID: \"828b5980-9511-4284-a5f4-4197242fef19\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-fbq82" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.194353 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/828b5980-9511-4284-a5f4-4197242fef19-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-78c9d5c958-fbq82\" (UID: \"828b5980-9511-4284-a5f4-4197242fef19\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-fbq82" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.194419 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1373019b-d435-40a9-8551-11fb23298b48-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-78c9d5c958-576bf\" (UID: \"1373019b-d435-40a9-8551-11fb23298b48\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-576bf" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.199760 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/1373019b-d435-40a9-8551-11fb23298b48-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-78c9d5c958-576bf\" (UID: \"1373019b-d435-40a9-8551-11fb23298b48\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-576bf" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.204092 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/828b5980-9511-4284-a5f4-4197242fef19-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-78c9d5c958-fbq82\" (UID: \"828b5980-9511-4284-a5f4-4197242fef19\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-fbq82" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.209692 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1373019b-d435-40a9-8551-11fb23298b48-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-78c9d5c958-576bf\" (UID: \"1373019b-d435-40a9-8551-11fb23298b48\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-576bf" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.210077 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/828b5980-9511-4284-a5f4-4197242fef19-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-78c9d5c958-fbq82\" (UID: \"828b5980-9511-4284-a5f4-4197242fef19\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-fbq82" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.212948 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-tfp9w"] Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.276227 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-576bf" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.295441 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd5c9b39-afff-488d-9bc7-875c644a6975-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-tfp9w\" (UID: \"cd5c9b39-afff-488d-9bc7-875c644a6975\") " pod="openshift-operators/observability-operator-cc5f78dfc-tfp9w" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.295541 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hztwq\" (UniqueName: \"kubernetes.io/projected/cd5c9b39-afff-488d-9bc7-875c644a6975-kube-api-access-hztwq\") pod \"observability-operator-cc5f78dfc-tfp9w\" (UID: \"cd5c9b39-afff-488d-9bc7-875c644a6975\") " pod="openshift-operators/observability-operator-cc5f78dfc-tfp9w" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.317657 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-fbq82" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.351299 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-sxjbx"] Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.354240 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-sxjbx" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.356713 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-vdpmw" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.373058 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-sxjbx"] Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.397226 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hztwq\" (UniqueName: \"kubernetes.io/projected/cd5c9b39-afff-488d-9bc7-875c644a6975-kube-api-access-hztwq\") pod \"observability-operator-cc5f78dfc-tfp9w\" (UID: \"cd5c9b39-afff-488d-9bc7-875c644a6975\") " pod="openshift-operators/observability-operator-cc5f78dfc-tfp9w" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.397267 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd5c9b39-afff-488d-9bc7-875c644a6975-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-tfp9w\" (UID: \"cd5c9b39-afff-488d-9bc7-875c644a6975\") " pod="openshift-operators/observability-operator-cc5f78dfc-tfp9w" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.397301 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmwtf\" (UniqueName: \"kubernetes.io/projected/71965cf6-b3c6-4e30-8771-eaad927fcc46-kube-api-access-wmwtf\") pod \"perses-operator-54bc95c9fb-sxjbx\" (UID: \"71965cf6-b3c6-4e30-8771-eaad927fcc46\") " pod="openshift-operators/perses-operator-54bc95c9fb-sxjbx" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.397331 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/71965cf6-b3c6-4e30-8771-eaad927fcc46-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-sxjbx\" (UID: \"71965cf6-b3c6-4e30-8771-eaad927fcc46\") " pod="openshift-operators/perses-operator-54bc95c9fb-sxjbx" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.408641 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd5c9b39-afff-488d-9bc7-875c644a6975-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-tfp9w\" (UID: \"cd5c9b39-afff-488d-9bc7-875c644a6975\") " pod="openshift-operators/observability-operator-cc5f78dfc-tfp9w" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.421719 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hztwq\" (UniqueName: \"kubernetes.io/projected/cd5c9b39-afff-488d-9bc7-875c644a6975-kube-api-access-hztwq\") pod \"observability-operator-cc5f78dfc-tfp9w\" (UID: \"cd5c9b39-afff-488d-9bc7-875c644a6975\") " pod="openshift-operators/observability-operator-cc5f78dfc-tfp9w" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.501662 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmwtf\" (UniqueName: \"kubernetes.io/projected/71965cf6-b3c6-4e30-8771-eaad927fcc46-kube-api-access-wmwtf\") pod \"perses-operator-54bc95c9fb-sxjbx\" (UID: \"71965cf6-b3c6-4e30-8771-eaad927fcc46\") " pod="openshift-operators/perses-operator-54bc95c9fb-sxjbx" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.501712 4930 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/71965cf6-b3c6-4e30-8771-eaad927fcc46-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-sxjbx\" (UID: \"71965cf6-b3c6-4e30-8771-eaad927fcc46\") " pod="openshift-operators/perses-operator-54bc95c9fb-sxjbx" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.502618 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/71965cf6-b3c6-4e30-8771-eaad927fcc46-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-sxjbx\" (UID: \"71965cf6-b3c6-4e30-8771-eaad927fcc46\") " pod="openshift-operators/perses-operator-54bc95c9fb-sxjbx" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.512703 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-v4gns"] Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.519338 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmwtf\" (UniqueName: \"kubernetes.io/projected/71965cf6-b3c6-4e30-8771-eaad927fcc46-kube-api-access-wmwtf\") pod \"perses-operator-54bc95c9fb-sxjbx\" (UID: \"71965cf6-b3c6-4e30-8771-eaad927fcc46\") " pod="openshift-operators/perses-operator-54bc95c9fb-sxjbx" Oct 12 05:51:01 crc kubenswrapper[4930]: W1012 05:51:01.520881 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36fc8dbb_9393_4ad2_a475_7933483eef61.slice/crio-dd67e554e9b219727e097e7ada23cca9932556c5de1185a4f7019ea7353c6194 WatchSource:0}: Error finding container dd67e554e9b219727e097e7ada23cca9932556c5de1185a4f7019ea7353c6194: Status 404 returned error can't find the container with id dd67e554e9b219727e097e7ada23cca9932556c5de1185a4f7019ea7353c6194 Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.563900 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-tfp9w" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.580338 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-fbq82"] Oct 12 05:51:01 crc kubenswrapper[4930]: W1012 05:51:01.585501 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod828b5980_9511_4284_a5f4_4197242fef19.slice/crio-5832b38f8555f438aeccf5ba681d32089efa8a523f8e2409640cce8d6a99f9b4 WatchSource:0}: Error finding container 5832b38f8555f438aeccf5ba681d32089efa8a523f8e2409640cce8d6a99f9b4: Status 404 returned error can't find the container with id 5832b38f8555f438aeccf5ba681d32089efa8a523f8e2409640cce8d6a99f9b4 Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.694117 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-sxjbx" Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.774805 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-tfp9w"] Oct 12 05:51:01 crc kubenswrapper[4930]: W1012 05:51:01.793914 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd5c9b39_afff_488d_9bc7_875c644a6975.slice/crio-fcc10dc3925f073d8c1f07d9c41f5fe43e51f85b06b1b60c8a41cc1b42682409 WatchSource:0}: Error finding container fcc10dc3925f073d8c1f07d9c41f5fe43e51f85b06b1b60c8a41cc1b42682409: Status 404 returned error can't find the container with id fcc10dc3925f073d8c1f07d9c41f5fe43e51f85b06b1b60c8a41cc1b42682409 Oct 12 05:51:01 crc kubenswrapper[4930]: I1012 05:51:01.814633 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-576bf"] Oct 12 05:51:01 crc kubenswrapper[4930]: W1012 05:51:01.827929 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1373019b_d435_40a9_8551_11fb23298b48.slice/crio-e003d6d38f6a29538f9d7fd12dff6b4b964ad4fde58cd87e15e07e9b56c0027e WatchSource:0}: Error finding container e003d6d38f6a29538f9d7fd12dff6b4b964ad4fde58cd87e15e07e9b56c0027e: Status 404 returned error can't find the container with id e003d6d38f6a29538f9d7fd12dff6b4b964ad4fde58cd87e15e07e9b56c0027e Oct 12 05:51:02 crc kubenswrapper[4930]: I1012 05:51:02.148516 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-sxjbx"] Oct 12 05:51:02 crc kubenswrapper[4930]: W1012 05:51:02.151416 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71965cf6_b3c6_4e30_8771_eaad927fcc46.slice/crio-9ed25a1f50cdeb38e53e0d82a4683b2d15b243a9a63d9851972724e49e70b46c WatchSource:0}: Error finding container 9ed25a1f50cdeb38e53e0d82a4683b2d15b243a9a63d9851972724e49e70b46c: Status 404 returned error can't find the container with id 9ed25a1f50cdeb38e53e0d82a4683b2d15b243a9a63d9851972724e49e70b46c Oct 12 05:51:02 crc kubenswrapper[4930]: I1012 05:51:02.514926 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-tfp9w" event={"ID":"cd5c9b39-afff-488d-9bc7-875c644a6975","Type":"ContainerStarted","Data":"fcc10dc3925f073d8c1f07d9c41f5fe43e51f85b06b1b60c8a41cc1b42682409"} Oct 12 05:51:02 crc kubenswrapper[4930]: I1012 05:51:02.516154 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-v4gns" event={"ID":"36fc8dbb-9393-4ad2-a475-7933483eef61","Type":"ContainerStarted","Data":"dd67e554e9b219727e097e7ada23cca9932556c5de1185a4f7019ea7353c6194"} Oct 12 05:51:02 crc kubenswrapper[4930]: I1012 05:51:02.517127 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-fbq82" event={"ID":"828b5980-9511-4284-a5f4-4197242fef19","Type":"ContainerStarted","Data":"5832b38f8555f438aeccf5ba681d32089efa8a523f8e2409640cce8d6a99f9b4"} Oct 12 05:51:02 crc kubenswrapper[4930]: I1012 05:51:02.518127 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-sxjbx" 
event={"ID":"71965cf6-b3c6-4e30-8771-eaad927fcc46","Type":"ContainerStarted","Data":"9ed25a1f50cdeb38e53e0d82a4683b2d15b243a9a63d9851972724e49e70b46c"} Oct 12 05:51:02 crc kubenswrapper[4930]: I1012 05:51:02.519106 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-576bf" event={"ID":"1373019b-d435-40a9-8551-11fb23298b48","Type":"ContainerStarted","Data":"e003d6d38f6a29538f9d7fd12dff6b4b964ad4fde58cd87e15e07e9b56c0027e"} Oct 12 05:51:08 crc kubenswrapper[4930]: I1012 05:51:08.532502 4930 scope.go:117] "RemoveContainer" containerID="6697bf685228a837d46450ea695e9579b3bda51686b6b5bcd8338e66757476cd" Oct 12 05:51:16 crc kubenswrapper[4930]: I1012 05:51:16.638641 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-v4gns" event={"ID":"36fc8dbb-9393-4ad2-a475-7933483eef61","Type":"ContainerStarted","Data":"9a0d5a50d2511d85befcadca8658eb1fb25a47485ae752314362861743fb3d3b"} Oct 12 05:51:16 crc kubenswrapper[4930]: I1012 05:51:16.642037 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-fbq82" event={"ID":"828b5980-9511-4284-a5f4-4197242fef19","Type":"ContainerStarted","Data":"a3395e1ffb597a5c31c45e83a31a94284f0975f1e22ce3415bcf210732daec17"} Oct 12 05:51:16 crc kubenswrapper[4930]: I1012 05:51:16.644885 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tq29s_c1c3ae9e-26ae-418f-b261-eabc4302b332/kube-multus/2.log" Oct 12 05:51:16 crc kubenswrapper[4930]: I1012 05:51:16.646980 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-sxjbx" event={"ID":"71965cf6-b3c6-4e30-8771-eaad927fcc46","Type":"ContainerStarted","Data":"015fbbf10e2b2413cd3387138a0c8326450e4b19015a2e842c801fddd22d4844"} Oct 12 05:51:16 crc kubenswrapper[4930]: I1012 05:51:16.647088 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-sxjbx" Oct 12 05:51:16 crc kubenswrapper[4930]: I1012 05:51:16.649260 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-576bf" event={"ID":"1373019b-d435-40a9-8551-11fb23298b48","Type":"ContainerStarted","Data":"a5b8ad9e0364ef5d9274812374e5c14e716224138f021c2e1656fffb313c5f6b"} Oct 12 05:51:16 crc kubenswrapper[4930]: I1012 05:51:16.651086 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-tfp9w" event={"ID":"cd5c9b39-afff-488d-9bc7-875c644a6975","Type":"ContainerStarted","Data":"539a5750a02d206021809355efb108113e9985f6082d20592f4e90f4f2d4deb8"} Oct 12 05:51:16 crc kubenswrapper[4930]: I1012 05:51:16.651353 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-tfp9w" Oct 12 05:51:16 crc kubenswrapper[4930]: I1012 05:51:16.654159 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-tfp9w" Oct 12 05:51:16 crc kubenswrapper[4930]: I1012 05:51:16.669553 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-v4gns" podStartSLOduration=2.061803743 podStartE2EDuration="16.669536174s" podCreationTimestamp="2025-10-12 05:51:00 +0000 UTC" 
firstStartedPulling="2025-10-12 05:51:01.526159975 +0000 UTC m=+594.068261740" lastFinishedPulling="2025-10-12 05:51:16.133892406 +0000 UTC m=+608.675994171" observedRunningTime="2025-10-12 05:51:16.663676967 +0000 UTC m=+609.205778742" watchObservedRunningTime="2025-10-12 05:51:16.669536174 +0000 UTC m=+609.211637949" Oct 12 05:51:16 crc kubenswrapper[4930]: I1012 05:51:16.698302 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-tfp9w" podStartSLOduration=1.303527987 podStartE2EDuration="15.698282986s" podCreationTimestamp="2025-10-12 05:51:01 +0000 UTC" firstStartedPulling="2025-10-12 05:51:01.800317074 +0000 UTC m=+594.342418839" lastFinishedPulling="2025-10-12 05:51:16.195072073 +0000 UTC m=+608.737173838" observedRunningTime="2025-10-12 05:51:16.69764425 +0000 UTC m=+609.239746025" watchObservedRunningTime="2025-10-12 05:51:16.698282986 +0000 UTC m=+609.240384761" Oct 12 05:51:16 crc kubenswrapper[4930]: I1012 05:51:16.730856 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-576bf" podStartSLOduration=2.427422029 podStartE2EDuration="16.730841004s" podCreationTimestamp="2025-10-12 05:51:00 +0000 UTC" firstStartedPulling="2025-10-12 05:51:01.830510652 +0000 UTC m=+594.372612417" lastFinishedPulling="2025-10-12 05:51:16.133929607 +0000 UTC m=+608.676031392" observedRunningTime="2025-10-12 05:51:16.726186558 +0000 UTC m=+609.268288333" watchObservedRunningTime="2025-10-12 05:51:16.730841004 +0000 UTC m=+609.272942779" Oct 12 05:51:16 crc kubenswrapper[4930]: I1012 05:51:16.781188 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-78c9d5c958-fbq82" podStartSLOduration=2.234713739 podStartE2EDuration="16.781172279s" podCreationTimestamp="2025-10-12 05:51:00 +0000 UTC" firstStartedPulling="2025-10-12 05:51:01.589816415 +0000 UTC m=+594.131918180" lastFinishedPulling="2025-10-12 05:51:16.136274965 +0000 UTC m=+608.678376720" observedRunningTime="2025-10-12 05:51:16.772167813 +0000 UTC m=+609.314269588" watchObservedRunningTime="2025-10-12 05:51:16.781172279 +0000 UTC m=+609.323274044" Oct 12 05:51:16 crc kubenswrapper[4930]: I1012 05:51:16.811354 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-sxjbx" podStartSLOduration=1.827451341 podStartE2EDuration="15.811339517s" podCreationTimestamp="2025-10-12 05:51:01 +0000 UTC" firstStartedPulling="2025-10-12 05:51:02.153474577 +0000 UTC m=+594.695576342" lastFinishedPulling="2025-10-12 05:51:16.137362743 +0000 UTC m=+608.679464518" observedRunningTime="2025-10-12 05:51:16.807246934 +0000 UTC m=+609.349348699" watchObservedRunningTime="2025-10-12 05:51:16.811339517 +0000 UTC m=+609.353441282" Oct 12 05:51:21 crc kubenswrapper[4930]: I1012 05:51:21.700454 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-sxjbx" Oct 12 05:51:40 crc kubenswrapper[4930]: I1012 05:51:40.178810 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk"] Oct 12 05:51:40 crc kubenswrapper[4930]: I1012 05:51:40.181057 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk" Oct 12 05:51:40 crc kubenswrapper[4930]: I1012 05:51:40.183770 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 12 05:51:40 crc kubenswrapper[4930]: I1012 05:51:40.193693 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk"] Oct 12 05:51:40 crc kubenswrapper[4930]: I1012 05:51:40.321779 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f643d105-d04a-48e4-b956-27fa864a1542-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk\" (UID: \"f643d105-d04a-48e4-b956-27fa864a1542\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk" Oct 12 05:51:40 crc kubenswrapper[4930]: I1012 05:51:40.321855 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f643d105-d04a-48e4-b956-27fa864a1542-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk\" (UID: \"f643d105-d04a-48e4-b956-27fa864a1542\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk" Oct 12 05:51:40 crc kubenswrapper[4930]: I1012 05:51:40.321912 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln8z8\" (UniqueName: \"kubernetes.io/projected/f643d105-d04a-48e4-b956-27fa864a1542-kube-api-access-ln8z8\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk\" (UID: \"f643d105-d04a-48e4-b956-27fa864a1542\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk" Oct 12 05:51:40 crc kubenswrapper[4930]: I1012 05:51:40.423849 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln8z8\" (UniqueName: \"kubernetes.io/projected/f643d105-d04a-48e4-b956-27fa864a1542-kube-api-access-ln8z8\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk\" (UID: \"f643d105-d04a-48e4-b956-27fa864a1542\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk" Oct 12 05:51:40 crc kubenswrapper[4930]: I1012 05:51:40.423975 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f643d105-d04a-48e4-b956-27fa864a1542-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk\" (UID: \"f643d105-d04a-48e4-b956-27fa864a1542\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk" Oct 12 05:51:40 crc kubenswrapper[4930]: I1012 05:51:40.424006 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f643d105-d04a-48e4-b956-27fa864a1542-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk\" (UID: \"f643d105-d04a-48e4-b956-27fa864a1542\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk" Oct 12 05:51:40 crc kubenswrapper[4930]: I1012 05:51:40.424770 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/f643d105-d04a-48e4-b956-27fa864a1542-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk\" (UID: \"f643d105-d04a-48e4-b956-27fa864a1542\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk" Oct 12 05:51:40 crc kubenswrapper[4930]: I1012 05:51:40.424789 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f643d105-d04a-48e4-b956-27fa864a1542-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk\" (UID: \"f643d105-d04a-48e4-b956-27fa864a1542\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk" Oct 12 05:51:40 crc kubenswrapper[4930]: I1012 05:51:40.446991 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln8z8\" (UniqueName: \"kubernetes.io/projected/f643d105-d04a-48e4-b956-27fa864a1542-kube-api-access-ln8z8\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk\" (UID: \"f643d105-d04a-48e4-b956-27fa864a1542\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk" Oct 12 05:51:40 crc kubenswrapper[4930]: I1012 05:51:40.500851 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk" Oct 12 05:51:40 crc kubenswrapper[4930]: I1012 05:51:40.771505 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk"] Oct 12 05:51:40 crc kubenswrapper[4930]: I1012 05:51:40.828295 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk" event={"ID":"f643d105-d04a-48e4-b956-27fa864a1542","Type":"ContainerStarted","Data":"6d2c6386ed279308d156fb1926953b5750a12592271cb2d9ca9939938f9c5152"} Oct 12 05:51:41 crc kubenswrapper[4930]: I1012 05:51:41.838368 4930 generic.go:334] "Generic (PLEG): container finished" podID="f643d105-d04a-48e4-b956-27fa864a1542" containerID="f799d2b9305dfa4be0c8f8644caa14c193c173085d549a588c55df00c7a152e9" exitCode=0 Oct 12 05:51:41 crc kubenswrapper[4930]: I1012 05:51:41.838478 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk" event={"ID":"f643d105-d04a-48e4-b956-27fa864a1542","Type":"ContainerDied","Data":"f799d2b9305dfa4be0c8f8644caa14c193c173085d549a588c55df00c7a152e9"} Oct 12 05:51:44 crc kubenswrapper[4930]: I1012 05:51:44.868237 4930 generic.go:334] "Generic (PLEG): container finished" podID="f643d105-d04a-48e4-b956-27fa864a1542" containerID="1160c7582810bb3d11c7b72c6aca7777e0b6467c3d05c1e7bee7a5ba4698de4e" exitCode=0 Oct 12 05:51:44 crc kubenswrapper[4930]: I1012 05:51:44.868313 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk" event={"ID":"f643d105-d04a-48e4-b956-27fa864a1542","Type":"ContainerDied","Data":"1160c7582810bb3d11c7b72c6aca7777e0b6467c3d05c1e7bee7a5ba4698de4e"} Oct 12 05:51:45 crc kubenswrapper[4930]: I1012 05:51:45.881448 4930 generic.go:334] "Generic (PLEG): container finished" podID="f643d105-d04a-48e4-b956-27fa864a1542" containerID="3672683ca0ec83a44fde6a6d6c4c03956a9454a6c2fe20b0410670339e35824e" exitCode=0 Oct 12 05:51:45 crc kubenswrapper[4930]: I1012 
05:51:45.881512 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk" event={"ID":"f643d105-d04a-48e4-b956-27fa864a1542","Type":"ContainerDied","Data":"3672683ca0ec83a44fde6a6d6c4c03956a9454a6c2fe20b0410670339e35824e"} Oct 12 05:51:47 crc kubenswrapper[4930]: I1012 05:51:47.225660 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk" Oct 12 05:51:47 crc kubenswrapper[4930]: I1012 05:51:47.325421 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f643d105-d04a-48e4-b956-27fa864a1542-util\") pod \"f643d105-d04a-48e4-b956-27fa864a1542\" (UID: \"f643d105-d04a-48e4-b956-27fa864a1542\") " Oct 12 05:51:47 crc kubenswrapper[4930]: I1012 05:51:47.325496 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln8z8\" (UniqueName: \"kubernetes.io/projected/f643d105-d04a-48e4-b956-27fa864a1542-kube-api-access-ln8z8\") pod \"f643d105-d04a-48e4-b956-27fa864a1542\" (UID: \"f643d105-d04a-48e4-b956-27fa864a1542\") " Oct 12 05:51:47 crc kubenswrapper[4930]: I1012 05:51:47.325618 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f643d105-d04a-48e4-b956-27fa864a1542-bundle\") pod \"f643d105-d04a-48e4-b956-27fa864a1542\" (UID: \"f643d105-d04a-48e4-b956-27fa864a1542\") " Oct 12 05:51:47 crc kubenswrapper[4930]: I1012 05:51:47.326522 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f643d105-d04a-48e4-b956-27fa864a1542-bundle" (OuterVolumeSpecName: "bundle") pod "f643d105-d04a-48e4-b956-27fa864a1542" (UID: "f643d105-d04a-48e4-b956-27fa864a1542"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:51:47 crc kubenswrapper[4930]: I1012 05:51:47.333168 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f643d105-d04a-48e4-b956-27fa864a1542-kube-api-access-ln8z8" (OuterVolumeSpecName: "kube-api-access-ln8z8") pod "f643d105-d04a-48e4-b956-27fa864a1542" (UID: "f643d105-d04a-48e4-b956-27fa864a1542"). InnerVolumeSpecName "kube-api-access-ln8z8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:51:47 crc kubenswrapper[4930]: I1012 05:51:47.337981 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f643d105-d04a-48e4-b956-27fa864a1542-util" (OuterVolumeSpecName: "util") pod "f643d105-d04a-48e4-b956-27fa864a1542" (UID: "f643d105-d04a-48e4-b956-27fa864a1542"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:51:47 crc kubenswrapper[4930]: I1012 05:51:47.426651 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ln8z8\" (UniqueName: \"kubernetes.io/projected/f643d105-d04a-48e4-b956-27fa864a1542-kube-api-access-ln8z8\") on node \"crc\" DevicePath \"\"" Oct 12 05:51:47 crc kubenswrapper[4930]: I1012 05:51:47.426706 4930 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f643d105-d04a-48e4-b956-27fa864a1542-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:51:47 crc kubenswrapper[4930]: I1012 05:51:47.426718 4930 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f643d105-d04a-48e4-b956-27fa864a1542-util\") on node \"crc\" DevicePath \"\"" Oct 12 05:51:47 crc kubenswrapper[4930]: I1012 05:51:47.901935 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk" event={"ID":"f643d105-d04a-48e4-b956-27fa864a1542","Type":"ContainerDied","Data":"6d2c6386ed279308d156fb1926953b5750a12592271cb2d9ca9939938f9c5152"} Oct 12 05:51:47 crc kubenswrapper[4930]: I1012 05:51:47.901994 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d2c6386ed279308d156fb1926953b5750a12592271cb2d9ca9939938f9c5152" Oct 12 05:51:47 crc kubenswrapper[4930]: I1012 05:51:47.902091 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk" Oct 12 05:51:51 crc kubenswrapper[4930]: I1012 05:51:51.774251 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-gznzl"] Oct 12 05:51:51 crc kubenswrapper[4930]: E1012 05:51:51.774604 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f643d105-d04a-48e4-b956-27fa864a1542" containerName="extract" Oct 12 05:51:51 crc kubenswrapper[4930]: I1012 05:51:51.774623 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="f643d105-d04a-48e4-b956-27fa864a1542" containerName="extract" Oct 12 05:51:51 crc kubenswrapper[4930]: E1012 05:51:51.774647 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f643d105-d04a-48e4-b956-27fa864a1542" containerName="util" Oct 12 05:51:51 crc kubenswrapper[4930]: I1012 05:51:51.774654 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="f643d105-d04a-48e4-b956-27fa864a1542" containerName="util" Oct 12 05:51:51 crc kubenswrapper[4930]: E1012 05:51:51.774674 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f643d105-d04a-48e4-b956-27fa864a1542" containerName="pull" Oct 12 05:51:51 crc kubenswrapper[4930]: I1012 05:51:51.774680 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="f643d105-d04a-48e4-b956-27fa864a1542" containerName="pull" Oct 12 05:51:51 crc kubenswrapper[4930]: I1012 05:51:51.774803 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="f643d105-d04a-48e4-b956-27fa864a1542" containerName="extract" Oct 12 05:51:51 crc kubenswrapper[4930]: I1012 05:51:51.775304 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-gznzl" Oct 12 05:51:51 crc kubenswrapper[4930]: I1012 05:51:51.777836 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-kr9jd" Oct 12 05:51:51 crc kubenswrapper[4930]: I1012 05:51:51.778714 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 12 05:51:51 crc kubenswrapper[4930]: I1012 05:51:51.779826 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 12 05:51:51 crc kubenswrapper[4930]: I1012 05:51:51.793122 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-gznzl"] Oct 12 05:51:51 crc kubenswrapper[4930]: I1012 05:51:51.895471 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffszd\" (UniqueName: \"kubernetes.io/projected/0920866d-8bd7-49ce-81f7-5aa6b77e9198-kube-api-access-ffszd\") pod \"nmstate-operator-858ddd8f98-gznzl\" (UID: \"0920866d-8bd7-49ce-81f7-5aa6b77e9198\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-gznzl" Oct 12 05:51:51 crc kubenswrapper[4930]: I1012 05:51:51.997011 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffszd\" (UniqueName: \"kubernetes.io/projected/0920866d-8bd7-49ce-81f7-5aa6b77e9198-kube-api-access-ffszd\") pod \"nmstate-operator-858ddd8f98-gznzl\" (UID: \"0920866d-8bd7-49ce-81f7-5aa6b77e9198\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-gznzl" Oct 12 05:51:52 crc kubenswrapper[4930]: I1012 05:51:52.033606 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffszd\" (UniqueName: \"kubernetes.io/projected/0920866d-8bd7-49ce-81f7-5aa6b77e9198-kube-api-access-ffszd\") pod \"nmstate-operator-858ddd8f98-gznzl\" (UID: \"0920866d-8bd7-49ce-81f7-5aa6b77e9198\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-gznzl" Oct 12 05:51:52 crc kubenswrapper[4930]: I1012 05:51:52.101482 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-gznzl" Oct 12 05:51:52 crc kubenswrapper[4930]: I1012 05:51:52.396083 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-gznzl"] Oct 12 05:51:52 crc kubenswrapper[4930]: I1012 05:51:52.945384 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-gznzl" event={"ID":"0920866d-8bd7-49ce-81f7-5aa6b77e9198","Type":"ContainerStarted","Data":"5ab8c2fdbe8c0f0db179072ae774ed081c38e0d17cfcb02e0c010fc112c72ae3"} Oct 12 05:51:55 crc kubenswrapper[4930]: I1012 05:51:55.971377 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-gznzl" event={"ID":"0920866d-8bd7-49ce-81f7-5aa6b77e9198","Type":"ContainerStarted","Data":"5ceab302c00732c716bcba1b5c3ff654bf2e74656702f0dc2b9ad6c46e2ad244"} Oct 12 05:51:55 crc kubenswrapper[4930]: I1012 05:51:55.996234 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-gznzl" podStartSLOduration=2.5511719189999997 podStartE2EDuration="4.996217272s" podCreationTimestamp="2025-10-12 05:51:51 +0000 UTC" firstStartedPulling="2025-10-12 05:51:52.393831099 +0000 UTC m=+644.935932874" lastFinishedPulling="2025-10-12 05:51:54.838876462 +0000 UTC m=+647.380978227" observedRunningTime="2025-10-12 05:51:55.992361325 +0000 UTC m=+648.534463100" watchObservedRunningTime="2025-10-12 05:51:55.996217272 +0000 UTC m=+648.538319047" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.504927 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-gtfdw"] Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.506204 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gtfdw" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.509708 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-dkdls" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.515208 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-k92gs"] Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.516352 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k92gs" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.518930 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.522347 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-gtfdw"] Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.569885 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-k92gs"] Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.598182 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-mngll"] Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.599303 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-mngll" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.669298 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rzs9\" (UniqueName: \"kubernetes.io/projected/2cb896f0-cc6a-44bd-ae9e-801659d154f4-kube-api-access-5rzs9\") pod \"nmstate-metrics-fdff9cb8d-gtfdw\" (UID: \"2cb896f0-cc6a-44bd-ae9e-801659d154f4\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gtfdw" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.669343 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c75f4df3-9537-4a4b-9170-bee951bdb162-nmstate-lock\") pod \"nmstate-handler-mngll\" (UID: \"c75f4df3-9537-4a4b-9170-bee951bdb162\") " pod="openshift-nmstate/nmstate-handler-mngll" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.669364 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhwkz\" (UniqueName: \"kubernetes.io/projected/9a8ee773-dcea-477c-98ea-afc18295b1a0-kube-api-access-lhwkz\") pod \"nmstate-webhook-6cdbc54649-k92gs\" (UID: \"9a8ee773-dcea-477c-98ea-afc18295b1a0\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k92gs" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.669387 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmznx\" (UniqueName: \"kubernetes.io/projected/c75f4df3-9537-4a4b-9170-bee951bdb162-kube-api-access-zmznx\") pod \"nmstate-handler-mngll\" (UID: \"c75f4df3-9537-4a4b-9170-bee951bdb162\") " pod="openshift-nmstate/nmstate-handler-mngll" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.669575 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c75f4df3-9537-4a4b-9170-bee951bdb162-dbus-socket\") pod \"nmstate-handler-mngll\" (UID: \"c75f4df3-9537-4a4b-9170-bee951bdb162\") " pod="openshift-nmstate/nmstate-handler-mngll" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.669640 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c75f4df3-9537-4a4b-9170-bee951bdb162-ovs-socket\") pod \"nmstate-handler-mngll\" (UID: \"c75f4df3-9537-4a4b-9170-bee951bdb162\") " pod="openshift-nmstate/nmstate-handler-mngll" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.669673 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9a8ee773-dcea-477c-98ea-afc18295b1a0-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-k92gs\" (UID: \"9a8ee773-dcea-477c-98ea-afc18295b1a0\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k92gs" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.683378 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-54mmw"] Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.684152 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-54mmw" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.688721 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.689808 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-mkrwg" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.691240 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.694081 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-54mmw"] Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.770636 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rzs9\" (UniqueName: \"kubernetes.io/projected/2cb896f0-cc6a-44bd-ae9e-801659d154f4-kube-api-access-5rzs9\") pod \"nmstate-metrics-fdff9cb8d-gtfdw\" (UID: \"2cb896f0-cc6a-44bd-ae9e-801659d154f4\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gtfdw" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.770675 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c75f4df3-9537-4a4b-9170-bee951bdb162-nmstate-lock\") pod \"nmstate-handler-mngll\" (UID: \"c75f4df3-9537-4a4b-9170-bee951bdb162\") " pod="openshift-nmstate/nmstate-handler-mngll" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.770698 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhwkz\" (UniqueName: \"kubernetes.io/projected/9a8ee773-dcea-477c-98ea-afc18295b1a0-kube-api-access-lhwkz\") pod \"nmstate-webhook-6cdbc54649-k92gs\" (UID: \"9a8ee773-dcea-477c-98ea-afc18295b1a0\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k92gs" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.770722 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmznx\" (UniqueName: \"kubernetes.io/projected/c75f4df3-9537-4a4b-9170-bee951bdb162-kube-api-access-zmznx\") pod \"nmstate-handler-mngll\" (UID: \"c75f4df3-9537-4a4b-9170-bee951bdb162\") " pod="openshift-nmstate/nmstate-handler-mngll" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.770754 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssv97\" (UniqueName: \"kubernetes.io/projected/0d4f3b47-0a65-4b2d-837d-9e9e9efab38e-kube-api-access-ssv97\") pod \"nmstate-console-plugin-6b874cbd85-54mmw\" (UID: \"0d4f3b47-0a65-4b2d-837d-9e9e9efab38e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-54mmw" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.770775 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4f3b47-0a65-4b2d-837d-9e9e9efab38e-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-54mmw\" (UID: \"0d4f3b47-0a65-4b2d-837d-9e9e9efab38e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-54mmw" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.770819 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c75f4df3-9537-4a4b-9170-bee951bdb162-dbus-socket\") pod 
\"nmstate-handler-mngll\" (UID: \"c75f4df3-9537-4a4b-9170-bee951bdb162\") " pod="openshift-nmstate/nmstate-handler-mngll" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.770840 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c75f4df3-9537-4a4b-9170-bee951bdb162-ovs-socket\") pod \"nmstate-handler-mngll\" (UID: \"c75f4df3-9537-4a4b-9170-bee951bdb162\") " pod="openshift-nmstate/nmstate-handler-mngll" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.770860 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9a8ee773-dcea-477c-98ea-afc18295b1a0-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-k92gs\" (UID: \"9a8ee773-dcea-477c-98ea-afc18295b1a0\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k92gs" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.770882 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0d4f3b47-0a65-4b2d-837d-9e9e9efab38e-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-54mmw\" (UID: \"0d4f3b47-0a65-4b2d-837d-9e9e9efab38e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-54mmw" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.770926 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c75f4df3-9537-4a4b-9170-bee951bdb162-nmstate-lock\") pod \"nmstate-handler-mngll\" (UID: \"c75f4df3-9537-4a4b-9170-bee951bdb162\") " pod="openshift-nmstate/nmstate-handler-mngll" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.770959 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c75f4df3-9537-4a4b-9170-bee951bdb162-ovs-socket\") pod \"nmstate-handler-mngll\" (UID: \"c75f4df3-9537-4a4b-9170-bee951bdb162\") " pod="openshift-nmstate/nmstate-handler-mngll" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.771263 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c75f4df3-9537-4a4b-9170-bee951bdb162-dbus-socket\") pod \"nmstate-handler-mngll\" (UID: \"c75f4df3-9537-4a4b-9170-bee951bdb162\") " pod="openshift-nmstate/nmstate-handler-mngll" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.777573 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9a8ee773-dcea-477c-98ea-afc18295b1a0-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-k92gs\" (UID: \"9a8ee773-dcea-477c-98ea-afc18295b1a0\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k92gs" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.785764 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmznx\" (UniqueName: \"kubernetes.io/projected/c75f4df3-9537-4a4b-9170-bee951bdb162-kube-api-access-zmznx\") pod \"nmstate-handler-mngll\" (UID: \"c75f4df3-9537-4a4b-9170-bee951bdb162\") " pod="openshift-nmstate/nmstate-handler-mngll" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.800254 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhwkz\" (UniqueName: \"kubernetes.io/projected/9a8ee773-dcea-477c-98ea-afc18295b1a0-kube-api-access-lhwkz\") pod \"nmstate-webhook-6cdbc54649-k92gs\" (UID: 
\"9a8ee773-dcea-477c-98ea-afc18295b1a0\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k92gs" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.806585 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rzs9\" (UniqueName: \"kubernetes.io/projected/2cb896f0-cc6a-44bd-ae9e-801659d154f4-kube-api-access-5rzs9\") pod \"nmstate-metrics-fdff9cb8d-gtfdw\" (UID: \"2cb896f0-cc6a-44bd-ae9e-801659d154f4\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gtfdw" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.871591 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssv97\" (UniqueName: \"kubernetes.io/projected/0d4f3b47-0a65-4b2d-837d-9e9e9efab38e-kube-api-access-ssv97\") pod \"nmstate-console-plugin-6b874cbd85-54mmw\" (UID: \"0d4f3b47-0a65-4b2d-837d-9e9e9efab38e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-54mmw" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.871639 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4f3b47-0a65-4b2d-837d-9e9e9efab38e-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-54mmw\" (UID: \"0d4f3b47-0a65-4b2d-837d-9e9e9efab38e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-54mmw" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.871712 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0d4f3b47-0a65-4b2d-837d-9e9e9efab38e-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-54mmw\" (UID: \"0d4f3b47-0a65-4b2d-837d-9e9e9efab38e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-54mmw" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.872649 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0d4f3b47-0a65-4b2d-837d-9e9e9efab38e-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-54mmw\" (UID: \"0d4f3b47-0a65-4b2d-837d-9e9e9efab38e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-54mmw" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.875774 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4f3b47-0a65-4b2d-837d-9e9e9efab38e-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-54mmw\" (UID: \"0d4f3b47-0a65-4b2d-837d-9e9e9efab38e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-54mmw" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.878989 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gtfdw" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.887409 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f6cbd8f8b-wcl5p"] Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.888662 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.892095 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k92gs" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.915574 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssv97\" (UniqueName: \"kubernetes.io/projected/0d4f3b47-0a65-4b2d-837d-9e9e9efab38e-kube-api-access-ssv97\") pod \"nmstate-console-plugin-6b874cbd85-54mmw\" (UID: \"0d4f3b47-0a65-4b2d-837d-9e9e9efab38e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-54mmw" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.919169 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f6cbd8f8b-wcl5p"] Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.927962 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-mngll" Oct 12 05:52:01 crc kubenswrapper[4930]: I1012 05:52:01.999327 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-54mmw" Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.019935 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mngll" event={"ID":"c75f4df3-9537-4a4b-9170-bee951bdb162","Type":"ContainerStarted","Data":"8cee7de520c35d10bd4b678f835e7a55730e4d6cff52a7f1ad3193165510f602"} Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.074391 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e8ca8a38-64b7-4a5d-ab5b-c12d803a678a-oauth-serving-cert\") pod \"console-f6cbd8f8b-wcl5p\" (UID: \"e8ca8a38-64b7-4a5d-ab5b-c12d803a678a\") " pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.074450 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcgmk\" (UniqueName: \"kubernetes.io/projected/e8ca8a38-64b7-4a5d-ab5b-c12d803a678a-kube-api-access-tcgmk\") pod \"console-f6cbd8f8b-wcl5p\" (UID: \"e8ca8a38-64b7-4a5d-ab5b-c12d803a678a\") " pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.074507 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e8ca8a38-64b7-4a5d-ab5b-c12d803a678a-console-serving-cert\") pod \"console-f6cbd8f8b-wcl5p\" (UID: \"e8ca8a38-64b7-4a5d-ab5b-c12d803a678a\") " pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.074550 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8ca8a38-64b7-4a5d-ab5b-c12d803a678a-trusted-ca-bundle\") pod \"console-f6cbd8f8b-wcl5p\" (UID: \"e8ca8a38-64b7-4a5d-ab5b-c12d803a678a\") " pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.074580 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e8ca8a38-64b7-4a5d-ab5b-c12d803a678a-console-config\") pod \"console-f6cbd8f8b-wcl5p\" (UID: \"e8ca8a38-64b7-4a5d-ab5b-c12d803a678a\") " pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.074606 4930 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e8ca8a38-64b7-4a5d-ab5b-c12d803a678a-service-ca\") pod \"console-f6cbd8f8b-wcl5p\" (UID: \"e8ca8a38-64b7-4a5d-ab5b-c12d803a678a\") " pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.074628 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e8ca8a38-64b7-4a5d-ab5b-c12d803a678a-console-oauth-config\") pod \"console-f6cbd8f8b-wcl5p\" (UID: \"e8ca8a38-64b7-4a5d-ab5b-c12d803a678a\") " pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.176001 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e8ca8a38-64b7-4a5d-ab5b-c12d803a678a-oauth-serving-cert\") pod \"console-f6cbd8f8b-wcl5p\" (UID: \"e8ca8a38-64b7-4a5d-ab5b-c12d803a678a\") " pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.176065 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcgmk\" (UniqueName: \"kubernetes.io/projected/e8ca8a38-64b7-4a5d-ab5b-c12d803a678a-kube-api-access-tcgmk\") pod \"console-f6cbd8f8b-wcl5p\" (UID: \"e8ca8a38-64b7-4a5d-ab5b-c12d803a678a\") " pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.176118 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e8ca8a38-64b7-4a5d-ab5b-c12d803a678a-console-serving-cert\") pod \"console-f6cbd8f8b-wcl5p\" (UID: \"e8ca8a38-64b7-4a5d-ab5b-c12d803a678a\") " pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.176155 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8ca8a38-64b7-4a5d-ab5b-c12d803a678a-trusted-ca-bundle\") pod \"console-f6cbd8f8b-wcl5p\" (UID: \"e8ca8a38-64b7-4a5d-ab5b-c12d803a678a\") " pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.176178 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e8ca8a38-64b7-4a5d-ab5b-c12d803a678a-console-config\") pod \"console-f6cbd8f8b-wcl5p\" (UID: \"e8ca8a38-64b7-4a5d-ab5b-c12d803a678a\") " pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.176195 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e8ca8a38-64b7-4a5d-ab5b-c12d803a678a-service-ca\") pod \"console-f6cbd8f8b-wcl5p\" (UID: \"e8ca8a38-64b7-4a5d-ab5b-c12d803a678a\") " pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.176211 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e8ca8a38-64b7-4a5d-ab5b-c12d803a678a-console-oauth-config\") pod \"console-f6cbd8f8b-wcl5p\" (UID: \"e8ca8a38-64b7-4a5d-ab5b-c12d803a678a\") " pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.177267 4930 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e8ca8a38-64b7-4a5d-ab5b-c12d803a678a-console-config\") pod \"console-f6cbd8f8b-wcl5p\" (UID: \"e8ca8a38-64b7-4a5d-ab5b-c12d803a678a\") " pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.178018 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e8ca8a38-64b7-4a5d-ab5b-c12d803a678a-service-ca\") pod \"console-f6cbd8f8b-wcl5p\" (UID: \"e8ca8a38-64b7-4a5d-ab5b-c12d803a678a\") " pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.178425 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8ca8a38-64b7-4a5d-ab5b-c12d803a678a-trusted-ca-bundle\") pod \"console-f6cbd8f8b-wcl5p\" (UID: \"e8ca8a38-64b7-4a5d-ab5b-c12d803a678a\") " pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.178488 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e8ca8a38-64b7-4a5d-ab5b-c12d803a678a-oauth-serving-cert\") pod \"console-f6cbd8f8b-wcl5p\" (UID: \"e8ca8a38-64b7-4a5d-ab5b-c12d803a678a\") " pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.180304 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e8ca8a38-64b7-4a5d-ab5b-c12d803a678a-console-oauth-config\") pod \"console-f6cbd8f8b-wcl5p\" (UID: \"e8ca8a38-64b7-4a5d-ab5b-c12d803a678a\") " pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.180685 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e8ca8a38-64b7-4a5d-ab5b-c12d803a678a-console-serving-cert\") pod \"console-f6cbd8f8b-wcl5p\" (UID: \"e8ca8a38-64b7-4a5d-ab5b-c12d803a678a\") " pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.191661 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-54mmw"] Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.194487 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcgmk\" (UniqueName: \"kubernetes.io/projected/e8ca8a38-64b7-4a5d-ab5b-c12d803a678a-kube-api-access-tcgmk\") pod \"console-f6cbd8f8b-wcl5p\" (UID: \"e8ca8a38-64b7-4a5d-ab5b-c12d803a678a\") " pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:02 crc kubenswrapper[4930]: W1012 05:52:02.199422 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d4f3b47_0a65_4b2d_837d_9e9e9efab38e.slice/crio-4623607da7cfb22c5914824f7dd021df5f8cd7d3901d5cc438f272cae007fad5 WatchSource:0}: Error finding container 4623607da7cfb22c5914824f7dd021df5f8cd7d3901d5cc438f272cae007fad5: Status 404 returned error can't find the container with id 4623607da7cfb22c5914824f7dd021df5f8cd7d3901d5cc438f272cae007fad5 Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.241814 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.302467 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-gtfdw"] Oct 12 05:52:02 crc kubenswrapper[4930]: W1012 05:52:02.311495 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cb896f0_cc6a_44bd_ae9e_801659d154f4.slice/crio-c0268827aa746683a75433d3842eaa53651fa0926a92cac17260787714d6f9f4 WatchSource:0}: Error finding container c0268827aa746683a75433d3842eaa53651fa0926a92cac17260787714d6f9f4: Status 404 returned error can't find the container with id c0268827aa746683a75433d3842eaa53651fa0926a92cac17260787714d6f9f4 Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.351452 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-k92gs"] Oct 12 05:52:02 crc kubenswrapper[4930]: W1012 05:52:02.354887 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a8ee773_dcea_477c_98ea_afc18295b1a0.slice/crio-5d4918c592dc5a2cd82f74b5cec11acb7e7714dafd2b9aef99df785ec26a55de WatchSource:0}: Error finding container 5d4918c592dc5a2cd82f74b5cec11acb7e7714dafd2b9aef99df785ec26a55de: Status 404 returned error can't find the container with id 5d4918c592dc5a2cd82f74b5cec11acb7e7714dafd2b9aef99df785ec26a55de Oct 12 05:52:02 crc kubenswrapper[4930]: I1012 05:52:02.439600 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f6cbd8f8b-wcl5p"] Oct 12 05:52:02 crc kubenswrapper[4930]: W1012 05:52:02.449549 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8ca8a38_64b7_4a5d_ab5b_c12d803a678a.slice/crio-b3b2771a6dfdb93de5a7262551e4428b9a82cd13cf7960c57eddc58b2865c942 WatchSource:0}: Error finding container b3b2771a6dfdb93de5a7262551e4428b9a82cd13cf7960c57eddc58b2865c942: Status 404 returned error can't find the container with id b3b2771a6dfdb93de5a7262551e4428b9a82cd13cf7960c57eddc58b2865c942 Oct 12 05:52:03 crc kubenswrapper[4930]: I1012 05:52:03.033951 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f6cbd8f8b-wcl5p" event={"ID":"e8ca8a38-64b7-4a5d-ab5b-c12d803a678a","Type":"ContainerStarted","Data":"2374eabe56fede45ff2d53633ac4edc23ee03eb4fdeb857778650d8219449959"} Oct 12 05:52:03 crc kubenswrapper[4930]: I1012 05:52:03.034565 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f6cbd8f8b-wcl5p" event={"ID":"e8ca8a38-64b7-4a5d-ab5b-c12d803a678a","Type":"ContainerStarted","Data":"b3b2771a6dfdb93de5a7262551e4428b9a82cd13cf7960c57eddc58b2865c942"} Oct 12 05:52:03 crc kubenswrapper[4930]: I1012 05:52:03.037778 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k92gs" event={"ID":"9a8ee773-dcea-477c-98ea-afc18295b1a0","Type":"ContainerStarted","Data":"5d4918c592dc5a2cd82f74b5cec11acb7e7714dafd2b9aef99df785ec26a55de"} Oct 12 05:52:03 crc kubenswrapper[4930]: I1012 05:52:03.040364 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-54mmw" event={"ID":"0d4f3b47-0a65-4b2d-837d-9e9e9efab38e","Type":"ContainerStarted","Data":"4623607da7cfb22c5914824f7dd021df5f8cd7d3901d5cc438f272cae007fad5"} Oct 12 05:52:03 crc kubenswrapper[4930]: I1012 
05:52:03.042108 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gtfdw" event={"ID":"2cb896f0-cc6a-44bd-ae9e-801659d154f4","Type":"ContainerStarted","Data":"c0268827aa746683a75433d3842eaa53651fa0926a92cac17260787714d6f9f4"} Oct 12 05:52:03 crc kubenswrapper[4930]: I1012 05:52:03.065120 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f6cbd8f8b-wcl5p" podStartSLOduration=2.065093665 podStartE2EDuration="2.065093665s" podCreationTimestamp="2025-10-12 05:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:52:03.062055298 +0000 UTC m=+655.604157103" watchObservedRunningTime="2025-10-12 05:52:03.065093665 +0000 UTC m=+655.607195460" Oct 12 05:52:06 crc kubenswrapper[4930]: I1012 05:52:06.071009 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k92gs" event={"ID":"9a8ee773-dcea-477c-98ea-afc18295b1a0","Type":"ContainerStarted","Data":"ce2ecb56974daaf0dfd0874ae5ecd1d725310d38a539fb72b155db8a362c64e3"} Oct 12 05:52:06 crc kubenswrapper[4930]: I1012 05:52:06.071915 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k92gs" Oct 12 05:52:06 crc kubenswrapper[4930]: I1012 05:52:06.074961 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-54mmw" event={"ID":"0d4f3b47-0a65-4b2d-837d-9e9e9efab38e","Type":"ContainerStarted","Data":"f6b1fde64a1fed2b6a3a038c2e291801bffaac801c4ffc67186fdb26af4ec61c"} Oct 12 05:52:06 crc kubenswrapper[4930]: I1012 05:52:06.078936 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mngll" event={"ID":"c75f4df3-9537-4a4b-9170-bee951bdb162","Type":"ContainerStarted","Data":"216a345a1898b21fe65d500b60c2599151a173b67a9ef22e449c97028e7b0a5e"} Oct 12 05:52:06 crc kubenswrapper[4930]: I1012 05:52:06.079346 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-mngll" Oct 12 05:52:06 crc kubenswrapper[4930]: I1012 05:52:06.080932 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gtfdw" event={"ID":"2cb896f0-cc6a-44bd-ae9e-801659d154f4","Type":"ContainerStarted","Data":"8a346b8d458852f2f66830434e66f42df5d245ecb63194ece82b7bebb9c2dc93"} Oct 12 05:52:06 crc kubenswrapper[4930]: I1012 05:52:06.090946 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k92gs" podStartSLOduration=2.072353579 podStartE2EDuration="5.090929067s" podCreationTimestamp="2025-10-12 05:52:01 +0000 UTC" firstStartedPulling="2025-10-12 05:52:02.357453974 +0000 UTC m=+654.899555739" lastFinishedPulling="2025-10-12 05:52:05.376029432 +0000 UTC m=+657.918131227" observedRunningTime="2025-10-12 05:52:06.087207483 +0000 UTC m=+658.629309258" watchObservedRunningTime="2025-10-12 05:52:06.090929067 +0000 UTC m=+658.633030862" Oct 12 05:52:06 crc kubenswrapper[4930]: I1012 05:52:06.114555 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-mngll" podStartSLOduration=1.7297343760000001 podStartE2EDuration="5.114539224s" podCreationTimestamp="2025-10-12 05:52:01 +0000 UTC" firstStartedPulling="2025-10-12 05:52:01.981284273 +0000 UTC m=+654.523386038" 
lastFinishedPulling="2025-10-12 05:52:05.366089081 +0000 UTC m=+657.908190886" observedRunningTime="2025-10-12 05:52:06.109691071 +0000 UTC m=+658.651792836" watchObservedRunningTime="2025-10-12 05:52:06.114539224 +0000 UTC m=+658.656640989" Oct 12 05:52:06 crc kubenswrapper[4930]: I1012 05:52:06.140639 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-54mmw" podStartSLOduration=1.9824543559999999 podStartE2EDuration="5.140614833s" podCreationTimestamp="2025-10-12 05:52:01 +0000 UTC" firstStartedPulling="2025-10-12 05:52:02.202368243 +0000 UTC m=+654.744470008" lastFinishedPulling="2025-10-12 05:52:05.36052868 +0000 UTC m=+657.902630485" observedRunningTime="2025-10-12 05:52:06.130512338 +0000 UTC m=+658.672614093" watchObservedRunningTime="2025-10-12 05:52:06.140614833 +0000 UTC m=+658.682716608" Oct 12 05:52:09 crc kubenswrapper[4930]: I1012 05:52:09.109771 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gtfdw" event={"ID":"2cb896f0-cc6a-44bd-ae9e-801659d154f4","Type":"ContainerStarted","Data":"608ea340d33a0f1631e5fe37bd17bdc6c831f4435e8e90cfa3b0549e34c8a20f"} Oct 12 05:52:09 crc kubenswrapper[4930]: I1012 05:52:09.146588 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-gtfdw" podStartSLOduration=2.428960414 podStartE2EDuration="8.146545912s" podCreationTimestamp="2025-10-12 05:52:01 +0000 UTC" firstStartedPulling="2025-10-12 05:52:02.313468662 +0000 UTC m=+654.855570427" lastFinishedPulling="2025-10-12 05:52:08.03105414 +0000 UTC m=+660.573155925" observedRunningTime="2025-10-12 05:52:09.139285208 +0000 UTC m=+661.681386983" watchObservedRunningTime="2025-10-12 05:52:09.146545912 +0000 UTC m=+661.688647707" Oct 12 05:52:11 crc kubenswrapper[4930]: I1012 05:52:11.962665 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-mngll" Oct 12 05:52:12 crc kubenswrapper[4930]: I1012 05:52:12.242972 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:12 crc kubenswrapper[4930]: I1012 05:52:12.243052 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:12 crc kubenswrapper[4930]: I1012 05:52:12.251049 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:13 crc kubenswrapper[4930]: I1012 05:52:13.152278 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f6cbd8f8b-wcl5p" Oct 12 05:52:13 crc kubenswrapper[4930]: I1012 05:52:13.235373 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-b6q56"] Oct 12 05:52:21 crc kubenswrapper[4930]: I1012 05:52:21.902567 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-k92gs" Oct 12 05:52:38 crc kubenswrapper[4930]: I1012 05:52:38.326277 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-b6q56" podUID="eb977a38-ef3b-4820-a364-ad16d6c857d5" containerName="console" containerID="cri-o://dcbd971c5f0222d6cf67950c14c24621a3d891093126d1370402ec38c51fe4b0" gracePeriod=15 Oct 12 05:52:38 crc kubenswrapper[4930]: I1012 05:52:38.848427 4930 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-b6q56_eb977a38-ef3b-4820-a364-ad16d6c857d5/console/0.log" Oct 12 05:52:38 crc kubenswrapper[4930]: I1012 05:52:38.848775 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:52:38 crc kubenswrapper[4930]: I1012 05:52:38.970272 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb977a38-ef3b-4820-a364-ad16d6c857d5-console-serving-cert\") pod \"eb977a38-ef3b-4820-a364-ad16d6c857d5\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " Oct 12 05:52:38 crc kubenswrapper[4930]: I1012 05:52:38.970373 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb977a38-ef3b-4820-a364-ad16d6c857d5-trusted-ca-bundle\") pod \"eb977a38-ef3b-4820-a364-ad16d6c857d5\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " Oct 12 05:52:38 crc kubenswrapper[4930]: I1012 05:52:38.970547 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb977a38-ef3b-4820-a364-ad16d6c857d5-console-config\") pod \"eb977a38-ef3b-4820-a364-ad16d6c857d5\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " Oct 12 05:52:38 crc kubenswrapper[4930]: I1012 05:52:38.970591 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb977a38-ef3b-4820-a364-ad16d6c857d5-console-oauth-config\") pod \"eb977a38-ef3b-4820-a364-ad16d6c857d5\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " Oct 12 05:52:38 crc kubenswrapper[4930]: I1012 05:52:38.970635 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb977a38-ef3b-4820-a364-ad16d6c857d5-service-ca\") pod \"eb977a38-ef3b-4820-a364-ad16d6c857d5\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " Oct 12 05:52:38 crc kubenswrapper[4930]: I1012 05:52:38.970699 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8ns4\" (UniqueName: \"kubernetes.io/projected/eb977a38-ef3b-4820-a364-ad16d6c857d5-kube-api-access-l8ns4\") pod \"eb977a38-ef3b-4820-a364-ad16d6c857d5\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " Oct 12 05:52:38 crc kubenswrapper[4930]: I1012 05:52:38.970820 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb977a38-ef3b-4820-a364-ad16d6c857d5-oauth-serving-cert\") pod \"eb977a38-ef3b-4820-a364-ad16d6c857d5\" (UID: \"eb977a38-ef3b-4820-a364-ad16d6c857d5\") " Oct 12 05:52:38 crc kubenswrapper[4930]: I1012 05:52:38.971654 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb977a38-ef3b-4820-a364-ad16d6c857d5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "eb977a38-ef3b-4820-a364-ad16d6c857d5" (UID: "eb977a38-ef3b-4820-a364-ad16d6c857d5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:52:38 crc kubenswrapper[4930]: I1012 05:52:38.971683 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb977a38-ef3b-4820-a364-ad16d6c857d5-service-ca" (OuterVolumeSpecName: "service-ca") pod "eb977a38-ef3b-4820-a364-ad16d6c857d5" (UID: "eb977a38-ef3b-4820-a364-ad16d6c857d5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:52:38 crc kubenswrapper[4930]: I1012 05:52:38.971700 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb977a38-ef3b-4820-a364-ad16d6c857d5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "eb977a38-ef3b-4820-a364-ad16d6c857d5" (UID: "eb977a38-ef3b-4820-a364-ad16d6c857d5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:52:38 crc kubenswrapper[4930]: I1012 05:52:38.971670 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb977a38-ef3b-4820-a364-ad16d6c857d5-console-config" (OuterVolumeSpecName: "console-config") pod "eb977a38-ef3b-4820-a364-ad16d6c857d5" (UID: "eb977a38-ef3b-4820-a364-ad16d6c857d5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:52:38 crc kubenswrapper[4930]: I1012 05:52:38.978781 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb977a38-ef3b-4820-a364-ad16d6c857d5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "eb977a38-ef3b-4820-a364-ad16d6c857d5" (UID: "eb977a38-ef3b-4820-a364-ad16d6c857d5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:52:38 crc kubenswrapper[4930]: I1012 05:52:38.978880 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb977a38-ef3b-4820-a364-ad16d6c857d5-kube-api-access-l8ns4" (OuterVolumeSpecName: "kube-api-access-l8ns4") pod "eb977a38-ef3b-4820-a364-ad16d6c857d5" (UID: "eb977a38-ef3b-4820-a364-ad16d6c857d5"). InnerVolumeSpecName "kube-api-access-l8ns4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:52:38 crc kubenswrapper[4930]: I1012 05:52:38.978955 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb977a38-ef3b-4820-a364-ad16d6c857d5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "eb977a38-ef3b-4820-a364-ad16d6c857d5" (UID: "eb977a38-ef3b-4820-a364-ad16d6c857d5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:52:39 crc kubenswrapper[4930]: I1012 05:52:39.073147 4930 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb977a38-ef3b-4820-a364-ad16d6c857d5-console-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:52:39 crc kubenswrapper[4930]: I1012 05:52:39.073185 4930 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb977a38-ef3b-4820-a364-ad16d6c857d5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:52:39 crc kubenswrapper[4930]: I1012 05:52:39.073197 4930 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb977a38-ef3b-4820-a364-ad16d6c857d5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 12 05:52:39 crc kubenswrapper[4930]: I1012 05:52:39.073208 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8ns4\" (UniqueName: \"kubernetes.io/projected/eb977a38-ef3b-4820-a364-ad16d6c857d5-kube-api-access-l8ns4\") on node \"crc\" DevicePath \"\"" Oct 12 05:52:39 crc kubenswrapper[4930]: I1012 05:52:39.073220 4930 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb977a38-ef3b-4820-a364-ad16d6c857d5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:52:39 crc kubenswrapper[4930]: I1012 05:52:39.073230 4930 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb977a38-ef3b-4820-a364-ad16d6c857d5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:52:39 crc kubenswrapper[4930]: I1012 05:52:39.073240 4930 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb977a38-ef3b-4820-a364-ad16d6c857d5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:52:39 crc kubenswrapper[4930]: I1012 05:52:39.373347 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-b6q56_eb977a38-ef3b-4820-a364-ad16d6c857d5/console/0.log" Oct 12 05:52:39 crc kubenswrapper[4930]: I1012 05:52:39.374615 4930 generic.go:334] "Generic (PLEG): container finished" podID="eb977a38-ef3b-4820-a364-ad16d6c857d5" containerID="dcbd971c5f0222d6cf67950c14c24621a3d891093126d1370402ec38c51fe4b0" exitCode=2 Oct 12 05:52:39 crc kubenswrapper[4930]: I1012 05:52:39.374747 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-b6q56" event={"ID":"eb977a38-ef3b-4820-a364-ad16d6c857d5","Type":"ContainerDied","Data":"dcbd971c5f0222d6cf67950c14c24621a3d891093126d1370402ec38c51fe4b0"} Oct 12 05:52:39 crc kubenswrapper[4930]: I1012 05:52:39.374866 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-b6q56" event={"ID":"eb977a38-ef3b-4820-a364-ad16d6c857d5","Type":"ContainerDied","Data":"4e3ebf72ea3463a27fee00b739b2d7b986e74d3a0d2211c35ebed95c3791200a"} Oct 12 05:52:39 crc kubenswrapper[4930]: I1012 05:52:39.374951 4930 scope.go:117] "RemoveContainer" containerID="dcbd971c5f0222d6cf67950c14c24621a3d891093126d1370402ec38c51fe4b0" Oct 12 05:52:39 crc kubenswrapper[4930]: I1012 05:52:39.375138 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-b6q56" Oct 12 05:52:39 crc kubenswrapper[4930]: I1012 05:52:39.398864 4930 scope.go:117] "RemoveContainer" containerID="dcbd971c5f0222d6cf67950c14c24621a3d891093126d1370402ec38c51fe4b0" Oct 12 05:52:39 crc kubenswrapper[4930]: E1012 05:52:39.399427 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcbd971c5f0222d6cf67950c14c24621a3d891093126d1370402ec38c51fe4b0\": container with ID starting with dcbd971c5f0222d6cf67950c14c24621a3d891093126d1370402ec38c51fe4b0 not found: ID does not exist" containerID="dcbd971c5f0222d6cf67950c14c24621a3d891093126d1370402ec38c51fe4b0" Oct 12 05:52:39 crc kubenswrapper[4930]: I1012 05:52:39.399469 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcbd971c5f0222d6cf67950c14c24621a3d891093126d1370402ec38c51fe4b0"} err="failed to get container status \"dcbd971c5f0222d6cf67950c14c24621a3d891093126d1370402ec38c51fe4b0\": rpc error: code = NotFound desc = could not find container \"dcbd971c5f0222d6cf67950c14c24621a3d891093126d1370402ec38c51fe4b0\": container with ID starting with dcbd971c5f0222d6cf67950c14c24621a3d891093126d1370402ec38c51fe4b0 not found: ID does not exist" Oct 12 05:52:39 crc kubenswrapper[4930]: I1012 05:52:39.404464 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-b6q56"] Oct 12 05:52:39 crc kubenswrapper[4930]: I1012 05:52:39.409160 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-b6q56"] Oct 12 05:52:40 crc kubenswrapper[4930]: I1012 05:52:40.148619 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb977a38-ef3b-4820-a364-ad16d6c857d5" path="/var/lib/kubelet/pods/eb977a38-ef3b-4820-a364-ad16d6c857d5/volumes" Oct 12 05:52:40 crc kubenswrapper[4930]: I1012 05:52:40.787185 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm"] Oct 12 05:52:40 crc kubenswrapper[4930]: E1012 05:52:40.788108 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb977a38-ef3b-4820-a364-ad16d6c857d5" containerName="console" Oct 12 05:52:40 crc kubenswrapper[4930]: I1012 05:52:40.788134 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb977a38-ef3b-4820-a364-ad16d6c857d5" containerName="console" Oct 12 05:52:40 crc kubenswrapper[4930]: I1012 05:52:40.788363 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb977a38-ef3b-4820-a364-ad16d6c857d5" containerName="console" Oct 12 05:52:40 crc kubenswrapper[4930]: I1012 05:52:40.790081 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm" Oct 12 05:52:40 crc kubenswrapper[4930]: I1012 05:52:40.793507 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 12 05:52:40 crc kubenswrapper[4930]: I1012 05:52:40.798254 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm"] Oct 12 05:52:40 crc kubenswrapper[4930]: I1012 05:52:40.899834 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e54ad199-e123-435c-a125-b662c577a86b-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm\" (UID: \"e54ad199-e123-435c-a125-b662c577a86b\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm" Oct 12 05:52:40 crc kubenswrapper[4930]: I1012 05:52:40.899938 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9kmf\" (UniqueName: \"kubernetes.io/projected/e54ad199-e123-435c-a125-b662c577a86b-kube-api-access-s9kmf\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm\" (UID: \"e54ad199-e123-435c-a125-b662c577a86b\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm" Oct 12 05:52:40 crc kubenswrapper[4930]: I1012 05:52:40.899988 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e54ad199-e123-435c-a125-b662c577a86b-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm\" (UID: \"e54ad199-e123-435c-a125-b662c577a86b\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm" Oct 12 05:52:41 crc kubenswrapper[4930]: I1012 05:52:41.002215 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e54ad199-e123-435c-a125-b662c577a86b-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm\" (UID: \"e54ad199-e123-435c-a125-b662c577a86b\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm" Oct 12 05:52:41 crc kubenswrapper[4930]: I1012 05:52:41.002297 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9kmf\" (UniqueName: \"kubernetes.io/projected/e54ad199-e123-435c-a125-b662c577a86b-kube-api-access-s9kmf\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm\" (UID: \"e54ad199-e123-435c-a125-b662c577a86b\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm" Oct 12 05:52:41 crc kubenswrapper[4930]: I1012 05:52:41.002342 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e54ad199-e123-435c-a125-b662c577a86b-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm\" (UID: \"e54ad199-e123-435c-a125-b662c577a86b\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm" Oct 12 05:52:41 crc kubenswrapper[4930]: I1012 05:52:41.002841 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/e54ad199-e123-435c-a125-b662c577a86b-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm\" (UID: \"e54ad199-e123-435c-a125-b662c577a86b\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm" Oct 12 05:52:41 crc kubenswrapper[4930]: I1012 05:52:41.002914 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e54ad199-e123-435c-a125-b662c577a86b-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm\" (UID: \"e54ad199-e123-435c-a125-b662c577a86b\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm" Oct 12 05:52:41 crc kubenswrapper[4930]: I1012 05:52:41.027644 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9kmf\" (UniqueName: \"kubernetes.io/projected/e54ad199-e123-435c-a125-b662c577a86b-kube-api-access-s9kmf\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm\" (UID: \"e54ad199-e123-435c-a125-b662c577a86b\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm" Oct 12 05:52:41 crc kubenswrapper[4930]: I1012 05:52:41.113128 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm" Oct 12 05:52:41 crc kubenswrapper[4930]: I1012 05:52:41.528516 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm"] Oct 12 05:52:42 crc kubenswrapper[4930]: I1012 05:52:42.403047 4930 generic.go:334] "Generic (PLEG): container finished" podID="e54ad199-e123-435c-a125-b662c577a86b" containerID="15720fba71a20956aaf020525bd696ca12ece4fdeec3b9edda8de6b0378fd3a0" exitCode=0 Oct 12 05:52:42 crc kubenswrapper[4930]: I1012 05:52:42.403093 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm" event={"ID":"e54ad199-e123-435c-a125-b662c577a86b","Type":"ContainerDied","Data":"15720fba71a20956aaf020525bd696ca12ece4fdeec3b9edda8de6b0378fd3a0"} Oct 12 05:52:42 crc kubenswrapper[4930]: I1012 05:52:42.403124 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm" event={"ID":"e54ad199-e123-435c-a125-b662c577a86b","Type":"ContainerStarted","Data":"3a2d7ff8b81e6ebeb0befd8bb7f2ba2e6e6b810ddefd0346dee3c35f3ca68c13"} Oct 12 05:52:44 crc kubenswrapper[4930]: I1012 05:52:44.432482 4930 generic.go:334] "Generic (PLEG): container finished" podID="e54ad199-e123-435c-a125-b662c577a86b" containerID="c34163aa5cd292c20ef5830cad216e650a5586f64b95016a8b754598019ffa10" exitCode=0 Oct 12 05:52:44 crc kubenswrapper[4930]: I1012 05:52:44.432583 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm" event={"ID":"e54ad199-e123-435c-a125-b662c577a86b","Type":"ContainerDied","Data":"c34163aa5cd292c20ef5830cad216e650a5586f64b95016a8b754598019ffa10"} Oct 12 05:52:45 crc kubenswrapper[4930]: I1012 05:52:45.445105 4930 generic.go:334] "Generic (PLEG): container finished" podID="e54ad199-e123-435c-a125-b662c577a86b" containerID="bca0bf444db22a246f1f1d2f815f20f787f4c7d22cc3aefc72c84e7b88ffc89e" exitCode=0 Oct 12 05:52:45 crc kubenswrapper[4930]: I1012 
Oct 12 05:52:46 crc kubenswrapper[4930]: I1012 05:52:46.868001 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm" Oct 12 05:52:46 crc kubenswrapper[4930]: I1012 05:52:46.989418 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e54ad199-e123-435c-a125-b662c577a86b-util\") pod \"e54ad199-e123-435c-a125-b662c577a86b\" (UID: \"e54ad199-e123-435c-a125-b662c577a86b\") " Oct 12 05:52:46 crc kubenswrapper[4930]: I1012 05:52:46.989917 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9kmf\" (UniqueName: \"kubernetes.io/projected/e54ad199-e123-435c-a125-b662c577a86b-kube-api-access-s9kmf\") pod \"e54ad199-e123-435c-a125-b662c577a86b\" (UID: \"e54ad199-e123-435c-a125-b662c577a86b\") " Oct 12 05:52:46 crc kubenswrapper[4930]: I1012 05:52:46.990035 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e54ad199-e123-435c-a125-b662c577a86b-bundle\") pod \"e54ad199-e123-435c-a125-b662c577a86b\" (UID: \"e54ad199-e123-435c-a125-b662c577a86b\") " Oct 12 05:52:46 crc kubenswrapper[4930]: I1012 05:52:46.990923 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e54ad199-e123-435c-a125-b662c577a86b-bundle" (OuterVolumeSpecName: "bundle") pod "e54ad199-e123-435c-a125-b662c577a86b" (UID: "e54ad199-e123-435c-a125-b662c577a86b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:52:46 crc kubenswrapper[4930]: I1012 05:52:46.995447 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e54ad199-e123-435c-a125-b662c577a86b-kube-api-access-s9kmf" (OuterVolumeSpecName: "kube-api-access-s9kmf") pod "e54ad199-e123-435c-a125-b662c577a86b" (UID: "e54ad199-e123-435c-a125-b662c577a86b"). InnerVolumeSpecName "kube-api-access-s9kmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:52:47 crc kubenswrapper[4930]: I1012 05:52:47.092128 4930 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e54ad199-e123-435c-a125-b662c577a86b-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:52:47 crc kubenswrapper[4930]: I1012 05:52:47.092183 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9kmf\" (UniqueName: \"kubernetes.io/projected/e54ad199-e123-435c-a125-b662c577a86b-kube-api-access-s9kmf\") on node \"crc\" DevicePath \"\"" Oct 12 05:52:47 crc kubenswrapper[4930]: I1012 05:52:47.148627 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e54ad199-e123-435c-a125-b662c577a86b-util" (OuterVolumeSpecName: "util") pod "e54ad199-e123-435c-a125-b662c577a86b" (UID: "e54ad199-e123-435c-a125-b662c577a86b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:52:47 crc kubenswrapper[4930]: I1012 05:52:47.193289 4930 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e54ad199-e123-435c-a125-b662c577a86b-util\") on node \"crc\" DevicePath \"\"" Oct 12 05:52:47 crc kubenswrapper[4930]: I1012 05:52:47.464581 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm" event={"ID":"e54ad199-e123-435c-a125-b662c577a86b","Type":"ContainerDied","Data":"3a2d7ff8b81e6ebeb0befd8bb7f2ba2e6e6b810ddefd0346dee3c35f3ca68c13"} Oct 12 05:52:47 crc kubenswrapper[4930]: I1012 05:52:47.464708 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a2d7ff8b81e6ebeb0befd8bb7f2ba2e6e6b810ddefd0346dee3c35f3ca68c13" Oct 12 05:52:47 crc kubenswrapper[4930]: I1012 05:52:47.464632 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.181314 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7ccf4dfd66-4nwxw"] Oct 12 05:53:01 crc kubenswrapper[4930]: E1012 05:53:01.181989 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e54ad199-e123-435c-a125-b662c577a86b" containerName="extract" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.182001 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="e54ad199-e123-435c-a125-b662c577a86b" containerName="extract" Oct 12 05:53:01 crc kubenswrapper[4930]: E1012 05:53:01.182012 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e54ad199-e123-435c-a125-b662c577a86b" containerName="util" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.182018 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="e54ad199-e123-435c-a125-b662c577a86b" containerName="util" Oct 12 05:53:01 crc kubenswrapper[4930]: E1012 05:53:01.182032 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e54ad199-e123-435c-a125-b662c577a86b" containerName="pull" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.182038 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="e54ad199-e123-435c-a125-b662c577a86b" containerName="pull" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.182138 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="e54ad199-e123-435c-a125-b662c577a86b" containerName="extract" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.182555 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7ccf4dfd66-4nwxw" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.184180 4930 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-t8r9q" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.184414 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.185137 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.185166 4930 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.185458 4930 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.203055 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7ccf4dfd66-4nwxw"] Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.286964 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ccb343f-6987-45c8-9b9f-a2bb32efbe22-apiservice-cert\") pod \"metallb-operator-controller-manager-7ccf4dfd66-4nwxw\" (UID: \"8ccb343f-6987-45c8-9b9f-a2bb32efbe22\") " pod="metallb-system/metallb-operator-controller-manager-7ccf4dfd66-4nwxw" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.287033 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xbkr\" (UniqueName: \"kubernetes.io/projected/8ccb343f-6987-45c8-9b9f-a2bb32efbe22-kube-api-access-6xbkr\") pod \"metallb-operator-controller-manager-7ccf4dfd66-4nwxw\" (UID: \"8ccb343f-6987-45c8-9b9f-a2bb32efbe22\") " pod="metallb-system/metallb-operator-controller-manager-7ccf4dfd66-4nwxw" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.287067 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ccb343f-6987-45c8-9b9f-a2bb32efbe22-webhook-cert\") pod \"metallb-operator-controller-manager-7ccf4dfd66-4nwxw\" (UID: \"8ccb343f-6987-45c8-9b9f-a2bb32efbe22\") " pod="metallb-system/metallb-operator-controller-manager-7ccf4dfd66-4nwxw" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.388794 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ccb343f-6987-45c8-9b9f-a2bb32efbe22-apiservice-cert\") pod \"metallb-operator-controller-manager-7ccf4dfd66-4nwxw\" (UID: \"8ccb343f-6987-45c8-9b9f-a2bb32efbe22\") " pod="metallb-system/metallb-operator-controller-manager-7ccf4dfd66-4nwxw" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.388844 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xbkr\" (UniqueName: \"kubernetes.io/projected/8ccb343f-6987-45c8-9b9f-a2bb32efbe22-kube-api-access-6xbkr\") pod \"metallb-operator-controller-manager-7ccf4dfd66-4nwxw\" (UID: \"8ccb343f-6987-45c8-9b9f-a2bb32efbe22\") " pod="metallb-system/metallb-operator-controller-manager-7ccf4dfd66-4nwxw" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.388875 
4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ccb343f-6987-45c8-9b9f-a2bb32efbe22-webhook-cert\") pod \"metallb-operator-controller-manager-7ccf4dfd66-4nwxw\" (UID: \"8ccb343f-6987-45c8-9b9f-a2bb32efbe22\") " pod="metallb-system/metallb-operator-controller-manager-7ccf4dfd66-4nwxw" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.412079 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ccb343f-6987-45c8-9b9f-a2bb32efbe22-apiservice-cert\") pod \"metallb-operator-controller-manager-7ccf4dfd66-4nwxw\" (UID: \"8ccb343f-6987-45c8-9b9f-a2bb32efbe22\") " pod="metallb-system/metallb-operator-controller-manager-7ccf4dfd66-4nwxw" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.412207 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ccb343f-6987-45c8-9b9f-a2bb32efbe22-webhook-cert\") pod \"metallb-operator-controller-manager-7ccf4dfd66-4nwxw\" (UID: \"8ccb343f-6987-45c8-9b9f-a2bb32efbe22\") " pod="metallb-system/metallb-operator-controller-manager-7ccf4dfd66-4nwxw" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.419829 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xbkr\" (UniqueName: \"kubernetes.io/projected/8ccb343f-6987-45c8-9b9f-a2bb32efbe22-kube-api-access-6xbkr\") pod \"metallb-operator-controller-manager-7ccf4dfd66-4nwxw\" (UID: \"8ccb343f-6987-45c8-9b9f-a2bb32efbe22\") " pod="metallb-system/metallb-operator-controller-manager-7ccf4dfd66-4nwxw" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.498778 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7ccf4dfd66-4nwxw" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.537295 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-688d4f8df8-dh759"] Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.537998 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-688d4f8df8-dh759" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.539820 4930 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.540017 4930 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-mbf7g" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.540642 4930 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.589657 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-688d4f8df8-dh759"] Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.695399 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/726977f0-7624-4ce2-95ed-80f844fcfc81-apiservice-cert\") pod \"metallb-operator-webhook-server-688d4f8df8-dh759\" (UID: \"726977f0-7624-4ce2-95ed-80f844fcfc81\") " pod="metallb-system/metallb-operator-webhook-server-688d4f8df8-dh759" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.695749 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/726977f0-7624-4ce2-95ed-80f844fcfc81-webhook-cert\") pod \"metallb-operator-webhook-server-688d4f8df8-dh759\" (UID: \"726977f0-7624-4ce2-95ed-80f844fcfc81\") " pod="metallb-system/metallb-operator-webhook-server-688d4f8df8-dh759" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.695777 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbxwl\" (UniqueName: \"kubernetes.io/projected/726977f0-7624-4ce2-95ed-80f844fcfc81-kube-api-access-bbxwl\") pod \"metallb-operator-webhook-server-688d4f8df8-dh759\" (UID: \"726977f0-7624-4ce2-95ed-80f844fcfc81\") " pod="metallb-system/metallb-operator-webhook-server-688d4f8df8-dh759" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.797313 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/726977f0-7624-4ce2-95ed-80f844fcfc81-apiservice-cert\") pod \"metallb-operator-webhook-server-688d4f8df8-dh759\" (UID: \"726977f0-7624-4ce2-95ed-80f844fcfc81\") " pod="metallb-system/metallb-operator-webhook-server-688d4f8df8-dh759" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.797357 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/726977f0-7624-4ce2-95ed-80f844fcfc81-webhook-cert\") pod \"metallb-operator-webhook-server-688d4f8df8-dh759\" (UID: \"726977f0-7624-4ce2-95ed-80f844fcfc81\") " pod="metallb-system/metallb-operator-webhook-server-688d4f8df8-dh759" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.797385 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbxwl\" (UniqueName: \"kubernetes.io/projected/726977f0-7624-4ce2-95ed-80f844fcfc81-kube-api-access-bbxwl\") pod \"metallb-operator-webhook-server-688d4f8df8-dh759\" (UID: \"726977f0-7624-4ce2-95ed-80f844fcfc81\") " pod="metallb-system/metallb-operator-webhook-server-688d4f8df8-dh759" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 
05:53:01.803460 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/726977f0-7624-4ce2-95ed-80f844fcfc81-webhook-cert\") pod \"metallb-operator-webhook-server-688d4f8df8-dh759\" (UID: \"726977f0-7624-4ce2-95ed-80f844fcfc81\") " pod="metallb-system/metallb-operator-webhook-server-688d4f8df8-dh759" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.803479 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/726977f0-7624-4ce2-95ed-80f844fcfc81-apiservice-cert\") pod \"metallb-operator-webhook-server-688d4f8df8-dh759\" (UID: \"726977f0-7624-4ce2-95ed-80f844fcfc81\") " pod="metallb-system/metallb-operator-webhook-server-688d4f8df8-dh759" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.817640 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbxwl\" (UniqueName: \"kubernetes.io/projected/726977f0-7624-4ce2-95ed-80f844fcfc81-kube-api-access-bbxwl\") pod \"metallb-operator-webhook-server-688d4f8df8-dh759\" (UID: \"726977f0-7624-4ce2-95ed-80f844fcfc81\") " pod="metallb-system/metallb-operator-webhook-server-688d4f8df8-dh759" Oct 12 05:53:01 crc kubenswrapper[4930]: I1012 05:53:01.871538 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-688d4f8df8-dh759" Oct 12 05:53:02 crc kubenswrapper[4930]: I1012 05:53:02.039033 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7ccf4dfd66-4nwxw"] Oct 12 05:53:02 crc kubenswrapper[4930]: W1012 05:53:02.049863 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ccb343f_6987_45c8_9b9f_a2bb32efbe22.slice/crio-0e89c08b8526386b3195555d26540054aad51b35a112d817c92bcd7a2eef7b8a WatchSource:0}: Error finding container 0e89c08b8526386b3195555d26540054aad51b35a112d817c92bcd7a2eef7b8a: Status 404 returned error can't find the container with id 0e89c08b8526386b3195555d26540054aad51b35a112d817c92bcd7a2eef7b8a Oct 12 05:53:02 crc kubenswrapper[4930]: I1012 05:53:02.090951 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-688d4f8df8-dh759"] Oct 12 05:53:02 crc kubenswrapper[4930]: W1012 05:53:02.097779 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod726977f0_7624_4ce2_95ed_80f844fcfc81.slice/crio-155221fae78b605176e28cb549c3955920380e38b7fdff2fc188e0156b774431 WatchSource:0}: Error finding container 155221fae78b605176e28cb549c3955920380e38b7fdff2fc188e0156b774431: Status 404 returned error can't find the container with id 155221fae78b605176e28cb549c3955920380e38b7fdff2fc188e0156b774431 Oct 12 05:53:02 crc kubenswrapper[4930]: I1012 05:53:02.637848 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-688d4f8df8-dh759" event={"ID":"726977f0-7624-4ce2-95ed-80f844fcfc81","Type":"ContainerStarted","Data":"155221fae78b605176e28cb549c3955920380e38b7fdff2fc188e0156b774431"} Oct 12 05:53:02 crc kubenswrapper[4930]: I1012 05:53:02.639522 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7ccf4dfd66-4nwxw" 
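event={"ID":"8ccb343f-6987-45c8-9b9f-a2bb32efbe22","Type":"ContainerStarted","Data":"0e89c08b8526386b3195555d26540054aad51b35a112d817c92bcd7a2eef7b8a"}

The two manager.go:1169 warnings above come from the kubelet's embedded cAdvisor losing a startup race: by the time the cgroup-creation watch event for a new crio-… slice is handled, the runtime has already won or lost the race, the container lookup returns 404, and the event is dropped. A schematic sketch of that "treat a vanished cgroup as benign" handling (illustrative function and the /sys/fs/cgroup prefix are assumptions; not cAdvisor's actual code):

// watchrace_sketch.go — skip watch events whose cgroup dir is already gone.
package main

import (
	"fmt"
	"os"
)

func handleWatchEvent(cgroupPath string) {
	if _, err := os.Stat(cgroupPath); err != nil {
		if os.IsNotExist(err) {
			// container vanished between event delivery and processing
			fmt.Println("container already gone, skipping event:", cgroupPath)
			return
		}
		fmt.Println("unexpected error:", err)
		return
	}
	fmt.Println("would start collecting stats for", cgroupPath)
}

func main() {
	handleWatchEvent("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/" +
		"kubepods-burstable-pod8ccb343f_6987_45c8_9b9f_a2bb32efbe22.slice/" +
		"crio-0e89c08b8526386b3195555d26540054aad51b35a112d817c92bcd7a2eef7b8a")
}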
event={"ID":"8ccb343f-6987-45c8-9b9f-a2bb32efbe22","Type":"ContainerStarted","Data":"0e89c08b8526386b3195555d26540054aad51b35a112d817c92bcd7a2eef7b8a"} Oct 12 05:53:03 crc kubenswrapper[4930]: I1012 05:53:03.668993 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 05:53:03 crc kubenswrapper[4930]: I1012 05:53:03.669056 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 05:53:08 crc kubenswrapper[4930]: I1012 05:53:08.693344 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7ccf4dfd66-4nwxw" event={"ID":"8ccb343f-6987-45c8-9b9f-a2bb32efbe22","Type":"ContainerStarted","Data":"43eb5834a9eb3e1218d35c478f4c64d3c6a01cb0381b79621bb4d55330ce63bd"} Oct 12 05:53:08 crc kubenswrapper[4930]: I1012 05:53:08.697576 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-688d4f8df8-dh759" event={"ID":"726977f0-7624-4ce2-95ed-80f844fcfc81","Type":"ContainerStarted","Data":"5a4e1213f2ab6a708fda6b984c77aec458702d2b5442634e0ce4625032f6afdf"} Oct 12 05:53:08 crc kubenswrapper[4930]: I1012 05:53:08.697954 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7ccf4dfd66-4nwxw" Oct 12 05:53:08 crc kubenswrapper[4930]: I1012 05:53:08.698241 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-688d4f8df8-dh759" Oct 12 05:53:08 crc kubenswrapper[4930]: I1012 05:53:08.730784 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7ccf4dfd66-4nwxw" podStartSLOduration=1.593630121 podStartE2EDuration="7.730707417s" podCreationTimestamp="2025-10-12 05:53:01 +0000 UTC" firstStartedPulling="2025-10-12 05:53:02.052056198 +0000 UTC m=+714.594157963" lastFinishedPulling="2025-10-12 05:53:08.189133494 +0000 UTC m=+720.731235259" observedRunningTime="2025-10-12 05:53:08.724125619 +0000 UTC m=+721.266227414" watchObservedRunningTime="2025-10-12 05:53:08.730707417 +0000 UTC m=+721.272809222" Oct 12 05:53:08 crc kubenswrapper[4930]: I1012 05:53:08.766070 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-688d4f8df8-dh759" podStartSLOduration=1.654008078 podStartE2EDuration="7.766044825s" podCreationTimestamp="2025-10-12 05:53:01 +0000 UTC" firstStartedPulling="2025-10-12 05:53:02.110940417 +0000 UTC m=+714.653042182" lastFinishedPulling="2025-10-12 05:53:08.222977164 +0000 UTC m=+720.765078929" observedRunningTime="2025-10-12 05:53:08.76271056 +0000 UTC m=+721.304812335" watchObservedRunningTime="2025-10-12 05:53:08.766044825 +0000 UTC m=+721.308146600" Oct 12 05:53:21 crc kubenswrapper[4930]: I1012 05:53:21.882378 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-688d4f8df8-dh759" Oct 12 05:53:33 crc kubenswrapper[4930]: I1012 
Oct 12 05:53:33 crc kubenswrapper[4930]: I1012 05:53:33.672679 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 05:53:34 crc kubenswrapper[4930]: I1012 05:53:34.436529 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-t48rv"] Oct 12 05:53:34 crc kubenswrapper[4930]: I1012 05:53:34.437053 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" podUID="79650663-cd46-4fef-a731-dfc45f8fa945" containerName="controller-manager" containerID="cri-o://3d98b32d90922bf1f0fceddbbbb4f16ff4395068f862d8f64d6a439e8301f149" gracePeriod=30 Oct 12 05:53:34 crc kubenswrapper[4930]: I1012 05:53:34.546281 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf"] Oct 12 05:53:34 crc kubenswrapper[4930]: I1012 05:53:34.546485 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf" podUID="cd39a673-35f1-435a-bd8d-02b253c12f1c" containerName="route-controller-manager" containerID="cri-o://35f261bf33cbdef5ee122b1903a8108c926f33285cb7e99169a83c41193f36c5" gracePeriod=30 Oct 12 05:53:34 crc kubenswrapper[4930]: I1012 05:53:34.886361 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" Oct 12 05:53:34 crc kubenswrapper[4930]: I1012 05:53:34.892873 4930 generic.go:334] "Generic (PLEG): container finished" podID="79650663-cd46-4fef-a731-dfc45f8fa945" containerID="3d98b32d90922bf1f0fceddbbbb4f16ff4395068f862d8f64d6a439e8301f149" exitCode=0 Oct 12 05:53:34 crc kubenswrapper[4930]: I1012 05:53:34.892935 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" event={"ID":"79650663-cd46-4fef-a731-dfc45f8fa945","Type":"ContainerDied","Data":"3d98b32d90922bf1f0fceddbbbb4f16ff4395068f862d8f64d6a439e8301f149"} Oct 12 05:53:34 crc kubenswrapper[4930]: I1012 05:53:34.892963 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" event={"ID":"79650663-cd46-4fef-a731-dfc45f8fa945","Type":"ContainerDied","Data":"9b08e544d6fac720cfae74ee3a8e55a6f2e2b819b4f01602a50cb1c4485cc0e2"} Oct 12 05:53:34 crc kubenswrapper[4930]: I1012 05:53:34.892979 4930 scope.go:117] "RemoveContainer" containerID="3d98b32d90922bf1f0fceddbbbb4f16ff4395068f862d8f64d6a439e8301f149" Oct 12 05:53:34 crc kubenswrapper[4930]: I1012 05:53:34.893067 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-t48rv" Oct 12 05:53:34 crc kubenswrapper[4930]: I1012 05:53:34.895934 4930 generic.go:334] "Generic (PLEG): container finished" podID="cd39a673-35f1-435a-bd8d-02b253c12f1c" containerID="35f261bf33cbdef5ee122b1903a8108c926f33285cb7e99169a83c41193f36c5" exitCode=0 Oct 12 05:53:34 crc kubenswrapper[4930]: I1012 05:53:34.895982 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf" event={"ID":"cd39a673-35f1-435a-bd8d-02b253c12f1c","Type":"ContainerDied","Data":"35f261bf33cbdef5ee122b1903a8108c926f33285cb7e99169a83c41193f36c5"} Oct 12 05:53:34 crc kubenswrapper[4930]: I1012 05:53:34.926109 4930 scope.go:117] "RemoveContainer" containerID="3d98b32d90922bf1f0fceddbbbb4f16ff4395068f862d8f64d6a439e8301f149" Oct 12 05:53:34 crc kubenswrapper[4930]: E1012 05:53:34.933845 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d98b32d90922bf1f0fceddbbbb4f16ff4395068f862d8f64d6a439e8301f149\": container with ID starting with 3d98b32d90922bf1f0fceddbbbb4f16ff4395068f862d8f64d6a439e8301f149 not found: ID does not exist" containerID="3d98b32d90922bf1f0fceddbbbb4f16ff4395068f862d8f64d6a439e8301f149" Oct 12 05:53:34 crc kubenswrapper[4930]: I1012 05:53:34.933902 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d98b32d90922bf1f0fceddbbbb4f16ff4395068f862d8f64d6a439e8301f149"} err="failed to get container status \"3d98b32d90922bf1f0fceddbbbb4f16ff4395068f862d8f64d6a439e8301f149\": rpc error: code = NotFound desc = could not find container \"3d98b32d90922bf1f0fceddbbbb4f16ff4395068f862d8f64d6a439e8301f149\": container with ID starting with 3d98b32d90922bf1f0fceddbbbb4f16ff4395068f862d8f64d6a439e8301f149 not found: ID does not exist" Oct 12 05:53:34 crc kubenswrapper[4930]: I1012 05:53:34.983472 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.001357 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79650663-cd46-4fef-a731-dfc45f8fa945-serving-cert\") pod \"79650663-cd46-4fef-a731-dfc45f8fa945\" (UID: \"79650663-cd46-4fef-a731-dfc45f8fa945\") " Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.001412 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q99cr\" (UniqueName: \"kubernetes.io/projected/79650663-cd46-4fef-a731-dfc45f8fa945-kube-api-access-q99cr\") pod \"79650663-cd46-4fef-a731-dfc45f8fa945\" (UID: \"79650663-cd46-4fef-a731-dfc45f8fa945\") " Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.001472 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/79650663-cd46-4fef-a731-dfc45f8fa945-proxy-ca-bundles\") pod \"79650663-cd46-4fef-a731-dfc45f8fa945\" (UID: \"79650663-cd46-4fef-a731-dfc45f8fa945\") " Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.001536 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79650663-cd46-4fef-a731-dfc45f8fa945-client-ca\") pod \"79650663-cd46-4fef-a731-dfc45f8fa945\" (UID: \"79650663-cd46-4fef-a731-dfc45f8fa945\") " Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.001622 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79650663-cd46-4fef-a731-dfc45f8fa945-config\") pod \"79650663-cd46-4fef-a731-dfc45f8fa945\" (UID: \"79650663-cd46-4fef-a731-dfc45f8fa945\") " Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.002293 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79650663-cd46-4fef-a731-dfc45f8fa945-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "79650663-cd46-4fef-a731-dfc45f8fa945" (UID: "79650663-cd46-4fef-a731-dfc45f8fa945"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.002564 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79650663-cd46-4fef-a731-dfc45f8fa945-client-ca" (OuterVolumeSpecName: "client-ca") pod "79650663-cd46-4fef-a731-dfc45f8fa945" (UID: "79650663-cd46-4fef-a731-dfc45f8fa945"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.002721 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79650663-cd46-4fef-a731-dfc45f8fa945-config" (OuterVolumeSpecName: "config") pod "79650663-cd46-4fef-a731-dfc45f8fa945" (UID: "79650663-cd46-4fef-a731-dfc45f8fa945"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.008519 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79650663-cd46-4fef-a731-dfc45f8fa945-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "79650663-cd46-4fef-a731-dfc45f8fa945" (UID: "79650663-cd46-4fef-a731-dfc45f8fa945"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.009561 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79650663-cd46-4fef-a731-dfc45f8fa945-kube-api-access-q99cr" (OuterVolumeSpecName: "kube-api-access-q99cr") pod "79650663-cd46-4fef-a731-dfc45f8fa945" (UID: "79650663-cd46-4fef-a731-dfc45f8fa945"). InnerVolumeSpecName "kube-api-access-q99cr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.102860 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd39a673-35f1-435a-bd8d-02b253c12f1c-config\") pod \"cd39a673-35f1-435a-bd8d-02b253c12f1c\" (UID: \"cd39a673-35f1-435a-bd8d-02b253c12f1c\") " Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.102969 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd39a673-35f1-435a-bd8d-02b253c12f1c-serving-cert\") pod \"cd39a673-35f1-435a-bd8d-02b253c12f1c\" (UID: \"cd39a673-35f1-435a-bd8d-02b253c12f1c\") " Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.103034 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd39a673-35f1-435a-bd8d-02b253c12f1c-client-ca\") pod \"cd39a673-35f1-435a-bd8d-02b253c12f1c\" (UID: \"cd39a673-35f1-435a-bd8d-02b253c12f1c\") " Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.103168 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mhct\" (UniqueName: \"kubernetes.io/projected/cd39a673-35f1-435a-bd8d-02b253c12f1c-kube-api-access-8mhct\") pod \"cd39a673-35f1-435a-bd8d-02b253c12f1c\" (UID: \"cd39a673-35f1-435a-bd8d-02b253c12f1c\") " Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.103637 4930 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79650663-cd46-4fef-a731-dfc45f8fa945-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.103672 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q99cr\" (UniqueName: \"kubernetes.io/projected/79650663-cd46-4fef-a731-dfc45f8fa945-kube-api-access-q99cr\") on node \"crc\" DevicePath \"\"" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.103694 4930 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/79650663-cd46-4fef-a731-dfc45f8fa945-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.103719 4930 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79650663-cd46-4fef-a731-dfc45f8fa945-client-ca\") on node \"crc\" DevicePath \"\"" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.103797 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79650663-cd46-4fef-a731-dfc45f8fa945-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.104011 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd39a673-35f1-435a-bd8d-02b253c12f1c-client-ca" (OuterVolumeSpecName: "client-ca") pod "cd39a673-35f1-435a-bd8d-02b253c12f1c" (UID: 
"cd39a673-35f1-435a-bd8d-02b253c12f1c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.104023 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd39a673-35f1-435a-bd8d-02b253c12f1c-config" (OuterVolumeSpecName: "config") pod "cd39a673-35f1-435a-bd8d-02b253c12f1c" (UID: "cd39a673-35f1-435a-bd8d-02b253c12f1c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.106270 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd39a673-35f1-435a-bd8d-02b253c12f1c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cd39a673-35f1-435a-bd8d-02b253c12f1c" (UID: "cd39a673-35f1-435a-bd8d-02b253c12f1c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.107447 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd39a673-35f1-435a-bd8d-02b253c12f1c-kube-api-access-8mhct" (OuterVolumeSpecName: "kube-api-access-8mhct") pod "cd39a673-35f1-435a-bd8d-02b253c12f1c" (UID: "cd39a673-35f1-435a-bd8d-02b253c12f1c"). InnerVolumeSpecName "kube-api-access-8mhct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.204894 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd39a673-35f1-435a-bd8d-02b253c12f1c-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.204924 4930 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd39a673-35f1-435a-bd8d-02b253c12f1c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.204935 4930 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd39a673-35f1-435a-bd8d-02b253c12f1c-client-ca\") on node \"crc\" DevicePath \"\"" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.204945 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mhct\" (UniqueName: \"kubernetes.io/projected/cd39a673-35f1-435a-bd8d-02b253c12f1c-kube-api-access-8mhct\") on node \"crc\" DevicePath \"\"" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.222355 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-t48rv"] Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.226495 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-t48rv"] Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.827859 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp"] Oct 12 05:53:35 crc kubenswrapper[4930]: E1012 05:53:35.828200 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd39a673-35f1-435a-bd8d-02b253c12f1c" containerName="route-controller-manager" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.828217 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd39a673-35f1-435a-bd8d-02b253c12f1c" containerName="route-controller-manager" Oct 12 05:53:35 crc kubenswrapper[4930]: E1012 05:53:35.828230 4930 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79650663-cd46-4fef-a731-dfc45f8fa945" containerName="controller-manager" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.828240 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="79650663-cd46-4fef-a731-dfc45f8fa945" containerName="controller-manager" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.828372 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="79650663-cd46-4fef-a731-dfc45f8fa945" containerName="controller-manager" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.828397 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd39a673-35f1-435a-bd8d-02b253c12f1c" containerName="route-controller-manager" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.828906 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.831651 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.847202 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5"] Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.847461 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.847556 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.847856 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.847919 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.848343 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.850026 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.853117 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp"] Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.864332 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.884195 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5"] Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.903313 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf" event={"ID":"cd39a673-35f1-435a-bd8d-02b253c12f1c","Type":"ContainerDied","Data":"1ce60dbc9d6af70d2963d52a196ea75759088f4f30b448cffcf59a7b9483a792"} Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.903372 4930 scope.go:117] "RemoveContainer" containerID="35f261bf33cbdef5ee122b1903a8108c926f33285cb7e99169a83c41193f36c5" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.903793 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf" Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.941214 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf"] Oct 12 05:53:35 crc kubenswrapper[4930]: I1012 05:53:35.944881 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-cqctf"] Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.016658 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb52t\" (UniqueName: \"kubernetes.io/projected/820e7959-68b2-4878-b985-0b64680d143c-kube-api-access-sb52t\") pod \"controller-manager-676bd8bf7c-s6bkp\" (UID: \"820e7959-68b2-4878-b985-0b64680d143c\") " pod="openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.016728 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/820e7959-68b2-4878-b985-0b64680d143c-proxy-ca-bundles\") pod \"controller-manager-676bd8bf7c-s6bkp\" (UID: \"820e7959-68b2-4878-b985-0b64680d143c\") " pod="openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.016772 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/820e7959-68b2-4878-b985-0b64680d143c-client-ca\") pod \"controller-manager-676bd8bf7c-s6bkp\" (UID: \"820e7959-68b2-4878-b985-0b64680d143c\") " pod="openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.016824 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlsrt\" (UniqueName: 
\"kubernetes.io/projected/66cab3cb-247c-426e-b3f5-ed62fd132596-kube-api-access-qlsrt\") pod \"route-controller-manager-59bdf459cf-2hhq5\" (UID: \"66cab3cb-247c-426e-b3f5-ed62fd132596\") " pod="openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.016843 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66cab3cb-247c-426e-b3f5-ed62fd132596-config\") pod \"route-controller-manager-59bdf459cf-2hhq5\" (UID: \"66cab3cb-247c-426e-b3f5-ed62fd132596\") " pod="openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.016863 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/820e7959-68b2-4878-b985-0b64680d143c-config\") pod \"controller-manager-676bd8bf7c-s6bkp\" (UID: \"820e7959-68b2-4878-b985-0b64680d143c\") " pod="openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.016885 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/820e7959-68b2-4878-b985-0b64680d143c-serving-cert\") pod \"controller-manager-676bd8bf7c-s6bkp\" (UID: \"820e7959-68b2-4878-b985-0b64680d143c\") " pod="openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.016902 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66cab3cb-247c-426e-b3f5-ed62fd132596-serving-cert\") pod \"route-controller-manager-59bdf459cf-2hhq5\" (UID: \"66cab3cb-247c-426e-b3f5-ed62fd132596\") " pod="openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.016939 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/66cab3cb-247c-426e-b3f5-ed62fd132596-client-ca\") pod \"route-controller-manager-59bdf459cf-2hhq5\" (UID: \"66cab3cb-247c-426e-b3f5-ed62fd132596\") " pod="openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.119650 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/820e7959-68b2-4878-b985-0b64680d143c-client-ca\") pod \"controller-manager-676bd8bf7c-s6bkp\" (UID: \"820e7959-68b2-4878-b985-0b64680d143c\") " pod="openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.120101 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlsrt\" (UniqueName: \"kubernetes.io/projected/66cab3cb-247c-426e-b3f5-ed62fd132596-kube-api-access-qlsrt\") pod \"route-controller-manager-59bdf459cf-2hhq5\" (UID: \"66cab3cb-247c-426e-b3f5-ed62fd132596\") " pod="openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.120146 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/66cab3cb-247c-426e-b3f5-ed62fd132596-config\") pod \"route-controller-manager-59bdf459cf-2hhq5\" (UID: \"66cab3cb-247c-426e-b3f5-ed62fd132596\") " pod="openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.120181 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/820e7959-68b2-4878-b985-0b64680d143c-config\") pod \"controller-manager-676bd8bf7c-s6bkp\" (UID: \"820e7959-68b2-4878-b985-0b64680d143c\") " pod="openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.120257 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/820e7959-68b2-4878-b985-0b64680d143c-serving-cert\") pod \"controller-manager-676bd8bf7c-s6bkp\" (UID: \"820e7959-68b2-4878-b985-0b64680d143c\") " pod="openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.120293 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66cab3cb-247c-426e-b3f5-ed62fd132596-serving-cert\") pod \"route-controller-manager-59bdf459cf-2hhq5\" (UID: \"66cab3cb-247c-426e-b3f5-ed62fd132596\") " pod="openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.120369 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/66cab3cb-247c-426e-b3f5-ed62fd132596-client-ca\") pod \"route-controller-manager-59bdf459cf-2hhq5\" (UID: \"66cab3cb-247c-426e-b3f5-ed62fd132596\") " pod="openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.120430 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb52t\" (UniqueName: \"kubernetes.io/projected/820e7959-68b2-4878-b985-0b64680d143c-kube-api-access-sb52t\") pod \"controller-manager-676bd8bf7c-s6bkp\" (UID: \"820e7959-68b2-4878-b985-0b64680d143c\") " pod="openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.120486 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/820e7959-68b2-4878-b985-0b64680d143c-proxy-ca-bundles\") pod \"controller-manager-676bd8bf7c-s6bkp\" (UID: \"820e7959-68b2-4878-b985-0b64680d143c\") " pod="openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.122582 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/820e7959-68b2-4878-b985-0b64680d143c-proxy-ca-bundles\") pod \"controller-manager-676bd8bf7c-s6bkp\" (UID: \"820e7959-68b2-4878-b985-0b64680d143c\") " pod="openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.122912 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/820e7959-68b2-4878-b985-0b64680d143c-client-ca\") pod \"controller-manager-676bd8bf7c-s6bkp\" (UID: \"820e7959-68b2-4878-b985-0b64680d143c\") " 
pod="openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.123473 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/820e7959-68b2-4878-b985-0b64680d143c-config\") pod \"controller-manager-676bd8bf7c-s6bkp\" (UID: \"820e7959-68b2-4878-b985-0b64680d143c\") " pod="openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.124023 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66cab3cb-247c-426e-b3f5-ed62fd132596-config\") pod \"route-controller-manager-59bdf459cf-2hhq5\" (UID: \"66cab3cb-247c-426e-b3f5-ed62fd132596\") " pod="openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.124432 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/66cab3cb-247c-426e-b3f5-ed62fd132596-client-ca\") pod \"route-controller-manager-59bdf459cf-2hhq5\" (UID: \"66cab3cb-247c-426e-b3f5-ed62fd132596\") " pod="openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.125768 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66cab3cb-247c-426e-b3f5-ed62fd132596-serving-cert\") pod \"route-controller-manager-59bdf459cf-2hhq5\" (UID: \"66cab3cb-247c-426e-b3f5-ed62fd132596\") " pod="openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.129638 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/820e7959-68b2-4878-b985-0b64680d143c-serving-cert\") pod \"controller-manager-676bd8bf7c-s6bkp\" (UID: \"820e7959-68b2-4878-b985-0b64680d143c\") " pod="openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.140832 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlsrt\" (UniqueName: \"kubernetes.io/projected/66cab3cb-247c-426e-b3f5-ed62fd132596-kube-api-access-qlsrt\") pod \"route-controller-manager-59bdf459cf-2hhq5\" (UID: \"66cab3cb-247c-426e-b3f5-ed62fd132596\") " pod="openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.141014 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb52t\" (UniqueName: \"kubernetes.io/projected/820e7959-68b2-4878-b985-0b64680d143c-kube-api-access-sb52t\") pod \"controller-manager-676bd8bf7c-s6bkp\" (UID: \"820e7959-68b2-4878-b985-0b64680d143c\") " pod="openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.149970 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79650663-cd46-4fef-a731-dfc45f8fa945" path="/var/lib/kubelet/pods/79650663-cd46-4fef-a731-dfc45f8fa945/volumes" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.151687 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd39a673-35f1-435a-bd8d-02b253c12f1c" path="/var/lib/kubelet/pods/cd39a673-35f1-435a-bd8d-02b253c12f1c/volumes" Oct 12 05:53:36 crc kubenswrapper[4930]: 
I1012 05:53:36.152656 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.173198 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.413838 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5"] Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.668897 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5"] Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.682064 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp"] Oct 12 05:53:36 crc kubenswrapper[4930]: W1012 05:53:36.696602 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod820e7959_68b2_4878_b985_0b64680d143c.slice/crio-6632f3661f2dbe902f00fc72ddee528b8a04020cc6792af9b92b71d9bb32a4ea WatchSource:0}: Error finding container 6632f3661f2dbe902f00fc72ddee528b8a04020cc6792af9b92b71d9bb32a4ea: Status 404 returned error can't find the container with id 6632f3661f2dbe902f00fc72ddee528b8a04020cc6792af9b92b71d9bb32a4ea Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.916135 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5" event={"ID":"66cab3cb-247c-426e-b3f5-ed62fd132596","Type":"ContainerStarted","Data":"62b8b97946544b7ded6d4631061bc3940433bccf4ab2a63ada626a3f1366f383"} Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.916190 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5" event={"ID":"66cab3cb-247c-426e-b3f5-ed62fd132596","Type":"ContainerStarted","Data":"19237a1bb40af1cf3a2769a7a9cb4e5fbabd6abad8e0e20d56a6799ebd602ae5"} Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.916342 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5" podUID="66cab3cb-247c-426e-b3f5-ed62fd132596" containerName="route-controller-manager" containerID="cri-o://62b8b97946544b7ded6d4631061bc3940433bccf4ab2a63ada626a3f1366f383" gracePeriod=30 Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.916376 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.917904 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp" event={"ID":"820e7959-68b2-4878-b985-0b64680d143c","Type":"ContainerStarted","Data":"2287c7d394fa7ba523e9d46ba1ee1905331f911fe670e33833d765049f49afd0"} Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.917931 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp" event={"ID":"820e7959-68b2-4878-b985-0b64680d143c","Type":"ContainerStarted","Data":"6632f3661f2dbe902f00fc72ddee528b8a04020cc6792af9b92b71d9bb32a4ea"} Oct 12 05:53:36 
crc kubenswrapper[4930]: I1012 05:53:36.918139 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.919991 4930 patch_prober.go:28] interesting pod/controller-manager-676bd8bf7c-s6bkp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" start-of-body= Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.920035 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp" podUID="820e7959-68b2-4878-b985-0b64680d143c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.948199 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5" podStartSLOduration=2.948182051 podStartE2EDuration="2.948182051s" podCreationTimestamp="2025-10-12 05:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:53:36.945697528 +0000 UTC m=+749.487799303" watchObservedRunningTime="2025-10-12 05:53:36.948182051 +0000 UTC m=+749.490283816" Oct 12 05:53:36 crc kubenswrapper[4930]: I1012 05:53:36.971175 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp" podStartSLOduration=2.971149825 podStartE2EDuration="2.971149825s" podCreationTimestamp="2025-10-12 05:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:53:36.966495176 +0000 UTC m=+749.508596941" watchObservedRunningTime="2025-10-12 05:53:36.971149825 +0000 UTC m=+749.513251590" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.047254 4930 patch_prober.go:28] interesting pod/route-controller-manager-59bdf459cf-2hhq5 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": read tcp 10.217.0.2:40654->10.217.0.67:8443: read: connection reset by peer" start-of-body= Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.047320 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5" podUID="66cab3cb-247c-426e-b3f5-ed62fd132596" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": read tcp 10.217.0.2:40654->10.217.0.67:8443: read: connection reset by peer" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.456085 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-59bdf459cf-2hhq5_66cab3cb-247c-426e-b3f5-ed62fd132596/route-controller-manager/0.log" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.456575 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.501819 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d6dcb7655-gxvnn"] Oct 12 05:53:37 crc kubenswrapper[4930]: E1012 05:53:37.502230 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66cab3cb-247c-426e-b3f5-ed62fd132596" containerName="route-controller-manager" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.502253 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="66cab3cb-247c-426e-b3f5-ed62fd132596" containerName="route-controller-manager" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.502407 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="66cab3cb-247c-426e-b3f5-ed62fd132596" containerName="route-controller-manager" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.503614 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d6dcb7655-gxvnn" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.509489 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d6dcb7655-gxvnn"] Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.640569 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66cab3cb-247c-426e-b3f5-ed62fd132596-config\") pod \"66cab3cb-247c-426e-b3f5-ed62fd132596\" (UID: \"66cab3cb-247c-426e-b3f5-ed62fd132596\") " Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.640654 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66cab3cb-247c-426e-b3f5-ed62fd132596-serving-cert\") pod \"66cab3cb-247c-426e-b3f5-ed62fd132596\" (UID: \"66cab3cb-247c-426e-b3f5-ed62fd132596\") " Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.640693 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/66cab3cb-247c-426e-b3f5-ed62fd132596-client-ca\") pod \"66cab3cb-247c-426e-b3f5-ed62fd132596\" (UID: \"66cab3cb-247c-426e-b3f5-ed62fd132596\") " Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.640806 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlsrt\" (UniqueName: \"kubernetes.io/projected/66cab3cb-247c-426e-b3f5-ed62fd132596-kube-api-access-qlsrt\") pod \"66cab3cb-247c-426e-b3f5-ed62fd132596\" (UID: \"66cab3cb-247c-426e-b3f5-ed62fd132596\") " Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.641039 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v52b5\" (UniqueName: \"kubernetes.io/projected/db118d84-2d96-4f6f-8ccf-8b4fb89928a1-kube-api-access-v52b5\") pod \"route-controller-manager-6d6dcb7655-gxvnn\" (UID: \"db118d84-2d96-4f6f-8ccf-8b4fb89928a1\") " pod="openshift-route-controller-manager/route-controller-manager-6d6dcb7655-gxvnn" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.641119 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db118d84-2d96-4f6f-8ccf-8b4fb89928a1-serving-cert\") pod \"route-controller-manager-6d6dcb7655-gxvnn\" (UID: 
\"db118d84-2d96-4f6f-8ccf-8b4fb89928a1\") " pod="openshift-route-controller-manager/route-controller-manager-6d6dcb7655-gxvnn" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.641164 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db118d84-2d96-4f6f-8ccf-8b4fb89928a1-config\") pod \"route-controller-manager-6d6dcb7655-gxvnn\" (UID: \"db118d84-2d96-4f6f-8ccf-8b4fb89928a1\") " pod="openshift-route-controller-manager/route-controller-manager-6d6dcb7655-gxvnn" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.641252 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db118d84-2d96-4f6f-8ccf-8b4fb89928a1-client-ca\") pod \"route-controller-manager-6d6dcb7655-gxvnn\" (UID: \"db118d84-2d96-4f6f-8ccf-8b4fb89928a1\") " pod="openshift-route-controller-manager/route-controller-manager-6d6dcb7655-gxvnn" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.641767 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66cab3cb-247c-426e-b3f5-ed62fd132596-config" (OuterVolumeSpecName: "config") pod "66cab3cb-247c-426e-b3f5-ed62fd132596" (UID: "66cab3cb-247c-426e-b3f5-ed62fd132596"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.641810 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66cab3cb-247c-426e-b3f5-ed62fd132596-client-ca" (OuterVolumeSpecName: "client-ca") pod "66cab3cb-247c-426e-b3f5-ed62fd132596" (UID: "66cab3cb-247c-426e-b3f5-ed62fd132596"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.647573 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66cab3cb-247c-426e-b3f5-ed62fd132596-kube-api-access-qlsrt" (OuterVolumeSpecName: "kube-api-access-qlsrt") pod "66cab3cb-247c-426e-b3f5-ed62fd132596" (UID: "66cab3cb-247c-426e-b3f5-ed62fd132596"). InnerVolumeSpecName "kube-api-access-qlsrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.647582 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66cab3cb-247c-426e-b3f5-ed62fd132596-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "66cab3cb-247c-426e-b3f5-ed62fd132596" (UID: "66cab3cb-247c-426e-b3f5-ed62fd132596"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.742620 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db118d84-2d96-4f6f-8ccf-8b4fb89928a1-serving-cert\") pod \"route-controller-manager-6d6dcb7655-gxvnn\" (UID: \"db118d84-2d96-4f6f-8ccf-8b4fb89928a1\") " pod="openshift-route-controller-manager/route-controller-manager-6d6dcb7655-gxvnn" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.742721 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db118d84-2d96-4f6f-8ccf-8b4fb89928a1-config\") pod \"route-controller-manager-6d6dcb7655-gxvnn\" (UID: \"db118d84-2d96-4f6f-8ccf-8b4fb89928a1\") " pod="openshift-route-controller-manager/route-controller-manager-6d6dcb7655-gxvnn" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.742830 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db118d84-2d96-4f6f-8ccf-8b4fb89928a1-client-ca\") pod \"route-controller-manager-6d6dcb7655-gxvnn\" (UID: \"db118d84-2d96-4f6f-8ccf-8b4fb89928a1\") " pod="openshift-route-controller-manager/route-controller-manager-6d6dcb7655-gxvnn" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.742910 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v52b5\" (UniqueName: \"kubernetes.io/projected/db118d84-2d96-4f6f-8ccf-8b4fb89928a1-kube-api-access-v52b5\") pod \"route-controller-manager-6d6dcb7655-gxvnn\" (UID: \"db118d84-2d96-4f6f-8ccf-8b4fb89928a1\") " pod="openshift-route-controller-manager/route-controller-manager-6d6dcb7655-gxvnn" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.742988 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66cab3cb-247c-426e-b3f5-ed62fd132596-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.743009 4930 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66cab3cb-247c-426e-b3f5-ed62fd132596-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.743027 4930 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/66cab3cb-247c-426e-b3f5-ed62fd132596-client-ca\") on node \"crc\" DevicePath \"\"" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.743042 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlsrt\" (UniqueName: \"kubernetes.io/projected/66cab3cb-247c-426e-b3f5-ed62fd132596-kube-api-access-qlsrt\") on node \"crc\" DevicePath \"\"" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.744250 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db118d84-2d96-4f6f-8ccf-8b4fb89928a1-client-ca\") pod \"route-controller-manager-6d6dcb7655-gxvnn\" (UID: \"db118d84-2d96-4f6f-8ccf-8b4fb89928a1\") " pod="openshift-route-controller-manager/route-controller-manager-6d6dcb7655-gxvnn" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.744954 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db118d84-2d96-4f6f-8ccf-8b4fb89928a1-config\") pod \"route-controller-manager-6d6dcb7655-gxvnn\" 
(UID: \"db118d84-2d96-4f6f-8ccf-8b4fb89928a1\") " pod="openshift-route-controller-manager/route-controller-manager-6d6dcb7655-gxvnn" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.747115 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db118d84-2d96-4f6f-8ccf-8b4fb89928a1-serving-cert\") pod \"route-controller-manager-6d6dcb7655-gxvnn\" (UID: \"db118d84-2d96-4f6f-8ccf-8b4fb89928a1\") " pod="openshift-route-controller-manager/route-controller-manager-6d6dcb7655-gxvnn" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.783053 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v52b5\" (UniqueName: \"kubernetes.io/projected/db118d84-2d96-4f6f-8ccf-8b4fb89928a1-kube-api-access-v52b5\") pod \"route-controller-manager-6d6dcb7655-gxvnn\" (UID: \"db118d84-2d96-4f6f-8ccf-8b4fb89928a1\") " pod="openshift-route-controller-manager/route-controller-manager-6d6dcb7655-gxvnn" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.832268 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d6dcb7655-gxvnn" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.931493 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-59bdf459cf-2hhq5_66cab3cb-247c-426e-b3f5-ed62fd132596/route-controller-manager/0.log" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.931865 4930 generic.go:334] "Generic (PLEG): container finished" podID="66cab3cb-247c-426e-b3f5-ed62fd132596" containerID="62b8b97946544b7ded6d4631061bc3940433bccf4ab2a63ada626a3f1366f383" exitCode=255 Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.932204 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.933020 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5" event={"ID":"66cab3cb-247c-426e-b3f5-ed62fd132596","Type":"ContainerDied","Data":"62b8b97946544b7ded6d4631061bc3940433bccf4ab2a63ada626a3f1366f383"} Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.933053 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5" event={"ID":"66cab3cb-247c-426e-b3f5-ed62fd132596","Type":"ContainerDied","Data":"19237a1bb40af1cf3a2769a7a9cb4e5fbabd6abad8e0e20d56a6799ebd602ae5"} Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.933072 4930 scope.go:117] "RemoveContainer" containerID="62b8b97946544b7ded6d4631061bc3940433bccf4ab2a63ada626a3f1366f383" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.941511 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-676bd8bf7c-s6bkp" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.966916 4930 scope.go:117] "RemoveContainer" containerID="62b8b97946544b7ded6d4631061bc3940433bccf4ab2a63ada626a3f1366f383" Oct 12 05:53:37 crc kubenswrapper[4930]: E1012 05:53:37.968176 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62b8b97946544b7ded6d4631061bc3940433bccf4ab2a63ada626a3f1366f383\": container with ID starting with 62b8b97946544b7ded6d4631061bc3940433bccf4ab2a63ada626a3f1366f383 not found: ID does not exist" containerID="62b8b97946544b7ded6d4631061bc3940433bccf4ab2a63ada626a3f1366f383" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.968213 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62b8b97946544b7ded6d4631061bc3940433bccf4ab2a63ada626a3f1366f383"} err="failed to get container status \"62b8b97946544b7ded6d4631061bc3940433bccf4ab2a63ada626a3f1366f383\": rpc error: code = NotFound desc = could not find container \"62b8b97946544b7ded6d4631061bc3940433bccf4ab2a63ada626a3f1366f383\": container with ID starting with 62b8b97946544b7ded6d4631061bc3940433bccf4ab2a63ada626a3f1366f383 not found: ID does not exist" Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.995260 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5"] Oct 12 05:53:37 crc kubenswrapper[4930]: I1012 05:53:37.999665 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59bdf459cf-2hhq5"] Oct 12 05:53:38 crc kubenswrapper[4930]: I1012 05:53:38.066492 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d6dcb7655-gxvnn"] Oct 12 05:53:38 crc kubenswrapper[4930]: I1012 05:53:38.151851 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66cab3cb-247c-426e-b3f5-ed62fd132596" path="/var/lib/kubelet/pods/66cab3cb-247c-426e-b3f5-ed62fd132596/volumes" Oct 12 05:53:38 crc kubenswrapper[4930]: I1012 05:53:38.943247 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d6dcb7655-gxvnn" 
event={"ID":"db118d84-2d96-4f6f-8ccf-8b4fb89928a1","Type":"ContainerStarted","Data":"36d62a40a6b33d90dbd5d69325deb42c4799c40c04a694681585652ed97b66e0"} Oct 12 05:53:38 crc kubenswrapper[4930]: I1012 05:53:38.943872 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d6dcb7655-gxvnn" Oct 12 05:53:38 crc kubenswrapper[4930]: I1012 05:53:38.943894 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d6dcb7655-gxvnn" event={"ID":"db118d84-2d96-4f6f-8ccf-8b4fb89928a1","Type":"ContainerStarted","Data":"2aaeb226cf63c556f1381da2759ff6e021e4403328aa8d47da38d72b4ef1793c"} Oct 12 05:53:38 crc kubenswrapper[4930]: I1012 05:53:38.949084 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d6dcb7655-gxvnn" Oct 12 05:53:38 crc kubenswrapper[4930]: I1012 05:53:38.966573 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d6dcb7655-gxvnn" podStartSLOduration=2.966553822 podStartE2EDuration="2.966553822s" podCreationTimestamp="2025-10-12 05:53:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:53:38.963195787 +0000 UTC m=+751.505297542" watchObservedRunningTime="2025-10-12 05:53:38.966553822 +0000 UTC m=+751.508655587" Oct 12 05:53:41 crc kubenswrapper[4930]: I1012 05:53:41.503098 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7ccf4dfd66-4nwxw" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.451473 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-qv5cn"] Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.453964 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-qv5cn" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.457395 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.458654 4930 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.459044 4930 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-8b4sh" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.477162 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-b5w72"] Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.478164 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b5w72" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.482672 4930 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.505604 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-b5w72"] Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.568987 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-dd89b"] Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.569851 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-dd89b" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.575038 4930 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.575080 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.575104 4930 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.575088 4930 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-hq5b2" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.581789 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-7sgrd"] Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.582863 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-7sgrd" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.586438 4930 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.594313 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-7sgrd"] Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.610539 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9aa3dfb-870d-4246-96a2-4c52c470239f-metrics-certs\") pod \"frr-k8s-qv5cn\" (UID: \"c9aa3dfb-870d-4246-96a2-4c52c470239f\") " pod="metallb-system/frr-k8s-qv5cn" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.610600 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c9aa3dfb-870d-4246-96a2-4c52c470239f-metrics\") pod \"frr-k8s-qv5cn\" (UID: \"c9aa3dfb-870d-4246-96a2-4c52c470239f\") " pod="metallb-system/frr-k8s-qv5cn" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.610643 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/610c85e1-ab61-4938-a33d-cbed5a873874-cert\") pod \"frr-k8s-webhook-server-64bf5d555-b5w72\" (UID: \"610c85e1-ab61-4938-a33d-cbed5a873874\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b5w72" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.610725 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/c9aa3dfb-870d-4246-96a2-4c52c470239f-frr-startup\") pod \"frr-k8s-qv5cn\" (UID: \"c9aa3dfb-870d-4246-96a2-4c52c470239f\") " pod="metallb-system/frr-k8s-qv5cn" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.610802 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q6jz\" (UniqueName: \"kubernetes.io/projected/610c85e1-ab61-4938-a33d-cbed5a873874-kube-api-access-9q6jz\") pod \"frr-k8s-webhook-server-64bf5d555-b5w72\" (UID: \"610c85e1-ab61-4938-a33d-cbed5a873874\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b5w72" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.610849 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c9aa3dfb-870d-4246-96a2-4c52c470239f-reloader\") pod \"frr-k8s-qv5cn\" (UID: \"c9aa3dfb-870d-4246-96a2-4c52c470239f\") " pod="metallb-system/frr-k8s-qv5cn" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.611042 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c9aa3dfb-870d-4246-96a2-4c52c470239f-frr-conf\") pod \"frr-k8s-qv5cn\" (UID: \"c9aa3dfb-870d-4246-96a2-4c52c470239f\") " pod="metallb-system/frr-k8s-qv5cn" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.611154 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spgfz\" (UniqueName: \"kubernetes.io/projected/c9aa3dfb-870d-4246-96a2-4c52c470239f-kube-api-access-spgfz\") pod \"frr-k8s-qv5cn\" (UID: \"c9aa3dfb-870d-4246-96a2-4c52c470239f\") " pod="metallb-system/frr-k8s-qv5cn" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.611207 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c9aa3dfb-870d-4246-96a2-4c52c470239f-frr-sockets\") pod \"frr-k8s-qv5cn\" (UID: \"c9aa3dfb-870d-4246-96a2-4c52c470239f\") " pod="metallb-system/frr-k8s-qv5cn" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.712536 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q6jz\" (UniqueName: \"kubernetes.io/projected/610c85e1-ab61-4938-a33d-cbed5a873874-kube-api-access-9q6jz\") pod \"frr-k8s-webhook-server-64bf5d555-b5w72\" (UID: \"610c85e1-ab61-4938-a33d-cbed5a873874\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b5w72" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.712658 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/908d6f68-9800-4794-88a0-21cca2bb3691-cert\") pod \"controller-68d546b9d8-7sgrd\" (UID: \"908d6f68-9800-4794-88a0-21cca2bb3691\") " pod="metallb-system/controller-68d546b9d8-7sgrd" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.712683 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c9aa3dfb-870d-4246-96a2-4c52c470239f-reloader\") pod \"frr-k8s-qv5cn\" (UID: \"c9aa3dfb-870d-4246-96a2-4c52c470239f\") " pod="metallb-system/frr-k8s-qv5cn" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.712705 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/c9aa3dfb-870d-4246-96a2-4c52c470239f-frr-conf\") pod \"frr-k8s-qv5cn\" (UID: \"c9aa3dfb-870d-4246-96a2-4c52c470239f\") " pod="metallb-system/frr-k8s-qv5cn" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.712723 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spgfz\" (UniqueName: \"kubernetes.io/projected/c9aa3dfb-870d-4246-96a2-4c52c470239f-kube-api-access-spgfz\") pod \"frr-k8s-qv5cn\" (UID: \"c9aa3dfb-870d-4246-96a2-4c52c470239f\") " pod="metallb-system/frr-k8s-qv5cn" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.712753 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c9aa3dfb-870d-4246-96a2-4c52c470239f-frr-sockets\") pod \"frr-k8s-qv5cn\" (UID: \"c9aa3dfb-870d-4246-96a2-4c52c470239f\") " pod="metallb-system/frr-k8s-qv5cn" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.712770 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61b8c651-74f8-4532-9554-55b01e58c0e6-metrics-certs\") pod \"speaker-dd89b\" (UID: \"61b8c651-74f8-4532-9554-55b01e58c0e6\") " pod="metallb-system/speaker-dd89b" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.712801 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk686\" (UniqueName: \"kubernetes.io/projected/61b8c651-74f8-4532-9554-55b01e58c0e6-kube-api-access-sk686\") pod \"speaker-dd89b\" (UID: \"61b8c651-74f8-4532-9554-55b01e58c0e6\") " pod="metallb-system/speaker-dd89b" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.712822 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9aa3dfb-870d-4246-96a2-4c52c470239f-metrics-certs\") pod \"frr-k8s-qv5cn\" (UID: \"c9aa3dfb-870d-4246-96a2-4c52c470239f\") " pod="metallb-system/frr-k8s-qv5cn" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.712839 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c9aa3dfb-870d-4246-96a2-4c52c470239f-metrics\") pod \"frr-k8s-qv5cn\" (UID: \"c9aa3dfb-870d-4246-96a2-4c52c470239f\") " pod="metallb-system/frr-k8s-qv5cn" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.712861 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/610c85e1-ab61-4938-a33d-cbed5a873874-cert\") pod \"frr-k8s-webhook-server-64bf5d555-b5w72\" (UID: \"610c85e1-ab61-4938-a33d-cbed5a873874\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b5w72" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.712887 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/61b8c651-74f8-4532-9554-55b01e58c0e6-metallb-excludel2\") pod \"speaker-dd89b\" (UID: \"61b8c651-74f8-4532-9554-55b01e58c0e6\") " pod="metallb-system/speaker-dd89b" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.712910 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/908d6f68-9800-4794-88a0-21cca2bb3691-metrics-certs\") pod \"controller-68d546b9d8-7sgrd\" (UID: \"908d6f68-9800-4794-88a0-21cca2bb3691\") " 
pod="metallb-system/controller-68d546b9d8-7sgrd" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.712929 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r68t9\" (UniqueName: \"kubernetes.io/projected/908d6f68-9800-4794-88a0-21cca2bb3691-kube-api-access-r68t9\") pod \"controller-68d546b9d8-7sgrd\" (UID: \"908d6f68-9800-4794-88a0-21cca2bb3691\") " pod="metallb-system/controller-68d546b9d8-7sgrd" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.712949 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c9aa3dfb-870d-4246-96a2-4c52c470239f-frr-startup\") pod \"frr-k8s-qv5cn\" (UID: \"c9aa3dfb-870d-4246-96a2-4c52c470239f\") " pod="metallb-system/frr-k8s-qv5cn" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.712965 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/61b8c651-74f8-4532-9554-55b01e58c0e6-memberlist\") pod \"speaker-dd89b\" (UID: \"61b8c651-74f8-4532-9554-55b01e58c0e6\") " pod="metallb-system/speaker-dd89b" Oct 12 05:53:42 crc kubenswrapper[4930]: E1012 05:53:42.712990 4930 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Oct 12 05:53:42 crc kubenswrapper[4930]: E1012 05:53:42.713046 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9aa3dfb-870d-4246-96a2-4c52c470239f-metrics-certs podName:c9aa3dfb-870d-4246-96a2-4c52c470239f nodeName:}" failed. No retries permitted until 2025-10-12 05:53:43.213030449 +0000 UTC m=+755.755132204 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c9aa3dfb-870d-4246-96a2-4c52c470239f-metrics-certs") pod "frr-k8s-qv5cn" (UID: "c9aa3dfb-870d-4246-96a2-4c52c470239f") : secret "frr-k8s-certs-secret" not found Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.713136 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c9aa3dfb-870d-4246-96a2-4c52c470239f-reloader\") pod \"frr-k8s-qv5cn\" (UID: \"c9aa3dfb-870d-4246-96a2-4c52c470239f\") " pod="metallb-system/frr-k8s-qv5cn" Oct 12 05:53:42 crc kubenswrapper[4930]: E1012 05:53:42.713287 4930 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Oct 12 05:53:42 crc kubenswrapper[4930]: E1012 05:53:42.713334 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/610c85e1-ab61-4938-a33d-cbed5a873874-cert podName:610c85e1-ab61-4938-a33d-cbed5a873874 nodeName:}" failed. No retries permitted until 2025-10-12 05:53:43.213318056 +0000 UTC m=+755.755419821 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/610c85e1-ab61-4938-a33d-cbed5a873874-cert") pod "frr-k8s-webhook-server-64bf5d555-b5w72" (UID: "610c85e1-ab61-4938-a33d-cbed5a873874") : secret "frr-k8s-webhook-server-cert" not found Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.713388 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c9aa3dfb-870d-4246-96a2-4c52c470239f-metrics\") pod \"frr-k8s-qv5cn\" (UID: \"c9aa3dfb-870d-4246-96a2-4c52c470239f\") " pod="metallb-system/frr-k8s-qv5cn" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.713507 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c9aa3dfb-870d-4246-96a2-4c52c470239f-frr-sockets\") pod \"frr-k8s-qv5cn\" (UID: \"c9aa3dfb-870d-4246-96a2-4c52c470239f\") " pod="metallb-system/frr-k8s-qv5cn" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.713542 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c9aa3dfb-870d-4246-96a2-4c52c470239f-frr-conf\") pod \"frr-k8s-qv5cn\" (UID: \"c9aa3dfb-870d-4246-96a2-4c52c470239f\") " pod="metallb-system/frr-k8s-qv5cn" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.714152 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c9aa3dfb-870d-4246-96a2-4c52c470239f-frr-startup\") pod \"frr-k8s-qv5cn\" (UID: \"c9aa3dfb-870d-4246-96a2-4c52c470239f\") " pod="metallb-system/frr-k8s-qv5cn" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.730555 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q6jz\" (UniqueName: \"kubernetes.io/projected/610c85e1-ab61-4938-a33d-cbed5a873874-kube-api-access-9q6jz\") pod \"frr-k8s-webhook-server-64bf5d555-b5w72\" (UID: \"610c85e1-ab61-4938-a33d-cbed5a873874\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b5w72" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.737896 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spgfz\" (UniqueName: \"kubernetes.io/projected/c9aa3dfb-870d-4246-96a2-4c52c470239f-kube-api-access-spgfz\") pod \"frr-k8s-qv5cn\" (UID: \"c9aa3dfb-870d-4246-96a2-4c52c470239f\") " pod="metallb-system/frr-k8s-qv5cn" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.813706 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/908d6f68-9800-4794-88a0-21cca2bb3691-cert\") pod \"controller-68d546b9d8-7sgrd\" (UID: \"908d6f68-9800-4794-88a0-21cca2bb3691\") " pod="metallb-system/controller-68d546b9d8-7sgrd" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.813918 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61b8c651-74f8-4532-9554-55b01e58c0e6-metrics-certs\") pod \"speaker-dd89b\" (UID: \"61b8c651-74f8-4532-9554-55b01e58c0e6\") " pod="metallb-system/speaker-dd89b" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.813948 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk686\" (UniqueName: \"kubernetes.io/projected/61b8c651-74f8-4532-9554-55b01e58c0e6-kube-api-access-sk686\") pod \"speaker-dd89b\" (UID: \"61b8c651-74f8-4532-9554-55b01e58c0e6\") " pod="metallb-system/speaker-dd89b" Oct 
12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.814057 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/61b8c651-74f8-4532-9554-55b01e58c0e6-metallb-excludel2\") pod \"speaker-dd89b\" (UID: \"61b8c651-74f8-4532-9554-55b01e58c0e6\") " pod="metallb-system/speaker-dd89b" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.814080 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/908d6f68-9800-4794-88a0-21cca2bb3691-metrics-certs\") pod \"controller-68d546b9d8-7sgrd\" (UID: \"908d6f68-9800-4794-88a0-21cca2bb3691\") " pod="metallb-system/controller-68d546b9d8-7sgrd" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.814369 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r68t9\" (UniqueName: \"kubernetes.io/projected/908d6f68-9800-4794-88a0-21cca2bb3691-kube-api-access-r68t9\") pod \"controller-68d546b9d8-7sgrd\" (UID: \"908d6f68-9800-4794-88a0-21cca2bb3691\") " pod="metallb-system/controller-68d546b9d8-7sgrd" Oct 12 05:53:42 crc kubenswrapper[4930]: E1012 05:53:42.814248 4930 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Oct 12 05:53:42 crc kubenswrapper[4930]: E1012 05:53:42.814457 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/908d6f68-9800-4794-88a0-21cca2bb3691-metrics-certs podName:908d6f68-9800-4794-88a0-21cca2bb3691 nodeName:}" failed. No retries permitted until 2025-10-12 05:53:43.314440066 +0000 UTC m=+755.856541831 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/908d6f68-9800-4794-88a0-21cca2bb3691-metrics-certs") pod "controller-68d546b9d8-7sgrd" (UID: "908d6f68-9800-4794-88a0-21cca2bb3691") : secret "controller-certs-secret" not found Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.814548 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/61b8c651-74f8-4532-9554-55b01e58c0e6-memberlist\") pod \"speaker-dd89b\" (UID: \"61b8c651-74f8-4532-9554-55b01e58c0e6\") " pod="metallb-system/speaker-dd89b" Oct 12 05:53:42 crc kubenswrapper[4930]: E1012 05:53:42.814559 4930 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 12 05:53:42 crc kubenswrapper[4930]: E1012 05:53:42.814624 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61b8c651-74f8-4532-9554-55b01e58c0e6-memberlist podName:61b8c651-74f8-4532-9554-55b01e58c0e6 nodeName:}" failed. No retries permitted until 2025-10-12 05:53:43.31460648 +0000 UTC m=+755.856708235 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/61b8c651-74f8-4532-9554-55b01e58c0e6-memberlist") pod "speaker-dd89b" (UID: "61b8c651-74f8-4532-9554-55b01e58c0e6") : secret "metallb-memberlist" not found Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.815021 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/61b8c651-74f8-4532-9554-55b01e58c0e6-metallb-excludel2\") pod \"speaker-dd89b\" (UID: \"61b8c651-74f8-4532-9554-55b01e58c0e6\") " pod="metallb-system/speaker-dd89b" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.816165 4930 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.824429 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61b8c651-74f8-4532-9554-55b01e58c0e6-metrics-certs\") pod \"speaker-dd89b\" (UID: \"61b8c651-74f8-4532-9554-55b01e58c0e6\") " pod="metallb-system/speaker-dd89b" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.831157 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/908d6f68-9800-4794-88a0-21cca2bb3691-cert\") pod \"controller-68d546b9d8-7sgrd\" (UID: \"908d6f68-9800-4794-88a0-21cca2bb3691\") " pod="metallb-system/controller-68d546b9d8-7sgrd" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.834881 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r68t9\" (UniqueName: \"kubernetes.io/projected/908d6f68-9800-4794-88a0-21cca2bb3691-kube-api-access-r68t9\") pod \"controller-68d546b9d8-7sgrd\" (UID: \"908d6f68-9800-4794-88a0-21cca2bb3691\") " pod="metallb-system/controller-68d546b9d8-7sgrd" Oct 12 05:53:42 crc kubenswrapper[4930]: I1012 05:53:42.835480 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk686\" (UniqueName: \"kubernetes.io/projected/61b8c651-74f8-4532-9554-55b01e58c0e6-kube-api-access-sk686\") pod \"speaker-dd89b\" (UID: \"61b8c651-74f8-4532-9554-55b01e58c0e6\") " pod="metallb-system/speaker-dd89b" Oct 12 05:53:43 crc kubenswrapper[4930]: I1012 05:53:43.219795 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9aa3dfb-870d-4246-96a2-4c52c470239f-metrics-certs\") pod \"frr-k8s-qv5cn\" (UID: \"c9aa3dfb-870d-4246-96a2-4c52c470239f\") " pod="metallb-system/frr-k8s-qv5cn" Oct 12 05:53:43 crc kubenswrapper[4930]: I1012 05:53:43.220114 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/610c85e1-ab61-4938-a33d-cbed5a873874-cert\") pod \"frr-k8s-webhook-server-64bf5d555-b5w72\" (UID: \"610c85e1-ab61-4938-a33d-cbed5a873874\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b5w72" Oct 12 05:53:43 crc kubenswrapper[4930]: I1012 05:53:43.223983 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9aa3dfb-870d-4246-96a2-4c52c470239f-metrics-certs\") pod \"frr-k8s-qv5cn\" (UID: \"c9aa3dfb-870d-4246-96a2-4c52c470239f\") " pod="metallb-system/frr-k8s-qv5cn" Oct 12 05:53:43 crc kubenswrapper[4930]: I1012 05:53:43.225540 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/610c85e1-ab61-4938-a33d-cbed5a873874-cert\") pod \"frr-k8s-webhook-server-64bf5d555-b5w72\" (UID: \"610c85e1-ab61-4938-a33d-cbed5a873874\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b5w72" Oct 12 05:53:43 crc kubenswrapper[4930]: I1012 05:53:43.287543 4930 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 12 05:53:43 crc kubenswrapper[4930]: I1012 05:53:43.321009 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/908d6f68-9800-4794-88a0-21cca2bb3691-metrics-certs\") pod \"controller-68d546b9d8-7sgrd\" (UID: \"908d6f68-9800-4794-88a0-21cca2bb3691\") " pod="metallb-system/controller-68d546b9d8-7sgrd" Oct 12 05:53:43 crc kubenswrapper[4930]: I1012 05:53:43.321062 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/61b8c651-74f8-4532-9554-55b01e58c0e6-memberlist\") pod \"speaker-dd89b\" (UID: \"61b8c651-74f8-4532-9554-55b01e58c0e6\") " pod="metallb-system/speaker-dd89b" Oct 12 05:53:43 crc kubenswrapper[4930]: E1012 05:53:43.321244 4930 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 12 05:53:43 crc kubenswrapper[4930]: E1012 05:53:43.321312 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61b8c651-74f8-4532-9554-55b01e58c0e6-memberlist podName:61b8c651-74f8-4532-9554-55b01e58c0e6 nodeName:}" failed. No retries permitted until 2025-10-12 05:53:44.321297677 +0000 UTC m=+756.863399442 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/61b8c651-74f8-4532-9554-55b01e58c0e6-memberlist") pod "speaker-dd89b" (UID: "61b8c651-74f8-4532-9554-55b01e58c0e6") : secret "metallb-memberlist" not found Oct 12 05:53:43 crc kubenswrapper[4930]: I1012 05:53:43.325643 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/908d6f68-9800-4794-88a0-21cca2bb3691-metrics-certs\") pod \"controller-68d546b9d8-7sgrd\" (UID: \"908d6f68-9800-4794-88a0-21cca2bb3691\") " pod="metallb-system/controller-68d546b9d8-7sgrd" Oct 12 05:53:43 crc kubenswrapper[4930]: I1012 05:53:43.393316 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-qv5cn" Oct 12 05:53:43 crc kubenswrapper[4930]: I1012 05:53:43.405410 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b5w72" Oct 12 05:53:43 crc kubenswrapper[4930]: I1012 05:53:43.499545 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-7sgrd" Oct 12 05:53:43 crc kubenswrapper[4930]: I1012 05:53:43.832594 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-b5w72"] Oct 12 05:53:43 crc kubenswrapper[4930]: W1012 05:53:43.842064 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod610c85e1_ab61_4938_a33d_cbed5a873874.slice/crio-27135fe49244a0349deca96d9ea8d7d046657edda5f3b3904fbf20918a8a208e WatchSource:0}: Error finding container 27135fe49244a0349deca96d9ea8d7d046657edda5f3b3904fbf20918a8a208e: Status 404 returned error can't find the container with id 27135fe49244a0349deca96d9ea8d7d046657edda5f3b3904fbf20918a8a208e Oct 12 05:53:43 crc kubenswrapper[4930]: I1012 05:53:43.974424 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qv5cn" event={"ID":"c9aa3dfb-870d-4246-96a2-4c52c470239f","Type":"ContainerStarted","Data":"e85be7255ea62f25beb4627a865f4ee56d9bc0d9d0c34e320b1a5120f86d3255"} Oct 12 05:53:43 crc kubenswrapper[4930]: I1012 05:53:43.975475 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b5w72" event={"ID":"610c85e1-ab61-4938-a33d-cbed5a873874","Type":"ContainerStarted","Data":"27135fe49244a0349deca96d9ea8d7d046657edda5f3b3904fbf20918a8a208e"} Oct 12 05:53:43 crc kubenswrapper[4930]: I1012 05:53:43.993766 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-7sgrd"] Oct 12 05:53:44 crc kubenswrapper[4930]: W1012 05:53:44.002152 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod908d6f68_9800_4794_88a0_21cca2bb3691.slice/crio-16828ea5209a159db1e79ece785ea9ab2420da65bd170bb9901601b94fd1834c WatchSource:0}: Error finding container 16828ea5209a159db1e79ece785ea9ab2420da65bd170bb9901601b94fd1834c: Status 404 returned error can't find the container with id 16828ea5209a159db1e79ece785ea9ab2420da65bd170bb9901601b94fd1834c Oct 12 05:53:44 crc kubenswrapper[4930]: I1012 05:53:44.339364 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/61b8c651-74f8-4532-9554-55b01e58c0e6-memberlist\") pod \"speaker-dd89b\" (UID: \"61b8c651-74f8-4532-9554-55b01e58c0e6\") " pod="metallb-system/speaker-dd89b" Oct 12 05:53:44 crc kubenswrapper[4930]: I1012 05:53:44.345359 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/61b8c651-74f8-4532-9554-55b01e58c0e6-memberlist\") pod \"speaker-dd89b\" (UID: \"61b8c651-74f8-4532-9554-55b01e58c0e6\") " pod="metallb-system/speaker-dd89b" Oct 12 05:53:44 crc kubenswrapper[4930]: I1012 05:53:44.385440 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-dd89b" Oct 12 05:53:44 crc kubenswrapper[4930]: I1012 05:53:44.993951 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dd89b" event={"ID":"61b8c651-74f8-4532-9554-55b01e58c0e6","Type":"ContainerStarted","Data":"9fd54e025758684f712767e2d65438ec51b814faa02b84582ce2cb371cf18ebc"} Oct 12 05:53:44 crc kubenswrapper[4930]: I1012 05:53:44.994305 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dd89b" event={"ID":"61b8c651-74f8-4532-9554-55b01e58c0e6","Type":"ContainerStarted","Data":"7e0bc969281dd5959be32b167a3692cc6faac141938da05d1bcaf5293b6b370b"} Oct 12 05:53:44 crc kubenswrapper[4930]: I1012 05:53:44.994319 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dd89b" event={"ID":"61b8c651-74f8-4532-9554-55b01e58c0e6","Type":"ContainerStarted","Data":"8ccd15db670f68004ff2aa8639b9e5d1c5b53490e7937499676f7ab728aca0b8"} Oct 12 05:53:44 crc kubenswrapper[4930]: I1012 05:53:44.995363 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-dd89b" Oct 12 05:53:45 crc kubenswrapper[4930]: I1012 05:53:45.021967 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-7sgrd" event={"ID":"908d6f68-9800-4794-88a0-21cca2bb3691","Type":"ContainerStarted","Data":"78adc6d984eb5947641a0e9eaf74206f75568290323a214635f094a1d6d469c2"} Oct 12 05:53:45 crc kubenswrapper[4930]: I1012 05:53:45.022031 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-7sgrd" event={"ID":"908d6f68-9800-4794-88a0-21cca2bb3691","Type":"ContainerStarted","Data":"58170cb62b9409c679b5725b084cee0bbfa0037c35a9ed9cd51897fef47ece13"} Oct 12 05:53:45 crc kubenswrapper[4930]: I1012 05:53:45.022041 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-7sgrd" event={"ID":"908d6f68-9800-4794-88a0-21cca2bb3691","Type":"ContainerStarted","Data":"16828ea5209a159db1e79ece785ea9ab2420da65bd170bb9901601b94fd1834c"} Oct 12 05:53:45 crc kubenswrapper[4930]: I1012 05:53:45.022808 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-7sgrd" Oct 12 05:53:45 crc kubenswrapper[4930]: I1012 05:53:45.023695 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-dd89b" podStartSLOduration=3.023686128 podStartE2EDuration="3.023686128s" podCreationTimestamp="2025-10-12 05:53:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:53:45.022697333 +0000 UTC m=+757.564799098" watchObservedRunningTime="2025-10-12 05:53:45.023686128 +0000 UTC m=+757.565787893" Oct 12 05:53:45 crc kubenswrapper[4930]: I1012 05:53:45.059707 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-7sgrd" podStartSLOduration=3.059693843 podStartE2EDuration="3.059693843s" podCreationTimestamp="2025-10-12 05:53:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:53:45.056361708 +0000 UTC m=+757.598463473" watchObservedRunningTime="2025-10-12 05:53:45.059693843 +0000 UTC m=+757.601795608" Oct 12 05:53:49 crc kubenswrapper[4930]: I1012 05:53:49.089440 4930 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-4dfvt"] Oct 12 05:53:49 crc kubenswrapper[4930]: I1012 05:53:49.091497 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4dfvt" Oct 12 05:53:49 crc kubenswrapper[4930]: I1012 05:53:49.095216 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4dfvt"] Oct 12 05:53:49 crc kubenswrapper[4930]: I1012 05:53:49.220908 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7e86f7a-5516-4516-b82a-29c8f4339cdf-catalog-content\") pod \"redhat-operators-4dfvt\" (UID: \"a7e86f7a-5516-4516-b82a-29c8f4339cdf\") " pod="openshift-marketplace/redhat-operators-4dfvt" Oct 12 05:53:49 crc kubenswrapper[4930]: I1012 05:53:49.220967 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7e86f7a-5516-4516-b82a-29c8f4339cdf-utilities\") pod \"redhat-operators-4dfvt\" (UID: \"a7e86f7a-5516-4516-b82a-29c8f4339cdf\") " pod="openshift-marketplace/redhat-operators-4dfvt" Oct 12 05:53:49 crc kubenswrapper[4930]: I1012 05:53:49.221024 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv8x5\" (UniqueName: \"kubernetes.io/projected/a7e86f7a-5516-4516-b82a-29c8f4339cdf-kube-api-access-dv8x5\") pod \"redhat-operators-4dfvt\" (UID: \"a7e86f7a-5516-4516-b82a-29c8f4339cdf\") " pod="openshift-marketplace/redhat-operators-4dfvt" Oct 12 05:53:49 crc kubenswrapper[4930]: I1012 05:53:49.322772 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7e86f7a-5516-4516-b82a-29c8f4339cdf-catalog-content\") pod \"redhat-operators-4dfvt\" (UID: \"a7e86f7a-5516-4516-b82a-29c8f4339cdf\") " pod="openshift-marketplace/redhat-operators-4dfvt" Oct 12 05:53:49 crc kubenswrapper[4930]: I1012 05:53:49.322829 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7e86f7a-5516-4516-b82a-29c8f4339cdf-utilities\") pod \"redhat-operators-4dfvt\" (UID: \"a7e86f7a-5516-4516-b82a-29c8f4339cdf\") " pod="openshift-marketplace/redhat-operators-4dfvt" Oct 12 05:53:49 crc kubenswrapper[4930]: I1012 05:53:49.322857 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv8x5\" (UniqueName: \"kubernetes.io/projected/a7e86f7a-5516-4516-b82a-29c8f4339cdf-kube-api-access-dv8x5\") pod \"redhat-operators-4dfvt\" (UID: \"a7e86f7a-5516-4516-b82a-29c8f4339cdf\") " pod="openshift-marketplace/redhat-operators-4dfvt" Oct 12 05:53:49 crc kubenswrapper[4930]: I1012 05:53:49.323569 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7e86f7a-5516-4516-b82a-29c8f4339cdf-catalog-content\") pod \"redhat-operators-4dfvt\" (UID: \"a7e86f7a-5516-4516-b82a-29c8f4339cdf\") " pod="openshift-marketplace/redhat-operators-4dfvt" Oct 12 05:53:49 crc kubenswrapper[4930]: I1012 05:53:49.323592 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7e86f7a-5516-4516-b82a-29c8f4339cdf-utilities\") pod \"redhat-operators-4dfvt\" (UID: \"a7e86f7a-5516-4516-b82a-29c8f4339cdf\") " 
pod="openshift-marketplace/redhat-operators-4dfvt" Oct 12 05:53:49 crc kubenswrapper[4930]: I1012 05:53:49.346150 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv8x5\" (UniqueName: \"kubernetes.io/projected/a7e86f7a-5516-4516-b82a-29c8f4339cdf-kube-api-access-dv8x5\") pod \"redhat-operators-4dfvt\" (UID: \"a7e86f7a-5516-4516-b82a-29c8f4339cdf\") " pod="openshift-marketplace/redhat-operators-4dfvt" Oct 12 05:53:49 crc kubenswrapper[4930]: I1012 05:53:49.446331 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4dfvt" Oct 12 05:53:51 crc kubenswrapper[4930]: I1012 05:53:51.851176 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4dfvt"] Oct 12 05:53:51 crc kubenswrapper[4930]: W1012 05:53:51.862674 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7e86f7a_5516_4516_b82a_29c8f4339cdf.slice/crio-059921041ca849703deed2426b127724881e847f7cb50ba9ca2f26b491679ac0 WatchSource:0}: Error finding container 059921041ca849703deed2426b127724881e847f7cb50ba9ca2f26b491679ac0: Status 404 returned error can't find the container with id 059921041ca849703deed2426b127724881e847f7cb50ba9ca2f26b491679ac0 Oct 12 05:53:52 crc kubenswrapper[4930]: I1012 05:53:52.075196 4930 generic.go:334] "Generic (PLEG): container finished" podID="c9aa3dfb-870d-4246-96a2-4c52c470239f" containerID="761cf2c98f723f4b82167eca90e0a905c124fd950d80ec4559e253986b03dc44" exitCode=0 Oct 12 05:53:52 crc kubenswrapper[4930]: I1012 05:53:52.075310 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qv5cn" event={"ID":"c9aa3dfb-870d-4246-96a2-4c52c470239f","Type":"ContainerDied","Data":"761cf2c98f723f4b82167eca90e0a905c124fd950d80ec4559e253986b03dc44"} Oct 12 05:53:52 crc kubenswrapper[4930]: I1012 05:53:52.079324 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b5w72" event={"ID":"610c85e1-ab61-4938-a33d-cbed5a873874","Type":"ContainerStarted","Data":"4ffe7605dd46ccd4bcb78556d4728509e7a802039d0a31f9bc2dbf9da630511d"} Oct 12 05:53:52 crc kubenswrapper[4930]: I1012 05:53:52.079558 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b5w72" Oct 12 05:53:52 crc kubenswrapper[4930]: I1012 05:53:52.081383 4930 generic.go:334] "Generic (PLEG): container finished" podID="a7e86f7a-5516-4516-b82a-29c8f4339cdf" containerID="9a9ee19a88e6e08deca8b639875792be41b4d61b6b7086b20c87a0088191a248" exitCode=0 Oct 12 05:53:52 crc kubenswrapper[4930]: I1012 05:53:52.081434 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4dfvt" event={"ID":"a7e86f7a-5516-4516-b82a-29c8f4339cdf","Type":"ContainerDied","Data":"9a9ee19a88e6e08deca8b639875792be41b4d61b6b7086b20c87a0088191a248"} Oct 12 05:53:52 crc kubenswrapper[4930]: I1012 05:53:52.081499 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4dfvt" event={"ID":"a7e86f7a-5516-4516-b82a-29c8f4339cdf","Type":"ContainerStarted","Data":"059921041ca849703deed2426b127724881e847f7cb50ba9ca2f26b491679ac0"} Oct 12 05:53:52 crc kubenswrapper[4930]: I1012 05:53:52.129158 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b5w72" podStartSLOduration=2.4374291599999998 
podStartE2EDuration="10.129143605s" podCreationTimestamp="2025-10-12 05:53:42 +0000 UTC" firstStartedPulling="2025-10-12 05:53:43.844548804 +0000 UTC m=+756.386650569" lastFinishedPulling="2025-10-12 05:53:51.536263239 +0000 UTC m=+764.078365014" observedRunningTime="2025-10-12 05:53:52.12463736 +0000 UTC m=+764.666739125" watchObservedRunningTime="2025-10-12 05:53:52.129143605 +0000 UTC m=+764.671245370" Oct 12 05:53:53 crc kubenswrapper[4930]: I1012 05:53:53.089665 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4dfvt" event={"ID":"a7e86f7a-5516-4516-b82a-29c8f4339cdf","Type":"ContainerStarted","Data":"6947d8b8b2195daa13c692c7213291ceec2670b5b25ee1b55a18262aa09e28a4"} Oct 12 05:53:53 crc kubenswrapper[4930]: I1012 05:53:53.092442 4930 generic.go:334] "Generic (PLEG): container finished" podID="c9aa3dfb-870d-4246-96a2-4c52c470239f" containerID="0eae0189fb1895c2006668b703ec8321c13bd4dc42e9f53447684a57e6bd3f3c" exitCode=0 Oct 12 05:53:53 crc kubenswrapper[4930]: I1012 05:53:53.092923 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qv5cn" event={"ID":"c9aa3dfb-870d-4246-96a2-4c52c470239f","Type":"ContainerDied","Data":"0eae0189fb1895c2006668b703ec8321c13bd4dc42e9f53447684a57e6bd3f3c"} Oct 12 05:53:54 crc kubenswrapper[4930]: I1012 05:53:54.105218 4930 generic.go:334] "Generic (PLEG): container finished" podID="c9aa3dfb-870d-4246-96a2-4c52c470239f" containerID="0b98e94e1826434c90156c2a2dce13aa492d198360ab1ab7bc6f458b85f15f0d" exitCode=0 Oct 12 05:53:54 crc kubenswrapper[4930]: I1012 05:53:54.105311 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qv5cn" event={"ID":"c9aa3dfb-870d-4246-96a2-4c52c470239f","Type":"ContainerDied","Data":"0b98e94e1826434c90156c2a2dce13aa492d198360ab1ab7bc6f458b85f15f0d"} Oct 12 05:53:54 crc kubenswrapper[4930]: I1012 05:53:54.110023 4930 generic.go:334] "Generic (PLEG): container finished" podID="a7e86f7a-5516-4516-b82a-29c8f4339cdf" containerID="6947d8b8b2195daa13c692c7213291ceec2670b5b25ee1b55a18262aa09e28a4" exitCode=0 Oct 12 05:53:54 crc kubenswrapper[4930]: I1012 05:53:54.110110 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4dfvt" event={"ID":"a7e86f7a-5516-4516-b82a-29c8f4339cdf","Type":"ContainerDied","Data":"6947d8b8b2195daa13c692c7213291ceec2670b5b25ee1b55a18262aa09e28a4"} Oct 12 05:53:54 crc kubenswrapper[4930]: I1012 05:53:54.390697 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-dd89b" Oct 12 05:53:55 crc kubenswrapper[4930]: I1012 05:53:55.123624 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4dfvt" event={"ID":"a7e86f7a-5516-4516-b82a-29c8f4339cdf","Type":"ContainerStarted","Data":"ed4ecca54c6fc3a5878a063686deadff250e4e1e9e38ee786986ae14f8e6a72c"} Oct 12 05:53:55 crc kubenswrapper[4930]: I1012 05:53:55.128010 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qv5cn" event={"ID":"c9aa3dfb-870d-4246-96a2-4c52c470239f","Type":"ContainerStarted","Data":"b198ee9e535d9cadbedba837c7f570bfea0144ec367462ab3c760cbf66010336"} Oct 12 05:53:55 crc kubenswrapper[4930]: I1012 05:53:55.128047 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qv5cn" event={"ID":"c9aa3dfb-870d-4246-96a2-4c52c470239f","Type":"ContainerStarted","Data":"5afd692d08f5c7b655889a4b400d27b7b0363dcae167755a624597117155aac5"} Oct 12 05:53:55 crc 
Oct 12 05:53:55 crc kubenswrapper[4930]: I1012 05:53:55.128057 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qv5cn" event={"ID":"c9aa3dfb-870d-4246-96a2-4c52c470239f","Type":"ContainerStarted","Data":"9b89f78ec0967cf9b4ff020b1ae4ee0b4cf11a36ea6de257b7ba96b467f33dfd"}
Oct 12 05:53:55 crc kubenswrapper[4930]: I1012 05:53:55.128066 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qv5cn" event={"ID":"c9aa3dfb-870d-4246-96a2-4c52c470239f","Type":"ContainerStarted","Data":"3958184e8a73ad9a43de09b6e3373407f7f66b9efa074000a584f3ff027a20aa"}
Oct 12 05:53:55 crc kubenswrapper[4930]: I1012 05:53:55.128076 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qv5cn" event={"ID":"c9aa3dfb-870d-4246-96a2-4c52c470239f","Type":"ContainerStarted","Data":"6516c29898169dc50db58fa5b9ea3757c163f53ca1861b8d8e5676416ba5fb6c"}
Oct 12 05:53:56 crc kubenswrapper[4930]: I1012 05:53:56.163155 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qv5cn" event={"ID":"c9aa3dfb-870d-4246-96a2-4c52c470239f","Type":"ContainerStarted","Data":"f872daa0fdc6476e71a35beb583780b94c70b57cec24f548a75e9f5327d9339c"}
Oct 12 05:53:56 crc kubenswrapper[4930]: I1012 05:53:56.163206 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-qv5cn"
Oct 12 05:53:56 crc kubenswrapper[4930]: I1012 05:53:56.194132 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-qv5cn" podStartSLOduration=6.21508809 podStartE2EDuration="14.194120296s" podCreationTimestamp="2025-10-12 05:53:42 +0000 UTC" firstStartedPulling="2025-10-12 05:53:43.519578176 +0000 UTC m=+756.061679941" lastFinishedPulling="2025-10-12 05:53:51.498610372 +0000 UTC m=+764.040712147" observedRunningTime="2025-10-12 05:53:56.193265284 +0000 UTC m=+768.735367049" watchObservedRunningTime="2025-10-12 05:53:56.194120296 +0000 UTC m=+768.736222051"
Oct 12 05:53:56 crc kubenswrapper[4930]: I1012 05:53:56.195191 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4dfvt" podStartSLOduration=4.773767408 podStartE2EDuration="7.195186973s" podCreationTimestamp="2025-10-12 05:53:49 +0000 UTC" firstStartedPulling="2025-10-12 05:53:52.083274079 +0000 UTC m=+764.625375874" lastFinishedPulling="2025-10-12 05:53:54.504693664 +0000 UTC m=+767.046795439" observedRunningTime="2025-10-12 05:53:55.147454198 +0000 UTC m=+767.689555983" watchObservedRunningTime="2025-10-12 05:53:56.195186973 +0000 UTC m=+768.737288728"
Oct 12 05:53:57 crc kubenswrapper[4930]: I1012 05:53:57.912229 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-std5c"]
Oct 12 05:53:57 crc kubenswrapper[4930]: I1012 05:53:57.913307 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-std5c"
Oct 12 05:53:57 crc kubenswrapper[4930]: I1012 05:53:57.915769 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Oct 12 05:53:57 crc kubenswrapper[4930]: I1012 05:53:57.916681 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Oct 12 05:53:57 crc kubenswrapper[4930]: I1012 05:53:57.927075 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-std5c"]
Oct 12 05:53:58 crc kubenswrapper[4930]: I1012 05:53:58.054826 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpg2m\" (UniqueName: \"kubernetes.io/projected/d6c53d81-c42a-42a8-9017-a68663e7934f-kube-api-access-kpg2m\") pod \"openstack-operator-index-std5c\" (UID: \"d6c53d81-c42a-42a8-9017-a68663e7934f\") " pod="openstack-operators/openstack-operator-index-std5c"
Oct 12 05:53:58 crc kubenswrapper[4930]: I1012 05:53:58.155720 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpg2m\" (UniqueName: \"kubernetes.io/projected/d6c53d81-c42a-42a8-9017-a68663e7934f-kube-api-access-kpg2m\") pod \"openstack-operator-index-std5c\" (UID: \"d6c53d81-c42a-42a8-9017-a68663e7934f\") " pod="openstack-operators/openstack-operator-index-std5c"
Oct 12 05:53:58 crc kubenswrapper[4930]: I1012 05:53:58.181649 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpg2m\" (UniqueName: \"kubernetes.io/projected/d6c53d81-c42a-42a8-9017-a68663e7934f-kube-api-access-kpg2m\") pod \"openstack-operator-index-std5c\" (UID: \"d6c53d81-c42a-42a8-9017-a68663e7934f\") " pod="openstack-operators/openstack-operator-index-std5c"
Oct 12 05:53:58 crc kubenswrapper[4930]: I1012 05:53:58.231565 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-std5c"
Oct 12 05:53:58 crc kubenswrapper[4930]: I1012 05:53:58.394239 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-qv5cn"
Oct 12 05:53:58 crc kubenswrapper[4930]: I1012 05:53:58.432232 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-qv5cn"
Oct 12 05:53:58 crc kubenswrapper[4930]: I1012 05:53:58.664403 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-std5c"]
Oct 12 05:53:58 crc kubenswrapper[4930]: W1012 05:53:58.671769 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6c53d81_c42a_42a8_9017_a68663e7934f.slice/crio-9e6f9ff6411f3fe07270e2957a573a8337a2512c9c6d223b356e3621f1286f86 WatchSource:0}: Error finding container 9e6f9ff6411f3fe07270e2957a573a8337a2512c9c6d223b356e3621f1286f86: Status 404 returned error can't find the container with id 9e6f9ff6411f3fe07270e2957a573a8337a2512c9c6d223b356e3621f1286f86
Oct 12 05:53:59 crc kubenswrapper[4930]: I1012 05:53:59.184959 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-std5c" event={"ID":"d6c53d81-c42a-42a8-9017-a68663e7934f","Type":"ContainerStarted","Data":"9e6f9ff6411f3fe07270e2957a573a8337a2512c9c6d223b356e3621f1286f86"}
Oct 12 05:53:59 crc kubenswrapper[4930]: I1012 05:53:59.446875 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4dfvt"
Oct 12 05:53:59 crc kubenswrapper[4930]: I1012 05:53:59.446977 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4dfvt"
Oct 12 05:54:00 crc kubenswrapper[4930]: I1012 05:54:00.193007 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-std5c" event={"ID":"d6c53d81-c42a-42a8-9017-a68663e7934f","Type":"ContainerStarted","Data":"eb72d5103db9fafcf5efe6dec104a336a63caf382708c18ff67902aaf2094cfb"}
Oct 12 05:54:00 crc kubenswrapper[4930]: I1012 05:54:00.218675 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-std5c" podStartSLOduration=2.279631437 podStartE2EDuration="3.218644439s" podCreationTimestamp="2025-10-12 05:53:57 +0000 UTC" firstStartedPulling="2025-10-12 05:53:58.673446202 +0000 UTC m=+771.215547997" lastFinishedPulling="2025-10-12 05:53:59.612459194 +0000 UTC m=+772.154560999" observedRunningTime="2025-10-12 05:54:00.209213389 +0000 UTC m=+772.751315154" watchObservedRunningTime="2025-10-12 05:54:00.218644439 +0000 UTC m=+772.760746244"
Oct 12 05:54:00 crc kubenswrapper[4930]: I1012 05:54:00.514014 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4dfvt" podUID="a7e86f7a-5516-4516-b82a-29c8f4339cdf" containerName="registry-server" probeResult="failure" output=<
Oct 12 05:54:00 crc kubenswrapper[4930]: timeout: failed to connect service ":50051" within 1s
Oct 12 05:54:00 crc kubenswrapper[4930]: >
Oct 12 05:54:01 crc kubenswrapper[4930]: I1012 05:54:01.295115 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-std5c"]
pods=["openstack-operators/openstack-operator-index-6prjm"] Oct 12 05:54:01 crc kubenswrapper[4930]: I1012 05:54:01.911269 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-6prjm" Oct 12 05:54:01 crc kubenswrapper[4930]: I1012 05:54:01.917506 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-dz72b" Oct 12 05:54:01 crc kubenswrapper[4930]: I1012 05:54:01.920784 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6prjm"] Oct 12 05:54:02 crc kubenswrapper[4930]: I1012 05:54:02.019782 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w67tj\" (UniqueName: \"kubernetes.io/projected/2214be78-67f7-4965-b3a0-8c1401ff658c-kube-api-access-w67tj\") pod \"openstack-operator-index-6prjm\" (UID: \"2214be78-67f7-4965-b3a0-8c1401ff658c\") " pod="openstack-operators/openstack-operator-index-6prjm" Oct 12 05:54:02 crc kubenswrapper[4930]: I1012 05:54:02.122113 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w67tj\" (UniqueName: \"kubernetes.io/projected/2214be78-67f7-4965-b3a0-8c1401ff658c-kube-api-access-w67tj\") pod \"openstack-operator-index-6prjm\" (UID: \"2214be78-67f7-4965-b3a0-8c1401ff658c\") " pod="openstack-operators/openstack-operator-index-6prjm" Oct 12 05:54:02 crc kubenswrapper[4930]: I1012 05:54:02.157578 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w67tj\" (UniqueName: \"kubernetes.io/projected/2214be78-67f7-4965-b3a0-8c1401ff658c-kube-api-access-w67tj\") pod \"openstack-operator-index-6prjm\" (UID: \"2214be78-67f7-4965-b3a0-8c1401ff658c\") " pod="openstack-operators/openstack-operator-index-6prjm" Oct 12 05:54:02 crc kubenswrapper[4930]: I1012 05:54:02.212447 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-std5c" podUID="d6c53d81-c42a-42a8-9017-a68663e7934f" containerName="registry-server" containerID="cri-o://eb72d5103db9fafcf5efe6dec104a336a63caf382708c18ff67902aaf2094cfb" gracePeriod=2 Oct 12 05:54:02 crc kubenswrapper[4930]: I1012 05:54:02.245454 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-6prjm" Oct 12 05:54:02 crc kubenswrapper[4930]: I1012 05:54:02.726762 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-std5c" Oct 12 05:54:02 crc kubenswrapper[4930]: I1012 05:54:02.827391 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6prjm"] Oct 12 05:54:02 crc kubenswrapper[4930]: I1012 05:54:02.831224 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpg2m\" (UniqueName: \"kubernetes.io/projected/d6c53d81-c42a-42a8-9017-a68663e7934f-kube-api-access-kpg2m\") pod \"d6c53d81-c42a-42a8-9017-a68663e7934f\" (UID: \"d6c53d81-c42a-42a8-9017-a68663e7934f\") " Oct 12 05:54:02 crc kubenswrapper[4930]: W1012 05:54:02.832632 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2214be78_67f7_4965_b3a0_8c1401ff658c.slice/crio-1619a5d050bfb24cc797a2b507ae34335ff38396f25dbfd8916b5f936e8bdbb4 WatchSource:0}: Error finding container 1619a5d050bfb24cc797a2b507ae34335ff38396f25dbfd8916b5f936e8bdbb4: Status 404 returned error can't find the container with id 1619a5d050bfb24cc797a2b507ae34335ff38396f25dbfd8916b5f936e8bdbb4 Oct 12 05:54:02 crc kubenswrapper[4930]: I1012 05:54:02.836165 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6c53d81-c42a-42a8-9017-a68663e7934f-kube-api-access-kpg2m" (OuterVolumeSpecName: "kube-api-access-kpg2m") pod "d6c53d81-c42a-42a8-9017-a68663e7934f" (UID: "d6c53d81-c42a-42a8-9017-a68663e7934f"). InnerVolumeSpecName "kube-api-access-kpg2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:54:02 crc kubenswrapper[4930]: I1012 05:54:02.933437 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpg2m\" (UniqueName: \"kubernetes.io/projected/d6c53d81-c42a-42a8-9017-a68663e7934f-kube-api-access-kpg2m\") on node \"crc\" DevicePath \"\"" Oct 12 05:54:03 crc kubenswrapper[4930]: I1012 05:54:03.223156 4930 generic.go:334] "Generic (PLEG): container finished" podID="d6c53d81-c42a-42a8-9017-a68663e7934f" containerID="eb72d5103db9fafcf5efe6dec104a336a63caf382708c18ff67902aaf2094cfb" exitCode=0 Oct 12 05:54:03 crc kubenswrapper[4930]: I1012 05:54:03.223220 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-std5c" event={"ID":"d6c53d81-c42a-42a8-9017-a68663e7934f","Type":"ContainerDied","Data":"eb72d5103db9fafcf5efe6dec104a336a63caf382708c18ff67902aaf2094cfb"} Oct 12 05:54:03 crc kubenswrapper[4930]: I1012 05:54:03.223296 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-std5c" event={"ID":"d6c53d81-c42a-42a8-9017-a68663e7934f","Type":"ContainerDied","Data":"9e6f9ff6411f3fe07270e2957a573a8337a2512c9c6d223b356e3621f1286f86"} Oct 12 05:54:03 crc kubenswrapper[4930]: I1012 05:54:03.223335 4930 scope.go:117] "RemoveContainer" containerID="eb72d5103db9fafcf5efe6dec104a336a63caf382708c18ff67902aaf2094cfb" Oct 12 05:54:03 crc kubenswrapper[4930]: I1012 05:54:03.223243 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-std5c" Oct 12 05:54:03 crc kubenswrapper[4930]: I1012 05:54:03.225146 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6prjm" event={"ID":"2214be78-67f7-4965-b3a0-8c1401ff658c","Type":"ContainerStarted","Data":"1619a5d050bfb24cc797a2b507ae34335ff38396f25dbfd8916b5f936e8bdbb4"} Oct 12 05:54:03 crc kubenswrapper[4930]: I1012 05:54:03.261171 4930 scope.go:117] "RemoveContainer" containerID="eb72d5103db9fafcf5efe6dec104a336a63caf382708c18ff67902aaf2094cfb" Oct 12 05:54:03 crc kubenswrapper[4930]: E1012 05:54:03.261835 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb72d5103db9fafcf5efe6dec104a336a63caf382708c18ff67902aaf2094cfb\": container with ID starting with eb72d5103db9fafcf5efe6dec104a336a63caf382708c18ff67902aaf2094cfb not found: ID does not exist" containerID="eb72d5103db9fafcf5efe6dec104a336a63caf382708c18ff67902aaf2094cfb" Oct 12 05:54:03 crc kubenswrapper[4930]: I1012 05:54:03.261889 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb72d5103db9fafcf5efe6dec104a336a63caf382708c18ff67902aaf2094cfb"} err="failed to get container status \"eb72d5103db9fafcf5efe6dec104a336a63caf382708c18ff67902aaf2094cfb\": rpc error: code = NotFound desc = could not find container \"eb72d5103db9fafcf5efe6dec104a336a63caf382708c18ff67902aaf2094cfb\": container with ID starting with eb72d5103db9fafcf5efe6dec104a336a63caf382708c18ff67902aaf2094cfb not found: ID does not exist" Oct 12 05:54:03 crc kubenswrapper[4930]: I1012 05:54:03.282558 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-std5c"] Oct 12 05:54:03 crc kubenswrapper[4930]: I1012 05:54:03.287766 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-std5c"] Oct 12 05:54:03 crc kubenswrapper[4930]: I1012 05:54:03.413612 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-b5w72" Oct 12 05:54:03 crc kubenswrapper[4930]: I1012 05:54:03.505449 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-7sgrd" Oct 12 05:54:03 crc kubenswrapper[4930]: I1012 05:54:03.670159 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 05:54:03 crc kubenswrapper[4930]: I1012 05:54:03.670256 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 05:54:03 crc kubenswrapper[4930]: I1012 05:54:03.670321 4930 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 05:54:03 crc kubenswrapper[4930]: I1012 05:54:03.671235 4930 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"8c4b30d4b3900fd24d77d876af5c72fe4eab21e88f5e73b0f78ab790ead4cf26"} pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 05:54:03 crc kubenswrapper[4930]: I1012 05:54:03.671343 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" containerID="cri-o://8c4b30d4b3900fd24d77d876af5c72fe4eab21e88f5e73b0f78ab790ead4cf26" gracePeriod=600 Oct 12 05:54:04 crc kubenswrapper[4930]: I1012 05:54:04.149595 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6c53d81-c42a-42a8-9017-a68663e7934f" path="/var/lib/kubelet/pods/d6c53d81-c42a-42a8-9017-a68663e7934f/volumes" Oct 12 05:54:04 crc kubenswrapper[4930]: I1012 05:54:04.237308 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6prjm" event={"ID":"2214be78-67f7-4965-b3a0-8c1401ff658c","Type":"ContainerStarted","Data":"1ca725e24bbfae0835dc4643c1c4844a0f6a63ded3a3fcdb4a91b95b14102fcc"} Oct 12 05:54:04 crc kubenswrapper[4930]: I1012 05:54:04.259953 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-6prjm" podStartSLOduration=2.721351451 podStartE2EDuration="3.259925268s" podCreationTimestamp="2025-10-12 05:54:01 +0000 UTC" firstStartedPulling="2025-10-12 05:54:02.836986357 +0000 UTC m=+775.379088122" lastFinishedPulling="2025-10-12 05:54:03.375560144 +0000 UTC m=+775.917661939" observedRunningTime="2025-10-12 05:54:04.256948972 +0000 UTC m=+776.799050767" watchObservedRunningTime="2025-10-12 05:54:04.259925268 +0000 UTC m=+776.802027073" Oct 12 05:54:05 crc kubenswrapper[4930]: I1012 05:54:05.249375 4930 generic.go:334] "Generic (PLEG): container finished" podID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerID="8c4b30d4b3900fd24d77d876af5c72fe4eab21e88f5e73b0f78ab790ead4cf26" exitCode=0 Oct 12 05:54:05 crc kubenswrapper[4930]: I1012 05:54:05.249551 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerDied","Data":"8c4b30d4b3900fd24d77d876af5c72fe4eab21e88f5e73b0f78ab790ead4cf26"} Oct 12 05:54:05 crc kubenswrapper[4930]: I1012 05:54:05.249867 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerStarted","Data":"86436bfc7a3b225084b0677c9406a12b80ec7ace76e26bb6e4b679b1e7256578"} Oct 12 05:54:05 crc kubenswrapper[4930]: I1012 05:54:05.249905 4930 scope.go:117] "RemoveContainer" containerID="f9b4436f5dc6b5a5cc61a2d31420352c424bca4ed9a68e98bbb43f4e86026438" Oct 12 05:54:09 crc kubenswrapper[4930]: I1012 05:54:09.529313 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4dfvt" Oct 12 05:54:09 crc kubenswrapper[4930]: I1012 05:54:09.613764 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4dfvt" Oct 12 05:54:11 crc kubenswrapper[4930]: I1012 05:54:11.898362 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4dfvt"] Oct 12 05:54:11 crc 
Oct 12 05:54:11 crc kubenswrapper[4930]: I1012 05:54:11.899116 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4dfvt" podUID="a7e86f7a-5516-4516-b82a-29c8f4339cdf" containerName="registry-server" containerID="cri-o://ed4ecca54c6fc3a5878a063686deadff250e4e1e9e38ee786986ae14f8e6a72c" gracePeriod=2
Oct 12 05:54:12 crc kubenswrapper[4930]: I1012 05:54:12.246647 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-6prjm"
Oct 12 05:54:12 crc kubenswrapper[4930]: I1012 05:54:12.247126 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-6prjm"
Oct 12 05:54:12 crc kubenswrapper[4930]: I1012 05:54:12.321534 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-6prjm"
Oct 12 05:54:12 crc kubenswrapper[4930]: I1012 05:54:12.342667 4930 generic.go:334] "Generic (PLEG): container finished" podID="a7e86f7a-5516-4516-b82a-29c8f4339cdf" containerID="ed4ecca54c6fc3a5878a063686deadff250e4e1e9e38ee786986ae14f8e6a72c" exitCode=0
Oct 12 05:54:12 crc kubenswrapper[4930]: I1012 05:54:12.342903 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4dfvt" event={"ID":"a7e86f7a-5516-4516-b82a-29c8f4339cdf","Type":"ContainerDied","Data":"ed4ecca54c6fc3a5878a063686deadff250e4e1e9e38ee786986ae14f8e6a72c"}
Oct 12 05:54:12 crc kubenswrapper[4930]: I1012 05:54:12.381619 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-6prjm"
Oct 12 05:54:12 crc kubenswrapper[4930]: I1012 05:54:12.464699 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4dfvt"
Oct 12 05:54:12 crc kubenswrapper[4930]: I1012 05:54:12.599348 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7e86f7a-5516-4516-b82a-29c8f4339cdf-utilities\") pod \"a7e86f7a-5516-4516-b82a-29c8f4339cdf\" (UID: \"a7e86f7a-5516-4516-b82a-29c8f4339cdf\") "
Oct 12 05:54:12 crc kubenswrapper[4930]: I1012 05:54:12.599524 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv8x5\" (UniqueName: \"kubernetes.io/projected/a7e86f7a-5516-4516-b82a-29c8f4339cdf-kube-api-access-dv8x5\") pod \"a7e86f7a-5516-4516-b82a-29c8f4339cdf\" (UID: \"a7e86f7a-5516-4516-b82a-29c8f4339cdf\") "
Oct 12 05:54:12 crc kubenswrapper[4930]: I1012 05:54:12.599712 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7e86f7a-5516-4516-b82a-29c8f4339cdf-catalog-content\") pod \"a7e86f7a-5516-4516-b82a-29c8f4339cdf\" (UID: \"a7e86f7a-5516-4516-b82a-29c8f4339cdf\") "
Oct 12 05:54:12 crc kubenswrapper[4930]: I1012 05:54:12.600388 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7e86f7a-5516-4516-b82a-29c8f4339cdf-utilities" (OuterVolumeSpecName: "utilities") pod "a7e86f7a-5516-4516-b82a-29c8f4339cdf" (UID: "a7e86f7a-5516-4516-b82a-29c8f4339cdf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 05:54:12 crc kubenswrapper[4930]: I1012 05:54:12.605894 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7e86f7a-5516-4516-b82a-29c8f4339cdf-kube-api-access-dv8x5" (OuterVolumeSpecName: "kube-api-access-dv8x5") pod "a7e86f7a-5516-4516-b82a-29c8f4339cdf" (UID: "a7e86f7a-5516-4516-b82a-29c8f4339cdf"). InnerVolumeSpecName "kube-api-access-dv8x5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 05:54:12 crc kubenswrapper[4930]: I1012 05:54:12.675045 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7e86f7a-5516-4516-b82a-29c8f4339cdf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7e86f7a-5516-4516-b82a-29c8f4339cdf" (UID: "a7e86f7a-5516-4516-b82a-29c8f4339cdf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 05:54:12 crc kubenswrapper[4930]: I1012 05:54:12.701214 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7e86f7a-5516-4516-b82a-29c8f4339cdf-utilities\") on node \"crc\" DevicePath \"\""
Oct 12 05:54:12 crc kubenswrapper[4930]: I1012 05:54:12.701249 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv8x5\" (UniqueName: \"kubernetes.io/projected/a7e86f7a-5516-4516-b82a-29c8f4339cdf-kube-api-access-dv8x5\") on node \"crc\" DevicePath \"\""
Oct 12 05:54:12 crc kubenswrapper[4930]: I1012 05:54:12.701260 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7e86f7a-5516-4516-b82a-29c8f4339cdf-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 12 05:54:13 crc kubenswrapper[4930]: I1012 05:54:13.353816 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4dfvt" event={"ID":"a7e86f7a-5516-4516-b82a-29c8f4339cdf","Type":"ContainerDied","Data":"059921041ca849703deed2426b127724881e847f7cb50ba9ca2f26b491679ac0"}
Oct 12 05:54:13 crc kubenswrapper[4930]: I1012 05:54:13.354297 4930 scope.go:117] "RemoveContainer" containerID="ed4ecca54c6fc3a5878a063686deadff250e4e1e9e38ee786986ae14f8e6a72c"
Need to start a new one" pod="openshift-marketplace/redhat-operators-4dfvt" Oct 12 05:54:13 crc kubenswrapper[4930]: I1012 05:54:13.397087 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4dfvt"] Oct 12 05:54:13 crc kubenswrapper[4930]: I1012 05:54:13.397481 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-qv5cn" Oct 12 05:54:13 crc kubenswrapper[4930]: I1012 05:54:13.399053 4930 scope.go:117] "RemoveContainer" containerID="6947d8b8b2195daa13c692c7213291ceec2670b5b25ee1b55a18262aa09e28a4" Oct 12 05:54:13 crc kubenswrapper[4930]: I1012 05:54:13.402985 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4dfvt"] Oct 12 05:54:13 crc kubenswrapper[4930]: I1012 05:54:13.427628 4930 scope.go:117] "RemoveContainer" containerID="9a9ee19a88e6e08deca8b639875792be41b4d61b6b7086b20c87a0088191a248" Oct 12 05:54:13 crc kubenswrapper[4930]: I1012 05:54:13.746920 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87"] Oct 12 05:54:13 crc kubenswrapper[4930]: E1012 05:54:13.747335 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e86f7a-5516-4516-b82a-29c8f4339cdf" containerName="extract-content" Oct 12 05:54:13 crc kubenswrapper[4930]: I1012 05:54:13.747363 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e86f7a-5516-4516-b82a-29c8f4339cdf" containerName="extract-content" Oct 12 05:54:13 crc kubenswrapper[4930]: E1012 05:54:13.747408 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e86f7a-5516-4516-b82a-29c8f4339cdf" containerName="extract-utilities" Oct 12 05:54:13 crc kubenswrapper[4930]: I1012 05:54:13.747422 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e86f7a-5516-4516-b82a-29c8f4339cdf" containerName="extract-utilities" Oct 12 05:54:13 crc kubenswrapper[4930]: E1012 05:54:13.747437 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e86f7a-5516-4516-b82a-29c8f4339cdf" containerName="registry-server" Oct 12 05:54:13 crc kubenswrapper[4930]: I1012 05:54:13.747449 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e86f7a-5516-4516-b82a-29c8f4339cdf" containerName="registry-server" Oct 12 05:54:13 crc kubenswrapper[4930]: E1012 05:54:13.747477 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6c53d81-c42a-42a8-9017-a68663e7934f" containerName="registry-server" Oct 12 05:54:13 crc kubenswrapper[4930]: I1012 05:54:13.747490 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6c53d81-c42a-42a8-9017-a68663e7934f" containerName="registry-server" Oct 12 05:54:13 crc kubenswrapper[4930]: I1012 05:54:13.747696 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6c53d81-c42a-42a8-9017-a68663e7934f" containerName="registry-server" Oct 12 05:54:13 crc kubenswrapper[4930]: I1012 05:54:13.747723 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7e86f7a-5516-4516-b82a-29c8f4339cdf" containerName="registry-server" Oct 12 05:54:13 crc kubenswrapper[4930]: I1012 05:54:13.749204 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87" Oct 12 05:54:13 crc kubenswrapper[4930]: I1012 05:54:13.753896 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-2c8bc" Oct 12 05:54:13 crc kubenswrapper[4930]: I1012 05:54:13.764633 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87"] Oct 12 05:54:13 crc kubenswrapper[4930]: I1012 05:54:13.919037 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84bb56e1-6ddf-439a-9889-f2e1973ad8fa-bundle\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87\" (UID: \"84bb56e1-6ddf-439a-9889-f2e1973ad8fa\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87" Oct 12 05:54:13 crc kubenswrapper[4930]: I1012 05:54:13.919476 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84bb56e1-6ddf-439a-9889-f2e1973ad8fa-util\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87\" (UID: \"84bb56e1-6ddf-439a-9889-f2e1973ad8fa\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87" Oct 12 05:54:13 crc kubenswrapper[4930]: I1012 05:54:13.919635 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzkqv\" (UniqueName: \"kubernetes.io/projected/84bb56e1-6ddf-439a-9889-f2e1973ad8fa-kube-api-access-lzkqv\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87\" (UID: \"84bb56e1-6ddf-439a-9889-f2e1973ad8fa\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87" Oct 12 05:54:14 crc kubenswrapper[4930]: I1012 05:54:14.020982 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84bb56e1-6ddf-439a-9889-f2e1973ad8fa-bundle\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87\" (UID: \"84bb56e1-6ddf-439a-9889-f2e1973ad8fa\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87" Oct 12 05:54:14 crc kubenswrapper[4930]: I1012 05:54:14.021083 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84bb56e1-6ddf-439a-9889-f2e1973ad8fa-util\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87\" (UID: \"84bb56e1-6ddf-439a-9889-f2e1973ad8fa\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87" Oct 12 05:54:14 crc kubenswrapper[4930]: I1012 05:54:14.021119 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzkqv\" (UniqueName: \"kubernetes.io/projected/84bb56e1-6ddf-439a-9889-f2e1973ad8fa-kube-api-access-lzkqv\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87\" (UID: \"84bb56e1-6ddf-439a-9889-f2e1973ad8fa\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87" Oct 12 05:54:14 crc kubenswrapper[4930]: I1012 05:54:14.021522 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/84bb56e1-6ddf-439a-9889-f2e1973ad8fa-bundle\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87\" (UID: \"84bb56e1-6ddf-439a-9889-f2e1973ad8fa\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87" Oct 12 05:54:14 crc kubenswrapper[4930]: I1012 05:54:14.021572 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84bb56e1-6ddf-439a-9889-f2e1973ad8fa-util\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87\" (UID: \"84bb56e1-6ddf-439a-9889-f2e1973ad8fa\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87" Oct 12 05:54:14 crc kubenswrapper[4930]: I1012 05:54:14.041689 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzkqv\" (UniqueName: \"kubernetes.io/projected/84bb56e1-6ddf-439a-9889-f2e1973ad8fa-kube-api-access-lzkqv\") pod \"bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87\" (UID: \"84bb56e1-6ddf-439a-9889-f2e1973ad8fa\") " pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87" Oct 12 05:54:14 crc kubenswrapper[4930]: I1012 05:54:14.081393 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87" Oct 12 05:54:14 crc kubenswrapper[4930]: I1012 05:54:14.147039 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7e86f7a-5516-4516-b82a-29c8f4339cdf" path="/var/lib/kubelet/pods/a7e86f7a-5516-4516-b82a-29c8f4339cdf/volumes" Oct 12 05:54:14 crc kubenswrapper[4930]: I1012 05:54:14.578932 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87"] Oct 12 05:54:15 crc kubenswrapper[4930]: I1012 05:54:15.373235 4930 generic.go:334] "Generic (PLEG): container finished" podID="84bb56e1-6ddf-439a-9889-f2e1973ad8fa" containerID="4c25ba259797d755d9958c7b407fba2ad19bcbb93ab4821c7501b5ebf11644d5" exitCode=0 Oct 12 05:54:15 crc kubenswrapper[4930]: I1012 05:54:15.373355 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87" event={"ID":"84bb56e1-6ddf-439a-9889-f2e1973ad8fa","Type":"ContainerDied","Data":"4c25ba259797d755d9958c7b407fba2ad19bcbb93ab4821c7501b5ebf11644d5"} Oct 12 05:54:15 crc kubenswrapper[4930]: I1012 05:54:15.373713 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87" event={"ID":"84bb56e1-6ddf-439a-9889-f2e1973ad8fa","Type":"ContainerStarted","Data":"5a327aff005650499edf594b689a20329890ed8442d3fb82db12fd0ae3705355"} Oct 12 05:54:17 crc kubenswrapper[4930]: I1012 05:54:17.391163 4930 generic.go:334] "Generic (PLEG): container finished" podID="84bb56e1-6ddf-439a-9889-f2e1973ad8fa" containerID="b9abc5f0ab12d67d589165fb44f5179047e65b3ed67f20bae436819e3cf610b8" exitCode=0 Oct 12 05:54:17 crc kubenswrapper[4930]: I1012 05:54:17.391251 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87" event={"ID":"84bb56e1-6ddf-439a-9889-f2e1973ad8fa","Type":"ContainerDied","Data":"b9abc5f0ab12d67d589165fb44f5179047e65b3ed67f20bae436819e3cf610b8"} Oct 12 05:54:18 crc kubenswrapper[4930]: I1012 05:54:18.415364 4930 
generic.go:334] "Generic (PLEG): container finished" podID="84bb56e1-6ddf-439a-9889-f2e1973ad8fa" containerID="162100e7dcc422a324f7abbaa07b30cf42452a179cb56b726da11bb43783facb" exitCode=0 Oct 12 05:54:18 crc kubenswrapper[4930]: I1012 05:54:18.415451 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87" event={"ID":"84bb56e1-6ddf-439a-9889-f2e1973ad8fa","Type":"ContainerDied","Data":"162100e7dcc422a324f7abbaa07b30cf42452a179cb56b726da11bb43783facb"} Oct 12 05:54:19 crc kubenswrapper[4930]: I1012 05:54:19.861517 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87" Oct 12 05:54:20 crc kubenswrapper[4930]: I1012 05:54:20.015159 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzkqv\" (UniqueName: \"kubernetes.io/projected/84bb56e1-6ddf-439a-9889-f2e1973ad8fa-kube-api-access-lzkqv\") pod \"84bb56e1-6ddf-439a-9889-f2e1973ad8fa\" (UID: \"84bb56e1-6ddf-439a-9889-f2e1973ad8fa\") " Oct 12 05:54:20 crc kubenswrapper[4930]: I1012 05:54:20.015689 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84bb56e1-6ddf-439a-9889-f2e1973ad8fa-util\") pod \"84bb56e1-6ddf-439a-9889-f2e1973ad8fa\" (UID: \"84bb56e1-6ddf-439a-9889-f2e1973ad8fa\") " Oct 12 05:54:20 crc kubenswrapper[4930]: I1012 05:54:20.015781 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84bb56e1-6ddf-439a-9889-f2e1973ad8fa-bundle\") pod \"84bb56e1-6ddf-439a-9889-f2e1973ad8fa\" (UID: \"84bb56e1-6ddf-439a-9889-f2e1973ad8fa\") " Oct 12 05:54:20 crc kubenswrapper[4930]: I1012 05:54:20.017383 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84bb56e1-6ddf-439a-9889-f2e1973ad8fa-bundle" (OuterVolumeSpecName: "bundle") pod "84bb56e1-6ddf-439a-9889-f2e1973ad8fa" (UID: "84bb56e1-6ddf-439a-9889-f2e1973ad8fa"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:54:20 crc kubenswrapper[4930]: I1012 05:54:20.026413 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84bb56e1-6ddf-439a-9889-f2e1973ad8fa-kube-api-access-lzkqv" (OuterVolumeSpecName: "kube-api-access-lzkqv") pod "84bb56e1-6ddf-439a-9889-f2e1973ad8fa" (UID: "84bb56e1-6ddf-439a-9889-f2e1973ad8fa"). InnerVolumeSpecName "kube-api-access-lzkqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:54:20 crc kubenswrapper[4930]: I1012 05:54:20.117207 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzkqv\" (UniqueName: \"kubernetes.io/projected/84bb56e1-6ddf-439a-9889-f2e1973ad8fa-kube-api-access-lzkqv\") on node \"crc\" DevicePath \"\"" Oct 12 05:54:20 crc kubenswrapper[4930]: I1012 05:54:20.117260 4930 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84bb56e1-6ddf-439a-9889-f2e1973ad8fa-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:54:20 crc kubenswrapper[4930]: I1012 05:54:20.300554 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84bb56e1-6ddf-439a-9889-f2e1973ad8fa-util" (OuterVolumeSpecName: "util") pod "84bb56e1-6ddf-439a-9889-f2e1973ad8fa" (UID: "84bb56e1-6ddf-439a-9889-f2e1973ad8fa"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:54:20 crc kubenswrapper[4930]: I1012 05:54:20.320072 4930 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84bb56e1-6ddf-439a-9889-f2e1973ad8fa-util\") on node \"crc\" DevicePath \"\"" Oct 12 05:54:20 crc kubenswrapper[4930]: I1012 05:54:20.432290 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87" event={"ID":"84bb56e1-6ddf-439a-9889-f2e1973ad8fa","Type":"ContainerDied","Data":"5a327aff005650499edf594b689a20329890ed8442d3fb82db12fd0ae3705355"} Oct 12 05:54:20 crc kubenswrapper[4930]: I1012 05:54:20.432329 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a327aff005650499edf594b689a20329890ed8442d3fb82db12fd0ae3705355" Oct 12 05:54:20 crc kubenswrapper[4930]: I1012 05:54:20.432636 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87" Oct 12 05:54:24 crc kubenswrapper[4930]: I1012 05:54:24.909761 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cmhjh"] Oct 12 05:54:24 crc kubenswrapper[4930]: E1012 05:54:24.910921 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84bb56e1-6ddf-439a-9889-f2e1973ad8fa" containerName="extract" Oct 12 05:54:24 crc kubenswrapper[4930]: I1012 05:54:24.910944 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="84bb56e1-6ddf-439a-9889-f2e1973ad8fa" containerName="extract" Oct 12 05:54:24 crc kubenswrapper[4930]: E1012 05:54:24.910976 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84bb56e1-6ddf-439a-9889-f2e1973ad8fa" containerName="pull" Oct 12 05:54:24 crc kubenswrapper[4930]: I1012 05:54:24.910988 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="84bb56e1-6ddf-439a-9889-f2e1973ad8fa" containerName="pull" Oct 12 05:54:24 crc kubenswrapper[4930]: E1012 05:54:24.911012 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84bb56e1-6ddf-439a-9889-f2e1973ad8fa" containerName="util" Oct 12 05:54:24 crc kubenswrapper[4930]: I1012 05:54:24.911024 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="84bb56e1-6ddf-439a-9889-f2e1973ad8fa" containerName="util" Oct 12 05:54:24 crc kubenswrapper[4930]: I1012 05:54:24.911254 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="84bb56e1-6ddf-439a-9889-f2e1973ad8fa" containerName="extract" Oct 12 05:54:24 crc kubenswrapper[4930]: I1012 05:54:24.912816 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cmhjh" Oct 12 05:54:24 crc kubenswrapper[4930]: I1012 05:54:24.923425 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cmhjh"] Oct 12 05:54:25 crc kubenswrapper[4930]: I1012 05:54:25.006834 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f4ceee7-2ffd-4323-8f24-56522dcf260d-catalog-content\") pod \"certified-operators-cmhjh\" (UID: \"5f4ceee7-2ffd-4323-8f24-56522dcf260d\") " pod="openshift-marketplace/certified-operators-cmhjh" Oct 12 05:54:25 crc kubenswrapper[4930]: I1012 05:54:25.006973 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f4ceee7-2ffd-4323-8f24-56522dcf260d-utilities\") pod \"certified-operators-cmhjh\" (UID: \"5f4ceee7-2ffd-4323-8f24-56522dcf260d\") " pod="openshift-marketplace/certified-operators-cmhjh" Oct 12 05:54:25 crc kubenswrapper[4930]: I1012 05:54:25.007043 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lht9t\" (UniqueName: \"kubernetes.io/projected/5f4ceee7-2ffd-4323-8f24-56522dcf260d-kube-api-access-lht9t\") pod \"certified-operators-cmhjh\" (UID: \"5f4ceee7-2ffd-4323-8f24-56522dcf260d\") " pod="openshift-marketplace/certified-operators-cmhjh" Oct 12 05:54:25 crc kubenswrapper[4930]: I1012 05:54:25.108848 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lht9t\" (UniqueName: \"kubernetes.io/projected/5f4ceee7-2ffd-4323-8f24-56522dcf260d-kube-api-access-lht9t\") pod \"certified-operators-cmhjh\" (UID: 
\"5f4ceee7-2ffd-4323-8f24-56522dcf260d\") " pod="openshift-marketplace/certified-operators-cmhjh" Oct 12 05:54:25 crc kubenswrapper[4930]: I1012 05:54:25.108952 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f4ceee7-2ffd-4323-8f24-56522dcf260d-catalog-content\") pod \"certified-operators-cmhjh\" (UID: \"5f4ceee7-2ffd-4323-8f24-56522dcf260d\") " pod="openshift-marketplace/certified-operators-cmhjh" Oct 12 05:54:25 crc kubenswrapper[4930]: I1012 05:54:25.109071 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f4ceee7-2ffd-4323-8f24-56522dcf260d-utilities\") pod \"certified-operators-cmhjh\" (UID: \"5f4ceee7-2ffd-4323-8f24-56522dcf260d\") " pod="openshift-marketplace/certified-operators-cmhjh" Oct 12 05:54:25 crc kubenswrapper[4930]: I1012 05:54:25.110068 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f4ceee7-2ffd-4323-8f24-56522dcf260d-catalog-content\") pod \"certified-operators-cmhjh\" (UID: \"5f4ceee7-2ffd-4323-8f24-56522dcf260d\") " pod="openshift-marketplace/certified-operators-cmhjh" Oct 12 05:54:25 crc kubenswrapper[4930]: I1012 05:54:25.110114 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f4ceee7-2ffd-4323-8f24-56522dcf260d-utilities\") pod \"certified-operators-cmhjh\" (UID: \"5f4ceee7-2ffd-4323-8f24-56522dcf260d\") " pod="openshift-marketplace/certified-operators-cmhjh" Oct 12 05:54:25 crc kubenswrapper[4930]: I1012 05:54:25.135669 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lht9t\" (UniqueName: \"kubernetes.io/projected/5f4ceee7-2ffd-4323-8f24-56522dcf260d-kube-api-access-lht9t\") pod \"certified-operators-cmhjh\" (UID: \"5f4ceee7-2ffd-4323-8f24-56522dcf260d\") " pod="openshift-marketplace/certified-operators-cmhjh" Oct 12 05:54:25 crc kubenswrapper[4930]: I1012 05:54:25.202461 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-688d597459-t5cps"] Oct 12 05:54:25 crc kubenswrapper[4930]: I1012 05:54:25.203649 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-688d597459-t5cps" Oct 12 05:54:25 crc kubenswrapper[4930]: I1012 05:54:25.206241 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-jr82p" Oct 12 05:54:25 crc kubenswrapper[4930]: I1012 05:54:25.235463 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-688d597459-t5cps"] Oct 12 05:54:25 crc kubenswrapper[4930]: I1012 05:54:25.241543 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cmhjh" Oct 12 05:54:25 crc kubenswrapper[4930]: I1012 05:54:25.311976 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l28s\" (UniqueName: \"kubernetes.io/projected/d0de84bb-9b0d-4552-887d-1da3d50467e2-kube-api-access-6l28s\") pod \"openstack-operator-controller-operator-688d597459-t5cps\" (UID: \"d0de84bb-9b0d-4552-887d-1da3d50467e2\") " pod="openstack-operators/openstack-operator-controller-operator-688d597459-t5cps" Oct 12 05:54:25 crc kubenswrapper[4930]: I1012 05:54:25.413403 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l28s\" (UniqueName: \"kubernetes.io/projected/d0de84bb-9b0d-4552-887d-1da3d50467e2-kube-api-access-6l28s\") pod \"openstack-operator-controller-operator-688d597459-t5cps\" (UID: \"d0de84bb-9b0d-4552-887d-1da3d50467e2\") " pod="openstack-operators/openstack-operator-controller-operator-688d597459-t5cps" Oct 12 05:54:25 crc kubenswrapper[4930]: I1012 05:54:25.437729 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l28s\" (UniqueName: \"kubernetes.io/projected/d0de84bb-9b0d-4552-887d-1da3d50467e2-kube-api-access-6l28s\") pod \"openstack-operator-controller-operator-688d597459-t5cps\" (UID: \"d0de84bb-9b0d-4552-887d-1da3d50467e2\") " pod="openstack-operators/openstack-operator-controller-operator-688d597459-t5cps" Oct 12 05:54:25 crc kubenswrapper[4930]: I1012 05:54:25.523201 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-688d597459-t5cps" Oct 12 05:54:25 crc kubenswrapper[4930]: I1012 05:54:25.696903 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cmhjh"] Oct 12 05:54:25 crc kubenswrapper[4930]: W1012 05:54:25.703784 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f4ceee7_2ffd_4323_8f24_56522dcf260d.slice/crio-bd873a4101f105c63c06d44d507c44335377eb02ef0a657ddf8a574950e97b43 WatchSource:0}: Error finding container bd873a4101f105c63c06d44d507c44335377eb02ef0a657ddf8a574950e97b43: Status 404 returned error can't find the container with id bd873a4101f105c63c06d44d507c44335377eb02ef0a657ddf8a574950e97b43 Oct 12 05:54:26 crc kubenswrapper[4930]: I1012 05:54:26.015551 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-688d597459-t5cps"] Oct 12 05:54:26 crc kubenswrapper[4930]: W1012 05:54:26.019543 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0de84bb_9b0d_4552_887d_1da3d50467e2.slice/crio-535884da959244b2e541de59f54faa2d3e773d7dd7ca0ae1b70fa587afd9a1ba WatchSource:0}: Error finding container 535884da959244b2e541de59f54faa2d3e773d7dd7ca0ae1b70fa587afd9a1ba: Status 404 returned error can't find the container with id 535884da959244b2e541de59f54faa2d3e773d7dd7ca0ae1b70fa587afd9a1ba Oct 12 05:54:26 crc kubenswrapper[4930]: I1012 05:54:26.131456 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2h4bm"] Oct 12 05:54:26 crc kubenswrapper[4930]: I1012 05:54:26.132952 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2h4bm" Oct 12 05:54:26 crc kubenswrapper[4930]: I1012 05:54:26.189153 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2h4bm"] Oct 12 05:54:26 crc kubenswrapper[4930]: I1012 05:54:26.225391 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/933e9f85-8241-4fae-95f6-e73fe1d5ef12-utilities\") pod \"redhat-marketplace-2h4bm\" (UID: \"933e9f85-8241-4fae-95f6-e73fe1d5ef12\") " pod="openshift-marketplace/redhat-marketplace-2h4bm" Oct 12 05:54:26 crc kubenswrapper[4930]: I1012 05:54:26.225494 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/933e9f85-8241-4fae-95f6-e73fe1d5ef12-catalog-content\") pod \"redhat-marketplace-2h4bm\" (UID: \"933e9f85-8241-4fae-95f6-e73fe1d5ef12\") " pod="openshift-marketplace/redhat-marketplace-2h4bm" Oct 12 05:54:26 crc kubenswrapper[4930]: I1012 05:54:26.225594 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppqzf\" (UniqueName: \"kubernetes.io/projected/933e9f85-8241-4fae-95f6-e73fe1d5ef12-kube-api-access-ppqzf\") pod \"redhat-marketplace-2h4bm\" (UID: \"933e9f85-8241-4fae-95f6-e73fe1d5ef12\") " pod="openshift-marketplace/redhat-marketplace-2h4bm" Oct 12 05:54:26 crc kubenswrapper[4930]: I1012 05:54:26.326485 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppqzf\" (UniqueName: \"kubernetes.io/projected/933e9f85-8241-4fae-95f6-e73fe1d5ef12-kube-api-access-ppqzf\") pod \"redhat-marketplace-2h4bm\" (UID: \"933e9f85-8241-4fae-95f6-e73fe1d5ef12\") " pod="openshift-marketplace/redhat-marketplace-2h4bm" Oct 12 05:54:26 crc kubenswrapper[4930]: I1012 05:54:26.326539 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/933e9f85-8241-4fae-95f6-e73fe1d5ef12-utilities\") pod \"redhat-marketplace-2h4bm\" (UID: \"933e9f85-8241-4fae-95f6-e73fe1d5ef12\") " pod="openshift-marketplace/redhat-marketplace-2h4bm" Oct 12 05:54:26 crc kubenswrapper[4930]: I1012 05:54:26.326584 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/933e9f85-8241-4fae-95f6-e73fe1d5ef12-catalog-content\") pod \"redhat-marketplace-2h4bm\" (UID: \"933e9f85-8241-4fae-95f6-e73fe1d5ef12\") " pod="openshift-marketplace/redhat-marketplace-2h4bm" Oct 12 05:54:26 crc kubenswrapper[4930]: I1012 05:54:26.327088 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/933e9f85-8241-4fae-95f6-e73fe1d5ef12-catalog-content\") pod \"redhat-marketplace-2h4bm\" (UID: \"933e9f85-8241-4fae-95f6-e73fe1d5ef12\") " pod="openshift-marketplace/redhat-marketplace-2h4bm" Oct 12 05:54:26 crc kubenswrapper[4930]: I1012 05:54:26.327152 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/933e9f85-8241-4fae-95f6-e73fe1d5ef12-utilities\") pod \"redhat-marketplace-2h4bm\" (UID: \"933e9f85-8241-4fae-95f6-e73fe1d5ef12\") " pod="openshift-marketplace/redhat-marketplace-2h4bm" Oct 12 05:54:26 crc kubenswrapper[4930]: I1012 05:54:26.344395 4930 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ppqzf\" (UniqueName: \"kubernetes.io/projected/933e9f85-8241-4fae-95f6-e73fe1d5ef12-kube-api-access-ppqzf\") pod \"redhat-marketplace-2h4bm\" (UID: \"933e9f85-8241-4fae-95f6-e73fe1d5ef12\") " pod="openshift-marketplace/redhat-marketplace-2h4bm" Oct 12 05:54:26 crc kubenswrapper[4930]: I1012 05:54:26.468680 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2h4bm" Oct 12 05:54:26 crc kubenswrapper[4930]: I1012 05:54:26.505454 4930 generic.go:334] "Generic (PLEG): container finished" podID="5f4ceee7-2ffd-4323-8f24-56522dcf260d" containerID="9a506e4ff582853f81c84116e4da101a0e014d41ed801320d9989923a33e7a0c" exitCode=0 Oct 12 05:54:26 crc kubenswrapper[4930]: I1012 05:54:26.505530 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmhjh" event={"ID":"5f4ceee7-2ffd-4323-8f24-56522dcf260d","Type":"ContainerDied","Data":"9a506e4ff582853f81c84116e4da101a0e014d41ed801320d9989923a33e7a0c"} Oct 12 05:54:26 crc kubenswrapper[4930]: I1012 05:54:26.505954 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmhjh" event={"ID":"5f4ceee7-2ffd-4323-8f24-56522dcf260d","Type":"ContainerStarted","Data":"bd873a4101f105c63c06d44d507c44335377eb02ef0a657ddf8a574950e97b43"} Oct 12 05:54:26 crc kubenswrapper[4930]: I1012 05:54:26.508446 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-688d597459-t5cps" event={"ID":"d0de84bb-9b0d-4552-887d-1da3d50467e2","Type":"ContainerStarted","Data":"535884da959244b2e541de59f54faa2d3e773d7dd7ca0ae1b70fa587afd9a1ba"} Oct 12 05:54:26 crc kubenswrapper[4930]: I1012 05:54:26.732541 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2h4bm"] Oct 12 05:54:27 crc kubenswrapper[4930]: I1012 05:54:27.518360 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmhjh" event={"ID":"5f4ceee7-2ffd-4323-8f24-56522dcf260d","Type":"ContainerStarted","Data":"773d574071ae9cf1f4840958e940ae9628e520d684b759cc5dd829f3f30b4a0c"} Oct 12 05:54:27 crc kubenswrapper[4930]: I1012 05:54:27.526185 4930 generic.go:334] "Generic (PLEG): container finished" podID="933e9f85-8241-4fae-95f6-e73fe1d5ef12" containerID="273905afe8e8525244809c821347adb82203cdebe5f154bb5f5965189e1ce4a8" exitCode=0 Oct 12 05:54:27 crc kubenswrapper[4930]: I1012 05:54:27.526235 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2h4bm" event={"ID":"933e9f85-8241-4fae-95f6-e73fe1d5ef12","Type":"ContainerDied","Data":"273905afe8e8525244809c821347adb82203cdebe5f154bb5f5965189e1ce4a8"} Oct 12 05:54:27 crc kubenswrapper[4930]: I1012 05:54:27.526261 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2h4bm" event={"ID":"933e9f85-8241-4fae-95f6-e73fe1d5ef12","Type":"ContainerStarted","Data":"b55365591bb70c88e4bbd263b9349a62f0600c7c0e31f48f0cc4ab30f29f30af"} Oct 12 05:54:28 crc kubenswrapper[4930]: I1012 05:54:28.534182 4930 generic.go:334] "Generic (PLEG): container finished" podID="5f4ceee7-2ffd-4323-8f24-56522dcf260d" containerID="773d574071ae9cf1f4840958e940ae9628e520d684b759cc5dd829f3f30b4a0c" exitCode=0 Oct 12 05:54:28 crc kubenswrapper[4930]: I1012 05:54:28.534230 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmhjh" 
event={"ID":"5f4ceee7-2ffd-4323-8f24-56522dcf260d","Type":"ContainerDied","Data":"773d574071ae9cf1f4840958e940ae9628e520d684b759cc5dd829f3f30b4a0c"} Oct 12 05:54:29 crc kubenswrapper[4930]: I1012 05:54:29.309443 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5tbxq"] Oct 12 05:54:29 crc kubenswrapper[4930]: I1012 05:54:29.314010 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5tbxq" Oct 12 05:54:29 crc kubenswrapper[4930]: I1012 05:54:29.322042 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5tbxq"] Oct 12 05:54:29 crc kubenswrapper[4930]: I1012 05:54:29.374860 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt5vq\" (UniqueName: \"kubernetes.io/projected/1b3e7cf8-66dd-4666-b908-c902e809897d-kube-api-access-rt5vq\") pod \"community-operators-5tbxq\" (UID: \"1b3e7cf8-66dd-4666-b908-c902e809897d\") " pod="openshift-marketplace/community-operators-5tbxq" Oct 12 05:54:29 crc kubenswrapper[4930]: I1012 05:54:29.374925 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b3e7cf8-66dd-4666-b908-c902e809897d-catalog-content\") pod \"community-operators-5tbxq\" (UID: \"1b3e7cf8-66dd-4666-b908-c902e809897d\") " pod="openshift-marketplace/community-operators-5tbxq" Oct 12 05:54:29 crc kubenswrapper[4930]: I1012 05:54:29.374989 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b3e7cf8-66dd-4666-b908-c902e809897d-utilities\") pod \"community-operators-5tbxq\" (UID: \"1b3e7cf8-66dd-4666-b908-c902e809897d\") " pod="openshift-marketplace/community-operators-5tbxq" Oct 12 05:54:29 crc kubenswrapper[4930]: I1012 05:54:29.476977 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b3e7cf8-66dd-4666-b908-c902e809897d-catalog-content\") pod \"community-operators-5tbxq\" (UID: \"1b3e7cf8-66dd-4666-b908-c902e809897d\") " pod="openshift-marketplace/community-operators-5tbxq" Oct 12 05:54:29 crc kubenswrapper[4930]: I1012 05:54:29.477546 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b3e7cf8-66dd-4666-b908-c902e809897d-utilities\") pod \"community-operators-5tbxq\" (UID: \"1b3e7cf8-66dd-4666-b908-c902e809897d\") " pod="openshift-marketplace/community-operators-5tbxq" Oct 12 05:54:29 crc kubenswrapper[4930]: I1012 05:54:29.477617 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt5vq\" (UniqueName: \"kubernetes.io/projected/1b3e7cf8-66dd-4666-b908-c902e809897d-kube-api-access-rt5vq\") pod \"community-operators-5tbxq\" (UID: \"1b3e7cf8-66dd-4666-b908-c902e809897d\") " pod="openshift-marketplace/community-operators-5tbxq" Oct 12 05:54:29 crc kubenswrapper[4930]: I1012 05:54:29.477643 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b3e7cf8-66dd-4666-b908-c902e809897d-catalog-content\") pod \"community-operators-5tbxq\" (UID: \"1b3e7cf8-66dd-4666-b908-c902e809897d\") " pod="openshift-marketplace/community-operators-5tbxq" Oct 12 05:54:29 crc kubenswrapper[4930]: 
I1012 05:54:29.478527 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b3e7cf8-66dd-4666-b908-c902e809897d-utilities\") pod \"community-operators-5tbxq\" (UID: \"1b3e7cf8-66dd-4666-b908-c902e809897d\") " pod="openshift-marketplace/community-operators-5tbxq" Oct 12 05:54:29 crc kubenswrapper[4930]: I1012 05:54:29.498900 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt5vq\" (UniqueName: \"kubernetes.io/projected/1b3e7cf8-66dd-4666-b908-c902e809897d-kube-api-access-rt5vq\") pod \"community-operators-5tbxq\" (UID: \"1b3e7cf8-66dd-4666-b908-c902e809897d\") " pod="openshift-marketplace/community-operators-5tbxq" Oct 12 05:54:29 crc kubenswrapper[4930]: I1012 05:54:29.644936 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5tbxq" Oct 12 05:54:31 crc kubenswrapper[4930]: I1012 05:54:31.561557 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-688d597459-t5cps" event={"ID":"d0de84bb-9b0d-4552-887d-1da3d50467e2","Type":"ContainerStarted","Data":"ee55122413c114c7bbac51a5ff560ad36cf5110646b7fc6848969ac4d65a6396"} Oct 12 05:54:31 crc kubenswrapper[4930]: I1012 05:54:31.566705 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmhjh" event={"ID":"5f4ceee7-2ffd-4323-8f24-56522dcf260d","Type":"ContainerStarted","Data":"6ce111130db665fe672eb681d527017d8451763a0a2a260d9aab4de44d7f837d"} Oct 12 05:54:31 crc kubenswrapper[4930]: I1012 05:54:31.569182 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2h4bm" event={"ID":"933e9f85-8241-4fae-95f6-e73fe1d5ef12","Type":"ContainerStarted","Data":"222f1699985bfde9cb349b2b5b68d72b864b4fbd9cbbedc8071854713665068e"} Oct 12 05:54:31 crc kubenswrapper[4930]: I1012 05:54:31.594888 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cmhjh" podStartSLOduration=2.850602451 podStartE2EDuration="7.594872095s" podCreationTimestamp="2025-10-12 05:54:24 +0000 UTC" firstStartedPulling="2025-10-12 05:54:26.507310208 +0000 UTC m=+799.049411983" lastFinishedPulling="2025-10-12 05:54:31.251579852 +0000 UTC m=+803.793681627" observedRunningTime="2025-10-12 05:54:31.594492765 +0000 UTC m=+804.136594540" watchObservedRunningTime="2025-10-12 05:54:31.594872095 +0000 UTC m=+804.136973860" Oct 12 05:54:31 crc kubenswrapper[4930]: I1012 05:54:31.661730 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5tbxq"] Oct 12 05:54:31 crc kubenswrapper[4930]: W1012 05:54:31.687647 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b3e7cf8_66dd_4666_b908_c902e809897d.slice/crio-72bcefe83cf6b01570934154d9b82e54a1a3e070b4a9072dd000f5f55a9fa7a0 WatchSource:0}: Error finding container 72bcefe83cf6b01570934154d9b82e54a1a3e070b4a9072dd000f5f55a9fa7a0: Status 404 returned error can't find the container with id 72bcefe83cf6b01570934154d9b82e54a1a3e070b4a9072dd000f5f55a9fa7a0 Oct 12 05:54:32 crc kubenswrapper[4930]: I1012 05:54:32.586030 4930 generic.go:334] "Generic (PLEG): container finished" podID="1b3e7cf8-66dd-4666-b908-c902e809897d" containerID="01cadd9a4da44f9444f87664b61d1e3725d2bafa76e4374f7cec643e7d401229" exitCode=0 Oct 12 05:54:32 crc 
kubenswrapper[4930]: I1012 05:54:32.586567 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tbxq" event={"ID":"1b3e7cf8-66dd-4666-b908-c902e809897d","Type":"ContainerDied","Data":"01cadd9a4da44f9444f87664b61d1e3725d2bafa76e4374f7cec643e7d401229"} Oct 12 05:54:32 crc kubenswrapper[4930]: I1012 05:54:32.586616 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tbxq" event={"ID":"1b3e7cf8-66dd-4666-b908-c902e809897d","Type":"ContainerStarted","Data":"72bcefe83cf6b01570934154d9b82e54a1a3e070b4a9072dd000f5f55a9fa7a0"} Oct 12 05:54:32 crc kubenswrapper[4930]: I1012 05:54:32.588765 4930 generic.go:334] "Generic (PLEG): container finished" podID="933e9f85-8241-4fae-95f6-e73fe1d5ef12" containerID="222f1699985bfde9cb349b2b5b68d72b864b4fbd9cbbedc8071854713665068e" exitCode=0 Oct 12 05:54:32 crc kubenswrapper[4930]: I1012 05:54:32.588873 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2h4bm" event={"ID":"933e9f85-8241-4fae-95f6-e73fe1d5ef12","Type":"ContainerDied","Data":"222f1699985bfde9cb349b2b5b68d72b864b4fbd9cbbedc8071854713665068e"} Oct 12 05:54:34 crc kubenswrapper[4930]: I1012 05:54:34.606547 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2h4bm" event={"ID":"933e9f85-8241-4fae-95f6-e73fe1d5ef12","Type":"ContainerStarted","Data":"4c8dabfbc56bfec1a2a19e8e418bc8ef7d4fc3a46482d596ad0ed3cdebbe49cb"} Oct 12 05:54:34 crc kubenswrapper[4930]: I1012 05:54:34.610320 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-688d597459-t5cps" event={"ID":"d0de84bb-9b0d-4552-887d-1da3d50467e2","Type":"ContainerStarted","Data":"ecce488d837c2f0bc595f5620024b2fbcb707e8014726147bdee2f7adfb3467f"} Oct 12 05:54:34 crc kubenswrapper[4930]: I1012 05:54:34.611351 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-688d597459-t5cps" Oct 12 05:54:34 crc kubenswrapper[4930]: I1012 05:54:34.613277 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tbxq" event={"ID":"1b3e7cf8-66dd-4666-b908-c902e809897d","Type":"ContainerStarted","Data":"2d9288fdbeaf63639f03dd055301f13ecf2fb836b25ded4e4b6bd30ba4e4603b"} Oct 12 05:54:34 crc kubenswrapper[4930]: I1012 05:54:34.686334 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2h4bm" podStartSLOduration=2.08493773 podStartE2EDuration="8.686316847s" podCreationTimestamp="2025-10-12 05:54:26 +0000 UTC" firstStartedPulling="2025-10-12 05:54:27.527634887 +0000 UTC m=+800.069736652" lastFinishedPulling="2025-10-12 05:54:34.129014004 +0000 UTC m=+806.671115769" observedRunningTime="2025-10-12 05:54:34.685342482 +0000 UTC m=+807.227444257" watchObservedRunningTime="2025-10-12 05:54:34.686316847 +0000 UTC m=+807.228418632" Oct 12 05:54:34 crc kubenswrapper[4930]: I1012 05:54:34.722702 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-688d597459-t5cps" podStartSLOduration=1.614346228 podStartE2EDuration="9.722566658s" podCreationTimestamp="2025-10-12 05:54:25 +0000 UTC" firstStartedPulling="2025-10-12 05:54:26.02180777 +0000 UTC m=+798.563909535" lastFinishedPulling="2025-10-12 05:54:34.13002819 +0000 UTC m=+806.672129965" 
observedRunningTime="2025-10-12 05:54:34.715681943 +0000 UTC m=+807.257783728" watchObservedRunningTime="2025-10-12 05:54:34.722566658 +0000 UTC m=+807.264668433" Oct 12 05:54:35 crc kubenswrapper[4930]: I1012 05:54:35.242648 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cmhjh" Oct 12 05:54:35 crc kubenswrapper[4930]: I1012 05:54:35.243229 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cmhjh" Oct 12 05:54:35 crc kubenswrapper[4930]: I1012 05:54:35.294352 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cmhjh" Oct 12 05:54:35 crc kubenswrapper[4930]: I1012 05:54:35.625987 4930 generic.go:334] "Generic (PLEG): container finished" podID="1b3e7cf8-66dd-4666-b908-c902e809897d" containerID="2d9288fdbeaf63639f03dd055301f13ecf2fb836b25ded4e4b6bd30ba4e4603b" exitCode=0 Oct 12 05:54:35 crc kubenswrapper[4930]: I1012 05:54:35.626050 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tbxq" event={"ID":"1b3e7cf8-66dd-4666-b908-c902e809897d","Type":"ContainerDied","Data":"2d9288fdbeaf63639f03dd055301f13ecf2fb836b25ded4e4b6bd30ba4e4603b"} Oct 12 05:54:36 crc kubenswrapper[4930]: I1012 05:54:36.469775 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2h4bm" Oct 12 05:54:36 crc kubenswrapper[4930]: I1012 05:54:36.469837 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2h4bm" Oct 12 05:54:36 crc kubenswrapper[4930]: I1012 05:54:36.556303 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2h4bm" Oct 12 05:54:36 crc kubenswrapper[4930]: I1012 05:54:36.646893 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tbxq" event={"ID":"1b3e7cf8-66dd-4666-b908-c902e809897d","Type":"ContainerStarted","Data":"dbfef3690429662cf59e5a4d4812bd24636bc3140a5dd9aaf653610e5828688f"} Oct 12 05:54:36 crc kubenswrapper[4930]: I1012 05:54:36.651147 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-688d597459-t5cps" Oct 12 05:54:36 crc kubenswrapper[4930]: I1012 05:54:36.677786 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5tbxq" podStartSLOduration=4.198098867 podStartE2EDuration="7.677728003s" podCreationTimestamp="2025-10-12 05:54:29 +0000 UTC" firstStartedPulling="2025-10-12 05:54:32.879016438 +0000 UTC m=+805.421118203" lastFinishedPulling="2025-10-12 05:54:36.358645534 +0000 UTC m=+808.900747339" observedRunningTime="2025-10-12 05:54:36.671132195 +0000 UTC m=+809.213233970" watchObservedRunningTime="2025-10-12 05:54:36.677728003 +0000 UTC m=+809.219829808" Oct 12 05:54:36 crc kubenswrapper[4930]: I1012 05:54:36.729360 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cmhjh" Oct 12 05:54:39 crc kubenswrapper[4930]: I1012 05:54:39.645538 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5tbxq" Oct 12 05:54:39 crc kubenswrapper[4930]: I1012 05:54:39.646019 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-5tbxq" Oct 12 05:54:39 crc kubenswrapper[4930]: I1012 05:54:39.713929 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5tbxq" Oct 12 05:54:41 crc kubenswrapper[4930]: I1012 05:54:41.094581 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cmhjh"] Oct 12 05:54:41 crc kubenswrapper[4930]: I1012 05:54:41.094820 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cmhjh" podUID="5f4ceee7-2ffd-4323-8f24-56522dcf260d" containerName="registry-server" containerID="cri-o://6ce111130db665fe672eb681d527017d8451763a0a2a260d9aab4de44d7f837d" gracePeriod=2 Oct 12 05:54:41 crc kubenswrapper[4930]: I1012 05:54:41.732322 4930 generic.go:334] "Generic (PLEG): container finished" podID="5f4ceee7-2ffd-4323-8f24-56522dcf260d" containerID="6ce111130db665fe672eb681d527017d8451763a0a2a260d9aab4de44d7f837d" exitCode=0 Oct 12 05:54:41 crc kubenswrapper[4930]: I1012 05:54:41.732399 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmhjh" event={"ID":"5f4ceee7-2ffd-4323-8f24-56522dcf260d","Type":"ContainerDied","Data":"6ce111130db665fe672eb681d527017d8451763a0a2a260d9aab4de44d7f837d"} Oct 12 05:54:42 crc kubenswrapper[4930]: I1012 05:54:42.121521 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cmhjh" Oct 12 05:54:42 crc kubenswrapper[4930]: I1012 05:54:42.297863 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f4ceee7-2ffd-4323-8f24-56522dcf260d-utilities\") pod \"5f4ceee7-2ffd-4323-8f24-56522dcf260d\" (UID: \"5f4ceee7-2ffd-4323-8f24-56522dcf260d\") " Oct 12 05:54:42 crc kubenswrapper[4930]: I1012 05:54:42.297925 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f4ceee7-2ffd-4323-8f24-56522dcf260d-catalog-content\") pod \"5f4ceee7-2ffd-4323-8f24-56522dcf260d\" (UID: \"5f4ceee7-2ffd-4323-8f24-56522dcf260d\") " Oct 12 05:54:42 crc kubenswrapper[4930]: I1012 05:54:42.297950 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lht9t\" (UniqueName: \"kubernetes.io/projected/5f4ceee7-2ffd-4323-8f24-56522dcf260d-kube-api-access-lht9t\") pod \"5f4ceee7-2ffd-4323-8f24-56522dcf260d\" (UID: \"5f4ceee7-2ffd-4323-8f24-56522dcf260d\") " Oct 12 05:54:42 crc kubenswrapper[4930]: I1012 05:54:42.307758 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f4ceee7-2ffd-4323-8f24-56522dcf260d-kube-api-access-lht9t" (OuterVolumeSpecName: "kube-api-access-lht9t") pod "5f4ceee7-2ffd-4323-8f24-56522dcf260d" (UID: "5f4ceee7-2ffd-4323-8f24-56522dcf260d"). InnerVolumeSpecName "kube-api-access-lht9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:54:42 crc kubenswrapper[4930]: I1012 05:54:42.310447 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f4ceee7-2ffd-4323-8f24-56522dcf260d-utilities" (OuterVolumeSpecName: "utilities") pod "5f4ceee7-2ffd-4323-8f24-56522dcf260d" (UID: "5f4ceee7-2ffd-4323-8f24-56522dcf260d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:54:42 crc kubenswrapper[4930]: I1012 05:54:42.355063 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f4ceee7-2ffd-4323-8f24-56522dcf260d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f4ceee7-2ffd-4323-8f24-56522dcf260d" (UID: "5f4ceee7-2ffd-4323-8f24-56522dcf260d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:54:42 crc kubenswrapper[4930]: I1012 05:54:42.399346 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f4ceee7-2ffd-4323-8f24-56522dcf260d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 05:54:42 crc kubenswrapper[4930]: I1012 05:54:42.399371 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lht9t\" (UniqueName: \"kubernetes.io/projected/5f4ceee7-2ffd-4323-8f24-56522dcf260d-kube-api-access-lht9t\") on node \"crc\" DevicePath \"\"" Oct 12 05:54:42 crc kubenswrapper[4930]: I1012 05:54:42.399383 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f4ceee7-2ffd-4323-8f24-56522dcf260d-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 05:54:42 crc kubenswrapper[4930]: I1012 05:54:42.743562 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmhjh" event={"ID":"5f4ceee7-2ffd-4323-8f24-56522dcf260d","Type":"ContainerDied","Data":"bd873a4101f105c63c06d44d507c44335377eb02ef0a657ddf8a574950e97b43"} Oct 12 05:54:42 crc kubenswrapper[4930]: I1012 05:54:42.743630 4930 scope.go:117] "RemoveContainer" containerID="6ce111130db665fe672eb681d527017d8451763a0a2a260d9aab4de44d7f837d" Oct 12 05:54:42 crc kubenswrapper[4930]: I1012 05:54:42.743887 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cmhjh" Oct 12 05:54:42 crc kubenswrapper[4930]: I1012 05:54:42.772229 4930 scope.go:117] "RemoveContainer" containerID="773d574071ae9cf1f4840958e940ae9628e520d684b759cc5dd829f3f30b4a0c" Oct 12 05:54:42 crc kubenswrapper[4930]: I1012 05:54:42.790020 4930 scope.go:117] "RemoveContainer" containerID="9a506e4ff582853f81c84116e4da101a0e014d41ed801320d9989923a33e7a0c" Oct 12 05:54:42 crc kubenswrapper[4930]: I1012 05:54:42.790999 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cmhjh"] Oct 12 05:54:42 crc kubenswrapper[4930]: I1012 05:54:42.796355 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cmhjh"] Oct 12 05:54:44 crc kubenswrapper[4930]: I1012 05:54:44.142466 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f4ceee7-2ffd-4323-8f24-56522dcf260d" path="/var/lib/kubelet/pods/5f4ceee7-2ffd-4323-8f24-56522dcf260d/volumes" Oct 12 05:54:46 crc kubenswrapper[4930]: I1012 05:54:46.580432 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2h4bm" Oct 12 05:54:46 crc kubenswrapper[4930]: I1012 05:54:46.659032 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2h4bm"] Oct 12 05:54:46 crc kubenswrapper[4930]: I1012 05:54:46.780079 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2h4bm" podUID="933e9f85-8241-4fae-95f6-e73fe1d5ef12" containerName="registry-server" containerID="cri-o://4c8dabfbc56bfec1a2a19e8e418bc8ef7d4fc3a46482d596ad0ed3cdebbe49cb" gracePeriod=2 Oct 12 05:54:47 crc kubenswrapper[4930]: I1012 05:54:47.252054 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2h4bm" Oct 12 05:54:47 crc kubenswrapper[4930]: I1012 05:54:47.411936 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/933e9f85-8241-4fae-95f6-e73fe1d5ef12-catalog-content\") pod \"933e9f85-8241-4fae-95f6-e73fe1d5ef12\" (UID: \"933e9f85-8241-4fae-95f6-e73fe1d5ef12\") " Oct 12 05:54:47 crc kubenswrapper[4930]: I1012 05:54:47.412042 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/933e9f85-8241-4fae-95f6-e73fe1d5ef12-utilities\") pod \"933e9f85-8241-4fae-95f6-e73fe1d5ef12\" (UID: \"933e9f85-8241-4fae-95f6-e73fe1d5ef12\") " Oct 12 05:54:47 crc kubenswrapper[4930]: I1012 05:54:47.412100 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppqzf\" (UniqueName: \"kubernetes.io/projected/933e9f85-8241-4fae-95f6-e73fe1d5ef12-kube-api-access-ppqzf\") pod \"933e9f85-8241-4fae-95f6-e73fe1d5ef12\" (UID: \"933e9f85-8241-4fae-95f6-e73fe1d5ef12\") " Oct 12 05:54:47 crc kubenswrapper[4930]: I1012 05:54:47.413609 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/933e9f85-8241-4fae-95f6-e73fe1d5ef12-utilities" (OuterVolumeSpecName: "utilities") pod "933e9f85-8241-4fae-95f6-e73fe1d5ef12" (UID: "933e9f85-8241-4fae-95f6-e73fe1d5ef12"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:54:47 crc kubenswrapper[4930]: I1012 05:54:47.429020 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/933e9f85-8241-4fae-95f6-e73fe1d5ef12-kube-api-access-ppqzf" (OuterVolumeSpecName: "kube-api-access-ppqzf") pod "933e9f85-8241-4fae-95f6-e73fe1d5ef12" (UID: "933e9f85-8241-4fae-95f6-e73fe1d5ef12"). InnerVolumeSpecName "kube-api-access-ppqzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:54:47 crc kubenswrapper[4930]: I1012 05:54:47.437096 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/933e9f85-8241-4fae-95f6-e73fe1d5ef12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "933e9f85-8241-4fae-95f6-e73fe1d5ef12" (UID: "933e9f85-8241-4fae-95f6-e73fe1d5ef12"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:54:47 crc kubenswrapper[4930]: I1012 05:54:47.513892 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/933e9f85-8241-4fae-95f6-e73fe1d5ef12-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 05:54:47 crc kubenswrapper[4930]: I1012 05:54:47.513941 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/933e9f85-8241-4fae-95f6-e73fe1d5ef12-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 05:54:47 crc kubenswrapper[4930]: I1012 05:54:47.513961 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppqzf\" (UniqueName: \"kubernetes.io/projected/933e9f85-8241-4fae-95f6-e73fe1d5ef12-kube-api-access-ppqzf\") on node \"crc\" DevicePath \"\"" Oct 12 05:54:47 crc kubenswrapper[4930]: I1012 05:54:47.791402 4930 generic.go:334] "Generic (PLEG): container finished" podID="933e9f85-8241-4fae-95f6-e73fe1d5ef12" containerID="4c8dabfbc56bfec1a2a19e8e418bc8ef7d4fc3a46482d596ad0ed3cdebbe49cb" exitCode=0 Oct 12 05:54:47 crc kubenswrapper[4930]: I1012 05:54:47.791452 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2h4bm" event={"ID":"933e9f85-8241-4fae-95f6-e73fe1d5ef12","Type":"ContainerDied","Data":"4c8dabfbc56bfec1a2a19e8e418bc8ef7d4fc3a46482d596ad0ed3cdebbe49cb"} Oct 12 05:54:47 crc kubenswrapper[4930]: I1012 05:54:47.791521 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2h4bm" event={"ID":"933e9f85-8241-4fae-95f6-e73fe1d5ef12","Type":"ContainerDied","Data":"b55365591bb70c88e4bbd263b9349a62f0600c7c0e31f48f0cc4ab30f29f30af"} Oct 12 05:54:47 crc kubenswrapper[4930]: I1012 05:54:47.791547 4930 scope.go:117] "RemoveContainer" containerID="4c8dabfbc56bfec1a2a19e8e418bc8ef7d4fc3a46482d596ad0ed3cdebbe49cb" Oct 12 05:54:47 crc kubenswrapper[4930]: I1012 05:54:47.791548 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2h4bm" Oct 12 05:54:47 crc kubenswrapper[4930]: I1012 05:54:47.813526 4930 scope.go:117] "RemoveContainer" containerID="222f1699985bfde9cb349b2b5b68d72b864b4fbd9cbbedc8071854713665068e" Oct 12 05:54:47 crc kubenswrapper[4930]: I1012 05:54:47.841277 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2h4bm"] Oct 12 05:54:47 crc kubenswrapper[4930]: I1012 05:54:47.842054 4930 scope.go:117] "RemoveContainer" containerID="273905afe8e8525244809c821347adb82203cdebe5f154bb5f5965189e1ce4a8" Oct 12 05:54:47 crc kubenswrapper[4930]: I1012 05:54:47.851171 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2h4bm"] Oct 12 05:54:47 crc kubenswrapper[4930]: I1012 05:54:47.878047 4930 scope.go:117] "RemoveContainer" containerID="4c8dabfbc56bfec1a2a19e8e418bc8ef7d4fc3a46482d596ad0ed3cdebbe49cb" Oct 12 05:54:47 crc kubenswrapper[4930]: E1012 05:54:47.881885 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c8dabfbc56bfec1a2a19e8e418bc8ef7d4fc3a46482d596ad0ed3cdebbe49cb\": container with ID starting with 4c8dabfbc56bfec1a2a19e8e418bc8ef7d4fc3a46482d596ad0ed3cdebbe49cb not found: ID does not exist" containerID="4c8dabfbc56bfec1a2a19e8e418bc8ef7d4fc3a46482d596ad0ed3cdebbe49cb" Oct 12 05:54:47 crc kubenswrapper[4930]: I1012 05:54:47.881943 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c8dabfbc56bfec1a2a19e8e418bc8ef7d4fc3a46482d596ad0ed3cdebbe49cb"} err="failed to get container status \"4c8dabfbc56bfec1a2a19e8e418bc8ef7d4fc3a46482d596ad0ed3cdebbe49cb\": rpc error: code = NotFound desc = could not find container \"4c8dabfbc56bfec1a2a19e8e418bc8ef7d4fc3a46482d596ad0ed3cdebbe49cb\": container with ID starting with 4c8dabfbc56bfec1a2a19e8e418bc8ef7d4fc3a46482d596ad0ed3cdebbe49cb not found: ID does not exist" Oct 12 05:54:47 crc kubenswrapper[4930]: I1012 05:54:47.881998 4930 scope.go:117] "RemoveContainer" containerID="222f1699985bfde9cb349b2b5b68d72b864b4fbd9cbbedc8071854713665068e" Oct 12 05:54:47 crc kubenswrapper[4930]: E1012 05:54:47.882505 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"222f1699985bfde9cb349b2b5b68d72b864b4fbd9cbbedc8071854713665068e\": container with ID starting with 222f1699985bfde9cb349b2b5b68d72b864b4fbd9cbbedc8071854713665068e not found: ID does not exist" containerID="222f1699985bfde9cb349b2b5b68d72b864b4fbd9cbbedc8071854713665068e" Oct 12 05:54:47 crc kubenswrapper[4930]: I1012 05:54:47.882562 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"222f1699985bfde9cb349b2b5b68d72b864b4fbd9cbbedc8071854713665068e"} err="failed to get container status \"222f1699985bfde9cb349b2b5b68d72b864b4fbd9cbbedc8071854713665068e\": rpc error: code = NotFound desc = could not find container \"222f1699985bfde9cb349b2b5b68d72b864b4fbd9cbbedc8071854713665068e\": container with ID starting with 222f1699985bfde9cb349b2b5b68d72b864b4fbd9cbbedc8071854713665068e not found: ID does not exist" Oct 12 05:54:47 crc kubenswrapper[4930]: I1012 05:54:47.882606 4930 scope.go:117] "RemoveContainer" containerID="273905afe8e8525244809c821347adb82203cdebe5f154bb5f5965189e1ce4a8" Oct 12 05:54:47 crc kubenswrapper[4930]: E1012 05:54:47.883141 4930 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"273905afe8e8525244809c821347adb82203cdebe5f154bb5f5965189e1ce4a8\": container with ID starting with 273905afe8e8525244809c821347adb82203cdebe5f154bb5f5965189e1ce4a8 not found: ID does not exist" containerID="273905afe8e8525244809c821347adb82203cdebe5f154bb5f5965189e1ce4a8" Oct 12 05:54:47 crc kubenswrapper[4930]: I1012 05:54:47.883187 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"273905afe8e8525244809c821347adb82203cdebe5f154bb5f5965189e1ce4a8"} err="failed to get container status \"273905afe8e8525244809c821347adb82203cdebe5f154bb5f5965189e1ce4a8\": rpc error: code = NotFound desc = could not find container \"273905afe8e8525244809c821347adb82203cdebe5f154bb5f5965189e1ce4a8\": container with ID starting with 273905afe8e8525244809c821347adb82203cdebe5f154bb5f5965189e1ce4a8 not found: ID does not exist" Oct 12 05:54:48 crc kubenswrapper[4930]: I1012 05:54:48.143644 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="933e9f85-8241-4fae-95f6-e73fe1d5ef12" path="/var/lib/kubelet/pods/933e9f85-8241-4fae-95f6-e73fe1d5ef12/volumes" Oct 12 05:54:49 crc kubenswrapper[4930]: I1012 05:54:49.705956 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5tbxq" Oct 12 05:54:50 crc kubenswrapper[4930]: I1012 05:54:50.710567 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5tbxq"] Oct 12 05:54:50 crc kubenswrapper[4930]: I1012 05:54:50.711253 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5tbxq" podUID="1b3e7cf8-66dd-4666-b908-c902e809897d" containerName="registry-server" containerID="cri-o://dbfef3690429662cf59e5a4d4812bd24636bc3140a5dd9aaf653610e5828688f" gracePeriod=2 Oct 12 05:54:51 crc kubenswrapper[4930]: I1012 05:54:51.157830 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5tbxq" Oct 12 05:54:51 crc kubenswrapper[4930]: I1012 05:54:51.265113 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt5vq\" (UniqueName: \"kubernetes.io/projected/1b3e7cf8-66dd-4666-b908-c902e809897d-kube-api-access-rt5vq\") pod \"1b3e7cf8-66dd-4666-b908-c902e809897d\" (UID: \"1b3e7cf8-66dd-4666-b908-c902e809897d\") " Oct 12 05:54:51 crc kubenswrapper[4930]: I1012 05:54:51.265215 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b3e7cf8-66dd-4666-b908-c902e809897d-utilities\") pod \"1b3e7cf8-66dd-4666-b908-c902e809897d\" (UID: \"1b3e7cf8-66dd-4666-b908-c902e809897d\") " Oct 12 05:54:51 crc kubenswrapper[4930]: I1012 05:54:51.265280 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b3e7cf8-66dd-4666-b908-c902e809897d-catalog-content\") pod \"1b3e7cf8-66dd-4666-b908-c902e809897d\" (UID: \"1b3e7cf8-66dd-4666-b908-c902e809897d\") " Oct 12 05:54:51 crc kubenswrapper[4930]: I1012 05:54:51.266326 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b3e7cf8-66dd-4666-b908-c902e809897d-utilities" (OuterVolumeSpecName: "utilities") pod "1b3e7cf8-66dd-4666-b908-c902e809897d" (UID: "1b3e7cf8-66dd-4666-b908-c902e809897d"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:54:51 crc kubenswrapper[4930]: I1012 05:54:51.273105 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b3e7cf8-66dd-4666-b908-c902e809897d-kube-api-access-rt5vq" (OuterVolumeSpecName: "kube-api-access-rt5vq") pod "1b3e7cf8-66dd-4666-b908-c902e809897d" (UID: "1b3e7cf8-66dd-4666-b908-c902e809897d"). InnerVolumeSpecName "kube-api-access-rt5vq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:54:51 crc kubenswrapper[4930]: I1012 05:54:51.320601 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b3e7cf8-66dd-4666-b908-c902e809897d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b3e7cf8-66dd-4666-b908-c902e809897d" (UID: "1b3e7cf8-66dd-4666-b908-c902e809897d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:54:51 crc kubenswrapper[4930]: I1012 05:54:51.367037 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b3e7cf8-66dd-4666-b908-c902e809897d-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 05:54:51 crc kubenswrapper[4930]: I1012 05:54:51.367075 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b3e7cf8-66dd-4666-b908-c902e809897d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 05:54:51 crc kubenswrapper[4930]: I1012 05:54:51.367091 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt5vq\" (UniqueName: \"kubernetes.io/projected/1b3e7cf8-66dd-4666-b908-c902e809897d-kube-api-access-rt5vq\") on node \"crc\" DevicePath \"\"" Oct 12 05:54:51 crc kubenswrapper[4930]: I1012 05:54:51.830559 4930 generic.go:334] "Generic (PLEG): container finished" podID="1b3e7cf8-66dd-4666-b908-c902e809897d" containerID="dbfef3690429662cf59e5a4d4812bd24636bc3140a5dd9aaf653610e5828688f" exitCode=0 Oct 12 05:54:51 crc kubenswrapper[4930]: I1012 05:54:51.830609 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tbxq" event={"ID":"1b3e7cf8-66dd-4666-b908-c902e809897d","Type":"ContainerDied","Data":"dbfef3690429662cf59e5a4d4812bd24636bc3140a5dd9aaf653610e5828688f"} Oct 12 05:54:51 crc kubenswrapper[4930]: I1012 05:54:51.830642 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tbxq" event={"ID":"1b3e7cf8-66dd-4666-b908-c902e809897d","Type":"ContainerDied","Data":"72bcefe83cf6b01570934154d9b82e54a1a3e070b4a9072dd000f5f55a9fa7a0"} Oct 12 05:54:51 crc kubenswrapper[4930]: I1012 05:54:51.830662 4930 scope.go:117] "RemoveContainer" containerID="dbfef3690429662cf59e5a4d4812bd24636bc3140a5dd9aaf653610e5828688f" Oct 12 05:54:51 crc kubenswrapper[4930]: I1012 05:54:51.830711 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5tbxq" Oct 12 05:54:51 crc kubenswrapper[4930]: I1012 05:54:51.852185 4930 scope.go:117] "RemoveContainer" containerID="2d9288fdbeaf63639f03dd055301f13ecf2fb836b25ded4e4b6bd30ba4e4603b" Oct 12 05:54:51 crc kubenswrapper[4930]: I1012 05:54:51.874196 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5tbxq"] Oct 12 05:54:51 crc kubenswrapper[4930]: I1012 05:54:51.874622 4930 scope.go:117] "RemoveContainer" containerID="01cadd9a4da44f9444f87664b61d1e3725d2bafa76e4374f7cec643e7d401229" Oct 12 05:54:51 crc kubenswrapper[4930]: I1012 05:54:51.883043 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5tbxq"] Oct 12 05:54:51 crc kubenswrapper[4930]: I1012 05:54:51.899624 4930 scope.go:117] "RemoveContainer" containerID="dbfef3690429662cf59e5a4d4812bd24636bc3140a5dd9aaf653610e5828688f" Oct 12 05:54:51 crc kubenswrapper[4930]: E1012 05:54:51.900156 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbfef3690429662cf59e5a4d4812bd24636bc3140a5dd9aaf653610e5828688f\": container with ID starting with dbfef3690429662cf59e5a4d4812bd24636bc3140a5dd9aaf653610e5828688f not found: ID does not exist" containerID="dbfef3690429662cf59e5a4d4812bd24636bc3140a5dd9aaf653610e5828688f" Oct 12 05:54:51 crc kubenswrapper[4930]: I1012 05:54:51.900207 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbfef3690429662cf59e5a4d4812bd24636bc3140a5dd9aaf653610e5828688f"} err="failed to get container status \"dbfef3690429662cf59e5a4d4812bd24636bc3140a5dd9aaf653610e5828688f\": rpc error: code = NotFound desc = could not find container \"dbfef3690429662cf59e5a4d4812bd24636bc3140a5dd9aaf653610e5828688f\": container with ID starting with dbfef3690429662cf59e5a4d4812bd24636bc3140a5dd9aaf653610e5828688f not found: ID does not exist" Oct 12 05:54:51 crc kubenswrapper[4930]: I1012 05:54:51.900237 4930 scope.go:117] "RemoveContainer" containerID="2d9288fdbeaf63639f03dd055301f13ecf2fb836b25ded4e4b6bd30ba4e4603b" Oct 12 05:54:51 crc kubenswrapper[4930]: E1012 05:54:51.900718 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d9288fdbeaf63639f03dd055301f13ecf2fb836b25ded4e4b6bd30ba4e4603b\": container with ID starting with 2d9288fdbeaf63639f03dd055301f13ecf2fb836b25ded4e4b6bd30ba4e4603b not found: ID does not exist" containerID="2d9288fdbeaf63639f03dd055301f13ecf2fb836b25ded4e4b6bd30ba4e4603b" Oct 12 05:54:51 crc kubenswrapper[4930]: I1012 05:54:51.900772 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d9288fdbeaf63639f03dd055301f13ecf2fb836b25ded4e4b6bd30ba4e4603b"} err="failed to get container status \"2d9288fdbeaf63639f03dd055301f13ecf2fb836b25ded4e4b6bd30ba4e4603b\": rpc error: code = NotFound desc = could not find container \"2d9288fdbeaf63639f03dd055301f13ecf2fb836b25ded4e4b6bd30ba4e4603b\": container with ID starting with 2d9288fdbeaf63639f03dd055301f13ecf2fb836b25ded4e4b6bd30ba4e4603b not found: ID does not exist" Oct 12 05:54:51 crc kubenswrapper[4930]: I1012 05:54:51.900798 4930 scope.go:117] "RemoveContainer" containerID="01cadd9a4da44f9444f87664b61d1e3725d2bafa76e4374f7cec643e7d401229" Oct 12 05:54:51 crc kubenswrapper[4930]: E1012 05:54:51.901162 4930 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"01cadd9a4da44f9444f87664b61d1e3725d2bafa76e4374f7cec643e7d401229\": container with ID starting with 01cadd9a4da44f9444f87664b61d1e3725d2bafa76e4374f7cec643e7d401229 not found: ID does not exist" containerID="01cadd9a4da44f9444f87664b61d1e3725d2bafa76e4374f7cec643e7d401229" Oct 12 05:54:51 crc kubenswrapper[4930]: I1012 05:54:51.901237 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01cadd9a4da44f9444f87664b61d1e3725d2bafa76e4374f7cec643e7d401229"} err="failed to get container status \"01cadd9a4da44f9444f87664b61d1e3725d2bafa76e4374f7cec643e7d401229\": rpc error: code = NotFound desc = could not find container \"01cadd9a4da44f9444f87664b61d1e3725d2bafa76e4374f7cec643e7d401229\": container with ID starting with 01cadd9a4da44f9444f87664b61d1e3725d2bafa76e4374f7cec643e7d401229 not found: ID does not exist" Oct 12 05:54:52 crc kubenswrapper[4930]: I1012 05:54:52.145579 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b3e7cf8-66dd-4666-b908-c902e809897d" path="/var/lib/kubelet/pods/1b3e7cf8-66dd-4666-b908-c902e809897d/volumes" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.144784 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658bdf4b74-vf2tl"] Oct 12 05:54:53 crc kubenswrapper[4930]: E1012 05:54:53.145911 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f4ceee7-2ffd-4323-8f24-56522dcf260d" containerName="extract-utilities" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.145934 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f4ceee7-2ffd-4323-8f24-56522dcf260d" containerName="extract-utilities" Oct 12 05:54:53 crc kubenswrapper[4930]: E1012 05:54:53.145950 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f4ceee7-2ffd-4323-8f24-56522dcf260d" containerName="extract-content" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.145959 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f4ceee7-2ffd-4323-8f24-56522dcf260d" containerName="extract-content" Oct 12 05:54:53 crc kubenswrapper[4930]: E1012 05:54:53.145975 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933e9f85-8241-4fae-95f6-e73fe1d5ef12" containerName="extract-content" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.145985 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="933e9f85-8241-4fae-95f6-e73fe1d5ef12" containerName="extract-content" Oct 12 05:54:53 crc kubenswrapper[4930]: E1012 05:54:53.146000 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3e7cf8-66dd-4666-b908-c902e809897d" containerName="extract-utilities" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.146009 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3e7cf8-66dd-4666-b908-c902e809897d" containerName="extract-utilities" Oct 12 05:54:53 crc kubenswrapper[4930]: E1012 05:54:53.146026 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933e9f85-8241-4fae-95f6-e73fe1d5ef12" containerName="registry-server" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.146036 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="933e9f85-8241-4fae-95f6-e73fe1d5ef12" containerName="registry-server" Oct 12 05:54:53 crc kubenswrapper[4930]: E1012 05:54:53.146051 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3e7cf8-66dd-4666-b908-c902e809897d" 
containerName="registry-server" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.146060 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3e7cf8-66dd-4666-b908-c902e809897d" containerName="registry-server" Oct 12 05:54:53 crc kubenswrapper[4930]: E1012 05:54:53.146071 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933e9f85-8241-4fae-95f6-e73fe1d5ef12" containerName="extract-utilities" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.146080 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="933e9f85-8241-4fae-95f6-e73fe1d5ef12" containerName="extract-utilities" Oct 12 05:54:53 crc kubenswrapper[4930]: E1012 05:54:53.146091 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f4ceee7-2ffd-4323-8f24-56522dcf260d" containerName="registry-server" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.146120 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f4ceee7-2ffd-4323-8f24-56522dcf260d" containerName="registry-server" Oct 12 05:54:53 crc kubenswrapper[4930]: E1012 05:54:53.146134 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3e7cf8-66dd-4666-b908-c902e809897d" containerName="extract-content" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.146143 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3e7cf8-66dd-4666-b908-c902e809897d" containerName="extract-content" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.146357 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="933e9f85-8241-4fae-95f6-e73fe1d5ef12" containerName="registry-server" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.146380 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b3e7cf8-66dd-4666-b908-c902e809897d" containerName="registry-server" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.146401 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f4ceee7-2ffd-4323-8f24-56522dcf260d" containerName="registry-server" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.147654 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-vf2tl" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.150100 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-6s5n4" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.156836 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7b7fb68549-2p7m6"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.158332 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-2p7m6" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.159697 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-fk5t7" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.170184 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658bdf4b74-vf2tl"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.172267 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-85d5d9dd78-8vdcg"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.173786 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-8vdcg" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.175772 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-4rc7h" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.185274 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-85d5d9dd78-8vdcg"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.189650 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84b9b84486-ckctw"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.191505 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-ckctw" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.192793 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hphdk\" (UniqueName: \"kubernetes.io/projected/93426c54-3448-421e-aa85-b03c466c7bf8-kube-api-access-hphdk\") pod \"glance-operator-controller-manager-84b9b84486-ckctw\" (UID: \"93426c54-3448-421e-aa85-b03c466c7bf8\") " pod="openstack-operators/glance-operator-controller-manager-84b9b84486-ckctw" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.192859 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjcdj\" (UniqueName: \"kubernetes.io/projected/5a7f54b7-1891-4e3a-a768-e937269bd384-kube-api-access-hjcdj\") pod \"cinder-operator-controller-manager-7b7fb68549-2p7m6\" (UID: \"5a7f54b7-1891-4e3a-a768-e937269bd384\") " pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-2p7m6" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.192927 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj29l\" (UniqueName: \"kubernetes.io/projected/9e6cd80c-4aa5-40de-81fc-10d0329f5481-kube-api-access-gj29l\") pod \"barbican-operator-controller-manager-658bdf4b74-vf2tl\" (UID: \"9e6cd80c-4aa5-40de-81fc-10d0329f5481\") " pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-vf2tl" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.193007 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvsbt\" (UniqueName: \"kubernetes.io/projected/f883b7e0-65a4-4cea-9d02-7cdedfaf9ec2-kube-api-access-gvsbt\") pod \"designate-operator-controller-manager-85d5d9dd78-8vdcg\" (UID: 
\"f883b7e0-65a4-4cea-9d02-7cdedfaf9ec2\") " pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-8vdcg" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.195087 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-7dwxw" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.203485 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84b9b84486-ckctw"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.210482 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-858f76bbdd-2fq48"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.211647 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-2fq48" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.215380 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-lgzt7" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.235644 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7b7fb68549-2p7m6"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.297729 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjcdj\" (UniqueName: \"kubernetes.io/projected/5a7f54b7-1891-4e3a-a768-e937269bd384-kube-api-access-hjcdj\") pod \"cinder-operator-controller-manager-7b7fb68549-2p7m6\" (UID: \"5a7f54b7-1891-4e3a-a768-e937269bd384\") " pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-2p7m6" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.297825 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj29l\" (UniqueName: \"kubernetes.io/projected/9e6cd80c-4aa5-40de-81fc-10d0329f5481-kube-api-access-gj29l\") pod \"barbican-operator-controller-manager-658bdf4b74-vf2tl\" (UID: \"9e6cd80c-4aa5-40de-81fc-10d0329f5481\") " pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-vf2tl" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.297890 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvsbt\" (UniqueName: \"kubernetes.io/projected/f883b7e0-65a4-4cea-9d02-7cdedfaf9ec2-kube-api-access-gvsbt\") pod \"designate-operator-controller-manager-85d5d9dd78-8vdcg\" (UID: \"f883b7e0-65a4-4cea-9d02-7cdedfaf9ec2\") " pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-8vdcg" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.297936 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hphdk\" (UniqueName: \"kubernetes.io/projected/93426c54-3448-421e-aa85-b03c466c7bf8-kube-api-access-hphdk\") pod \"glance-operator-controller-manager-84b9b84486-ckctw\" (UID: \"93426c54-3448-421e-aa85-b03c466c7bf8\") " pod="openstack-operators/glance-operator-controller-manager-84b9b84486-ckctw" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.298441 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-858f76bbdd-2fq48"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.322127 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hphdk\" (UniqueName: 
\"kubernetes.io/projected/93426c54-3448-421e-aa85-b03c466c7bf8-kube-api-access-hphdk\") pod \"glance-operator-controller-manager-84b9b84486-ckctw\" (UID: \"93426c54-3448-421e-aa85-b03c466c7bf8\") " pod="openstack-operators/glance-operator-controller-manager-84b9b84486-ckctw" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.327234 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj29l\" (UniqueName: \"kubernetes.io/projected/9e6cd80c-4aa5-40de-81fc-10d0329f5481-kube-api-access-gj29l\") pod \"barbican-operator-controller-manager-658bdf4b74-vf2tl\" (UID: \"9e6cd80c-4aa5-40de-81fc-10d0329f5481\") " pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-vf2tl" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.334494 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7ffbcb7588-zh959"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.348529 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-zh959" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.349154 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjcdj\" (UniqueName: \"kubernetes.io/projected/5a7f54b7-1891-4e3a-a768-e937269bd384-kube-api-access-hjcdj\") pod \"cinder-operator-controller-manager-7b7fb68549-2p7m6\" (UID: \"5a7f54b7-1891-4e3a-a768-e937269bd384\") " pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-2p7m6" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.352882 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7ffbcb7588-zh959"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.356448 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-fpr8r" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.389633 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvsbt\" (UniqueName: \"kubernetes.io/projected/f883b7e0-65a4-4cea-9d02-7cdedfaf9ec2-kube-api-access-gvsbt\") pod \"designate-operator-controller-manager-85d5d9dd78-8vdcg\" (UID: \"f883b7e0-65a4-4cea-9d02-7cdedfaf9ec2\") " pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-8vdcg" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.398974 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59bfg\" (UniqueName: \"kubernetes.io/projected/ee89a0b0-868b-4b2e-a274-c5a4ee40a872-kube-api-access-59bfg\") pod \"horizon-operator-controller-manager-7ffbcb7588-zh959\" (UID: \"ee89a0b0-868b-4b2e-a274-c5a4ee40a872\") " pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-zh959" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.399041 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nrm5\" (UniqueName: \"kubernetes.io/projected/25276148-1b95-4b4d-9f18-ef97020632a7-kube-api-access-4nrm5\") pod \"heat-operator-controller-manager-858f76bbdd-2fq48\" (UID: \"25276148-1b95-4b4d-9f18-ef97020632a7\") " pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-2fq48" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.408165 4930 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-656bcbd775-wgp8z"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.410414 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-wgp8z" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.417189 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-qvjk2" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.417386 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.425561 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9c5c78d49-ksz2s"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.427061 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-ksz2s" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.437419 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-l6hvq" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.444592 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-656bcbd775-wgp8z"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.452570 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9c5c78d49-ksz2s"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.471729 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wk6qh"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.472983 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wk6qh" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.483273 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-k7wqh" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.483456 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5f67fbc655-ql78t"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.484679 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-ql78t" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.487402 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-76ks9" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.493283 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wk6qh"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.498678 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-qcc9w"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.500220 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59bfg\" (UniqueName: \"kubernetes.io/projected/ee89a0b0-868b-4b2e-a274-c5a4ee40a872-kube-api-access-59bfg\") pod \"horizon-operator-controller-manager-7ffbcb7588-zh959\" (UID: \"ee89a0b0-868b-4b2e-a274-c5a4ee40a872\") " pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-zh959" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.500282 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nrm5\" (UniqueName: \"kubernetes.io/projected/25276148-1b95-4b4d-9f18-ef97020632a7-kube-api-access-4nrm5\") pod \"heat-operator-controller-manager-858f76bbdd-2fq48\" (UID: \"25276148-1b95-4b4d-9f18-ef97020632a7\") " pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-2fq48" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.501866 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-qcc9w" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.509086 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-58ztn" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.509135 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-vf2tl" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.516974 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-2p7m6" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.520811 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5f67fbc655-ql78t"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.525821 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-79d585cb66-2brdt"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.527588 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59bfg\" (UniqueName: \"kubernetes.io/projected/ee89a0b0-868b-4b2e-a274-c5a4ee40a872-kube-api-access-59bfg\") pod \"horizon-operator-controller-manager-7ffbcb7588-zh959\" (UID: \"ee89a0b0-868b-4b2e-a274-c5a4ee40a872\") " pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-zh959" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.527722 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nrm5\" (UniqueName: \"kubernetes.io/projected/25276148-1b95-4b4d-9f18-ef97020632a7-kube-api-access-4nrm5\") pod \"heat-operator-controller-manager-858f76bbdd-2fq48\" (UID: \"25276148-1b95-4b4d-9f18-ef97020632a7\") " pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-2fq48" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.544308 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-8vdcg" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.557844 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-ckctw" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.557885 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-qcc9w"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.557914 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-79d585cb66-2brdt"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.558014 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-2brdt" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.563538 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-vn54f" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.563941 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-2fq48" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.585432 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5df598886f-hx224"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.586672 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5df598886f-hx224" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.590447 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-h8p94" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.590991 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-vn5dq"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.592181 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-vn5dq" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.594377 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-9whr7" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.596992 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5df598886f-hx224"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.603854 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddgzt\" (UniqueName: \"kubernetes.io/projected/df7a25ba-c240-4d05-a117-0040e24bb33c-kube-api-access-ddgzt\") pod \"infra-operator-controller-manager-656bcbd775-wgp8z\" (UID: \"df7a25ba-c240-4d05-a117-0040e24bb33c\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-wgp8z" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.603902 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d44pf\" (UniqueName: \"kubernetes.io/projected/37d1af03-8709-4b4a-8d4c-bda1dbefff59-kube-api-access-d44pf\") pod \"mariadb-operator-controller-manager-f9fb45f8f-qcc9w\" (UID: \"37d1af03-8709-4b4a-8d4c-bda1dbefff59\") " pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-qcc9w" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.603925 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trvr5\" (UniqueName: \"kubernetes.io/projected/7134f9eb-cfa6-41b8-a245-2f1b17669ca4-kube-api-access-trvr5\") pod \"neutron-operator-controller-manager-79d585cb66-2brdt\" (UID: \"7134f9eb-cfa6-41b8-a245-2f1b17669ca4\") " pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-2brdt" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.603943 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df7a25ba-c240-4d05-a117-0040e24bb33c-cert\") pod \"infra-operator-controller-manager-656bcbd775-wgp8z\" (UID: \"df7a25ba-c240-4d05-a117-0040e24bb33c\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-wgp8z" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.603977 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n79kj\" (UniqueName: \"kubernetes.io/projected/fe819f44-6224-4b45-a33c-6b6ef8e73b92-kube-api-access-n79kj\") pod \"manila-operator-controller-manager-5f67fbc655-ql78t\" (UID: \"fe819f44-6224-4b45-a33c-6b6ef8e73b92\") " pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-ql78t" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.604005 4930 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpmxb\" (UniqueName: \"kubernetes.io/projected/fe5f36d2-82b4-4bce-a189-7844dae5dc0e-kube-api-access-dpmxb\") pod \"keystone-operator-controller-manager-55b6b7c7b8-wk6qh\" (UID: \"fe5f36d2-82b4-4bce-a189-7844dae5dc0e\") " pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wk6qh" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.604023 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn6v6\" (UniqueName: \"kubernetes.io/projected/5383069b-8f72-4173-97bf-34ffc36c235e-kube-api-access-kn6v6\") pod \"octavia-operator-controller-manager-69fdcfc5f5-vn5dq\" (UID: \"5383069b-8f72-4173-97bf-34ffc36c235e\") " pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-vn5dq" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.604041 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k9vz\" (UniqueName: \"kubernetes.io/projected/510d5f0a-5f67-4171-99e6-1de6734e7bdf-kube-api-access-9k9vz\") pod \"nova-operator-controller-manager-5df598886f-hx224\" (UID: \"510d5f0a-5f67-4171-99e6-1de6734e7bdf\") " pod="openstack-operators/nova-operator-controller-manager-5df598886f-hx224" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.604065 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db2v4\" (UniqueName: \"kubernetes.io/projected/adf9f01b-70b6-46b9-acde-c1eedc16f299-kube-api-access-db2v4\") pod \"ironic-operator-controller-manager-9c5c78d49-ksz2s\" (UID: \"adf9f01b-70b6-46b9-acde-c1eedc16f299\") " pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-ksz2s" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.607111 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bwrr99"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.608189 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bwrr99" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.609727 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-k2fhf" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.610164 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.612586 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-vn5dq"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.618577 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-79df5fb58c-2hx92"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.620508 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-2hx92" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.621550 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-9mmjn" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.634513 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bwrr99"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.641862 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-68b6c87b68-l2x8t"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.646066 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-l2x8t" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.647109 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-79df5fb58c-2hx92"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.650806 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-68b6c87b68-l2x8t"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.653029 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-4cpz8" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.657810 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-db6d7f97b-862f6"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.659334 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-862f6" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.660912 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-trvzw" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.670059 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-db6d7f97b-862f6"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.679661 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-67cfc6749b-wk5xm"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.681681 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-wk5xm" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.684087 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-xpn4k" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.692630 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-67cfc6749b-wk5xm"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.709332 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n79kj\" (UniqueName: \"kubernetes.io/projected/fe819f44-6224-4b45-a33c-6b6ef8e73b92-kube-api-access-n79kj\") pod \"manila-operator-controller-manager-5f67fbc655-ql78t\" (UID: \"fe819f44-6224-4b45-a33c-6b6ef8e73b92\") " pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-ql78t" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.709398 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpmxb\" (UniqueName: \"kubernetes.io/projected/fe5f36d2-82b4-4bce-a189-7844dae5dc0e-kube-api-access-dpmxb\") pod \"keystone-operator-controller-manager-55b6b7c7b8-wk6qh\" (UID: \"fe5f36d2-82b4-4bce-a189-7844dae5dc0e\") " pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wk6qh" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.709424 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn6v6\" (UniqueName: \"kubernetes.io/projected/5383069b-8f72-4173-97bf-34ffc36c235e-kube-api-access-kn6v6\") pod \"octavia-operator-controller-manager-69fdcfc5f5-vn5dq\" (UID: \"5383069b-8f72-4173-97bf-34ffc36c235e\") " pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-vn5dq" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.709446 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k9vz\" (UniqueName: \"kubernetes.io/projected/510d5f0a-5f67-4171-99e6-1de6734e7bdf-kube-api-access-9k9vz\") pod \"nova-operator-controller-manager-5df598886f-hx224\" (UID: \"510d5f0a-5f67-4171-99e6-1de6734e7bdf\") " pod="openstack-operators/nova-operator-controller-manager-5df598886f-hx224" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.709475 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db2v4\" (UniqueName: \"kubernetes.io/projected/adf9f01b-70b6-46b9-acde-c1eedc16f299-kube-api-access-db2v4\") pod \"ironic-operator-controller-manager-9c5c78d49-ksz2s\" (UID: \"adf9f01b-70b6-46b9-acde-c1eedc16f299\") " pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-ksz2s" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.709530 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddgzt\" (UniqueName: \"kubernetes.io/projected/df7a25ba-c240-4d05-a117-0040e24bb33c-kube-api-access-ddgzt\") pod \"infra-operator-controller-manager-656bcbd775-wgp8z\" (UID: \"df7a25ba-c240-4d05-a117-0040e24bb33c\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-wgp8z" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.709564 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d44pf\" (UniqueName: \"kubernetes.io/projected/37d1af03-8709-4b4a-8d4c-bda1dbefff59-kube-api-access-d44pf\") pod 
\"mariadb-operator-controller-manager-f9fb45f8f-qcc9w\" (UID: \"37d1af03-8709-4b4a-8d4c-bda1dbefff59\") " pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-qcc9w" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.709591 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trvr5\" (UniqueName: \"kubernetes.io/projected/7134f9eb-cfa6-41b8-a245-2f1b17669ca4-kube-api-access-trvr5\") pod \"neutron-operator-controller-manager-79d585cb66-2brdt\" (UID: \"7134f9eb-cfa6-41b8-a245-2f1b17669ca4\") " pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-2brdt" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.709611 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df7a25ba-c240-4d05-a117-0040e24bb33c-cert\") pod \"infra-operator-controller-manager-656bcbd775-wgp8z\" (UID: \"df7a25ba-c240-4d05-a117-0040e24bb33c\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-wgp8z" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.721423 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-zh959" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.740215 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d44pf\" (UniqueName: \"kubernetes.io/projected/37d1af03-8709-4b4a-8d4c-bda1dbefff59-kube-api-access-d44pf\") pod \"mariadb-operator-controller-manager-f9fb45f8f-qcc9w\" (UID: \"37d1af03-8709-4b4a-8d4c-bda1dbefff59\") " pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-qcc9w" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.751760 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trvr5\" (UniqueName: \"kubernetes.io/projected/7134f9eb-cfa6-41b8-a245-2f1b17669ca4-kube-api-access-trvr5\") pod \"neutron-operator-controller-manager-79d585cb66-2brdt\" (UID: \"7134f9eb-cfa6-41b8-a245-2f1b17669ca4\") " pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-2brdt" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.758072 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n79kj\" (UniqueName: \"kubernetes.io/projected/fe819f44-6224-4b45-a33c-6b6ef8e73b92-kube-api-access-n79kj\") pod \"manila-operator-controller-manager-5f67fbc655-ql78t\" (UID: \"fe819f44-6224-4b45-a33c-6b6ef8e73b92\") " pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-ql78t" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.763027 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k9vz\" (UniqueName: \"kubernetes.io/projected/510d5f0a-5f67-4171-99e6-1de6734e7bdf-kube-api-access-9k9vz\") pod \"nova-operator-controller-manager-5df598886f-hx224\" (UID: \"510d5f0a-5f67-4171-99e6-1de6734e7bdf\") " pod="openstack-operators/nova-operator-controller-manager-5df598886f-hx224" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.771563 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df7a25ba-c240-4d05-a117-0040e24bb33c-cert\") pod \"infra-operator-controller-manager-656bcbd775-wgp8z\" (UID: \"df7a25ba-c240-4d05-a117-0040e24bb33c\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-wgp8z" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 
05:54:53.771676 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpmxb\" (UniqueName: \"kubernetes.io/projected/fe5f36d2-82b4-4bce-a189-7844dae5dc0e-kube-api-access-dpmxb\") pod \"keystone-operator-controller-manager-55b6b7c7b8-wk6qh\" (UID: \"fe5f36d2-82b4-4bce-a189-7844dae5dc0e\") " pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wk6qh" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.773335 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn6v6\" (UniqueName: \"kubernetes.io/projected/5383069b-8f72-4173-97bf-34ffc36c235e-kube-api-access-kn6v6\") pod \"octavia-operator-controller-manager-69fdcfc5f5-vn5dq\" (UID: \"5383069b-8f72-4173-97bf-34ffc36c235e\") " pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-vn5dq" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.774694 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db2v4\" (UniqueName: \"kubernetes.io/projected/adf9f01b-70b6-46b9-acde-c1eedc16f299-kube-api-access-db2v4\") pod \"ironic-operator-controller-manager-9c5c78d49-ksz2s\" (UID: \"adf9f01b-70b6-46b9-acde-c1eedc16f299\") " pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-ksz2s" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.779460 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddgzt\" (UniqueName: \"kubernetes.io/projected/df7a25ba-c240-4d05-a117-0040e24bb33c-kube-api-access-ddgzt\") pod \"infra-operator-controller-manager-656bcbd775-wgp8z\" (UID: \"df7a25ba-c240-4d05-a117-0040e24bb33c\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-wgp8z" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.779542 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5458f77c4-52242"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.785045 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5458f77c4-52242" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.792966 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-4t28g" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.816379 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wk6qh" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.817423 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98xnw\" (UniqueName: \"kubernetes.io/projected/20b56dea-8d10-4b11-b437-fc38320417c9-kube-api-access-98xnw\") pod \"openstack-baremetal-operator-controller-manager-5956dffb7bwrr99\" (UID: \"20b56dea-8d10-4b11-b437-fc38320417c9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bwrr99" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.817468 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvq72\" (UniqueName: \"kubernetes.io/projected/89d21577-9d93-4003-bae1-3b66e679eeeb-kube-api-access-jvq72\") pod \"telemetry-operator-controller-manager-67cfc6749b-wk5xm\" (UID: \"89d21577-9d93-4003-bae1-3b66e679eeeb\") " pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-wk5xm" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.817493 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b56dea-8d10-4b11-b437-fc38320417c9-cert\") pod \"openstack-baremetal-operator-controller-manager-5956dffb7bwrr99\" (UID: \"20b56dea-8d10-4b11-b437-fc38320417c9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bwrr99" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.817519 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7j42\" (UniqueName: \"kubernetes.io/projected/97b9da7d-7c47-46b5-ac4a-f190a92dceef-kube-api-access-v7j42\") pod \"placement-operator-controller-manager-68b6c87b68-l2x8t\" (UID: \"97b9da7d-7c47-46b5-ac4a-f190a92dceef\") " pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-l2x8t" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.817547 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l22m4\" (UniqueName: \"kubernetes.io/projected/27d33d74-f1d1-4208-aead-8f6091c524df-kube-api-access-l22m4\") pod \"swift-operator-controller-manager-db6d7f97b-862f6\" (UID: \"27d33d74-f1d1-4208-aead-8f6091c524df\") " pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-862f6" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.817566 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gp4l\" (UniqueName: \"kubernetes.io/projected/8ac93070-497c-48ca-a58a-fb47657e6c2a-kube-api-access-5gp4l\") pod \"ovn-operator-controller-manager-79df5fb58c-2hx92\" (UID: \"8ac93070-497c-48ca-a58a-fb47657e6c2a\") " pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-2hx92" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.820549 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5458f77c4-52242"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.879107 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-ql78t" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.890620 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f554bff7b-mkzqf"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.891904 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-mkzqf" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.893942 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-qcc9w" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.896144 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-lgbkd" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.909491 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-2brdt" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.920511 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p48cg\" (UniqueName: \"kubernetes.io/projected/18286f44-9f6c-4699-9d4d-afa069c980ed-kube-api-access-p48cg\") pod \"test-operator-controller-manager-5458f77c4-52242\" (UID: \"18286f44-9f6c-4699-9d4d-afa069c980ed\") " pod="openstack-operators/test-operator-controller-manager-5458f77c4-52242" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.920568 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98xnw\" (UniqueName: \"kubernetes.io/projected/20b56dea-8d10-4b11-b437-fc38320417c9-kube-api-access-98xnw\") pod \"openstack-baremetal-operator-controller-manager-5956dffb7bwrr99\" (UID: \"20b56dea-8d10-4b11-b437-fc38320417c9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bwrr99" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.920612 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvq72\" (UniqueName: \"kubernetes.io/projected/89d21577-9d93-4003-bae1-3b66e679eeeb-kube-api-access-jvq72\") pod \"telemetry-operator-controller-manager-67cfc6749b-wk5xm\" (UID: \"89d21577-9d93-4003-bae1-3b66e679eeeb\") " pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-wk5xm" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.920638 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b56dea-8d10-4b11-b437-fc38320417c9-cert\") pod \"openstack-baremetal-operator-controller-manager-5956dffb7bwrr99\" (UID: \"20b56dea-8d10-4b11-b437-fc38320417c9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bwrr99" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.920654 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7j42\" (UniqueName: \"kubernetes.io/projected/97b9da7d-7c47-46b5-ac4a-f190a92dceef-kube-api-access-v7j42\") pod \"placement-operator-controller-manager-68b6c87b68-l2x8t\" (UID: \"97b9da7d-7c47-46b5-ac4a-f190a92dceef\") " pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-l2x8t" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.920687 
4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l22m4\" (UniqueName: \"kubernetes.io/projected/27d33d74-f1d1-4208-aead-8f6091c524df-kube-api-access-l22m4\") pod \"swift-operator-controller-manager-db6d7f97b-862f6\" (UID: \"27d33d74-f1d1-4208-aead-8f6091c524df\") " pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-862f6" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.920705 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gp4l\" (UniqueName: \"kubernetes.io/projected/8ac93070-497c-48ca-a58a-fb47657e6c2a-kube-api-access-5gp4l\") pod \"ovn-operator-controller-manager-79df5fb58c-2hx92\" (UID: \"8ac93070-497c-48ca-a58a-fb47657e6c2a\") " pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-2hx92" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.921369 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f554bff7b-mkzqf"] Oct 12 05:54:53 crc kubenswrapper[4930]: E1012 05:54:53.921454 4930 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 12 05:54:53 crc kubenswrapper[4930]: E1012 05:54:53.921570 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20b56dea-8d10-4b11-b437-fc38320417c9-cert podName:20b56dea-8d10-4b11-b437-fc38320417c9 nodeName:}" failed. No retries permitted until 2025-10-12 05:54:54.421554702 +0000 UTC m=+826.963656467 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/20b56dea-8d10-4b11-b437-fc38320417c9-cert") pod "openstack-baremetal-operator-controller-manager-5956dffb7bwrr99" (UID: "20b56dea-8d10-4b11-b437-fc38320417c9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.923194 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5df598886f-hx224" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.954686 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gp4l\" (UniqueName: \"kubernetes.io/projected/8ac93070-497c-48ca-a58a-fb47657e6c2a-kube-api-access-5gp4l\") pod \"ovn-operator-controller-manager-79df5fb58c-2hx92\" (UID: \"8ac93070-497c-48ca-a58a-fb47657e6c2a\") " pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-2hx92" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.956853 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7j42\" (UniqueName: \"kubernetes.io/projected/97b9da7d-7c47-46b5-ac4a-f190a92dceef-kube-api-access-v7j42\") pod \"placement-operator-controller-manager-68b6c87b68-l2x8t\" (UID: \"97b9da7d-7c47-46b5-ac4a-f190a92dceef\") " pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-l2x8t" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.957178 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-vn5dq" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.957729 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-2hx92" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.960070 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b95c8954b-46gk6"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.961531 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-46gk6" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.965135 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98xnw\" (UniqueName: \"kubernetes.io/projected/20b56dea-8d10-4b11-b437-fc38320417c9-kube-api-access-98xnw\") pod \"openstack-baremetal-operator-controller-manager-5956dffb7bwrr99\" (UID: \"20b56dea-8d10-4b11-b437-fc38320417c9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bwrr99" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.965997 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l22m4\" (UniqueName: \"kubernetes.io/projected/27d33d74-f1d1-4208-aead-8f6091c524df-kube-api-access-l22m4\") pod \"swift-operator-controller-manager-db6d7f97b-862f6\" (UID: \"27d33d74-f1d1-4208-aead-8f6091c524df\") " pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-862f6" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.966705 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvq72\" (UniqueName: \"kubernetes.io/projected/89d21577-9d93-4003-bae1-3b66e679eeeb-kube-api-access-jvq72\") pod \"telemetry-operator-controller-manager-67cfc6749b-wk5xm\" (UID: \"89d21577-9d93-4003-bae1-3b66e679eeeb\") " pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-wk5xm" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.970446 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-82t8b" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.970721 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.972527 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-l2x8t" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.977134 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b95c8954b-46gk6"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.983995 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5vrjw"] Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.984935 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5vrjw" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.986189 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-862f6" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.987527 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-7rn9k" Oct 12 05:54:53 crc kubenswrapper[4930]: I1012 05:54:53.989481 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5vrjw"] Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.008526 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-wk5xm" Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.023217 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfjd2\" (UniqueName: \"kubernetes.io/projected/7498c8de-da98-47fd-8096-58cb4f1c4f87-kube-api-access-qfjd2\") pod \"watcher-operator-controller-manager-7f554bff7b-mkzqf\" (UID: \"7498c8de-da98-47fd-8096-58cb4f1c4f87\") " pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-mkzqf" Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.023392 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p48cg\" (UniqueName: \"kubernetes.io/projected/18286f44-9f6c-4699-9d4d-afa069c980ed-kube-api-access-p48cg\") pod \"test-operator-controller-manager-5458f77c4-52242\" (UID: \"18286f44-9f6c-4699-9d4d-afa069c980ed\") " pod="openstack-operators/test-operator-controller-manager-5458f77c4-52242" Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.030019 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-wgp8z" Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.044984 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-ksz2s" Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.048780 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p48cg\" (UniqueName: \"kubernetes.io/projected/18286f44-9f6c-4699-9d4d-afa069c980ed-kube-api-access-p48cg\") pod \"test-operator-controller-manager-5458f77c4-52242\" (UID: \"18286f44-9f6c-4699-9d4d-afa069c980ed\") " pod="openstack-operators/test-operator-controller-manager-5458f77c4-52242" Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.093094 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658bdf4b74-vf2tl"] Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.125213 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg6gt\" (UniqueName: \"kubernetes.io/projected/f89b4da4-a74f-4f12-b056-05f201bedabd-kube-api-access-tg6gt\") pod \"openstack-operator-controller-manager-5b95c8954b-46gk6\" (UID: \"f89b4da4-a74f-4f12-b056-05f201bedabd\") " pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-46gk6" Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.125286 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pqdv\" (UniqueName: \"kubernetes.io/projected/7063a11c-2d85-47f0-85ca-61cf4949e10d-kube-api-access-7pqdv\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-5vrjw\" (UID: \"7063a11c-2d85-47f0-85ca-61cf4949e10d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5vrjw" Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.125310 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f89b4da4-a74f-4f12-b056-05f201bedabd-cert\") pod \"openstack-operator-controller-manager-5b95c8954b-46gk6\" (UID: \"f89b4da4-a74f-4f12-b056-05f201bedabd\") " pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-46gk6" Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.125357 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfjd2\" (UniqueName: \"kubernetes.io/projected/7498c8de-da98-47fd-8096-58cb4f1c4f87-kube-api-access-qfjd2\") pod \"watcher-operator-controller-manager-7f554bff7b-mkzqf\" (UID: \"7498c8de-da98-47fd-8096-58cb4f1c4f87\") " pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-mkzqf" Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.131470 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5458f77c4-52242" Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.151373 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfjd2\" (UniqueName: \"kubernetes.io/projected/7498c8de-da98-47fd-8096-58cb4f1c4f87-kube-api-access-qfjd2\") pod \"watcher-operator-controller-manager-7f554bff7b-mkzqf\" (UID: \"7498c8de-da98-47fd-8096-58cb4f1c4f87\") " pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-mkzqf" Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.232258 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pqdv\" (UniqueName: \"kubernetes.io/projected/7063a11c-2d85-47f0-85ca-61cf4949e10d-kube-api-access-7pqdv\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-5vrjw\" (UID: \"7063a11c-2d85-47f0-85ca-61cf4949e10d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5vrjw" Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.232631 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f89b4da4-a74f-4f12-b056-05f201bedabd-cert\") pod \"openstack-operator-controller-manager-5b95c8954b-46gk6\" (UID: \"f89b4da4-a74f-4f12-b056-05f201bedabd\") " pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-46gk6" Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.232787 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg6gt\" (UniqueName: \"kubernetes.io/projected/f89b4da4-a74f-4f12-b056-05f201bedabd-kube-api-access-tg6gt\") pod \"openstack-operator-controller-manager-5b95c8954b-46gk6\" (UID: \"f89b4da4-a74f-4f12-b056-05f201bedabd\") " pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-46gk6" Oct 12 05:54:54 crc kubenswrapper[4930]: E1012 05:54:54.233380 4930 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 12 05:54:54 crc kubenswrapper[4930]: E1012 05:54:54.233465 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f89b4da4-a74f-4f12-b056-05f201bedabd-cert podName:f89b4da4-a74f-4f12-b056-05f201bedabd nodeName:}" failed. No retries permitted until 2025-10-12 05:54:54.733442638 +0000 UTC m=+827.275544403 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f89b4da4-a74f-4f12-b056-05f201bedabd-cert") pod "openstack-operator-controller-manager-5b95c8954b-46gk6" (UID: "f89b4da4-a74f-4f12-b056-05f201bedabd") : secret "webhook-server-cert" not found Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.246242 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-mkzqf" Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.258345 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7b7fb68549-2p7m6"] Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.272681 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84b9b84486-ckctw"] Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.273635 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pqdv\" (UniqueName: \"kubernetes.io/projected/7063a11c-2d85-47f0-85ca-61cf4949e10d-kube-api-access-7pqdv\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-5vrjw\" (UID: \"7063a11c-2d85-47f0-85ca-61cf4949e10d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5vrjw" Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.274043 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg6gt\" (UniqueName: \"kubernetes.io/projected/f89b4da4-a74f-4f12-b056-05f201bedabd-kube-api-access-tg6gt\") pod \"openstack-operator-controller-manager-5b95c8954b-46gk6\" (UID: \"f89b4da4-a74f-4f12-b056-05f201bedabd\") " pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-46gk6" Oct 12 05:54:54 crc kubenswrapper[4930]: W1012 05:54:54.305995 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a7f54b7_1891_4e3a_a768_e937269bd384.slice/crio-e925f6aa51314ab3eb15bfb8fcd8ebd37db1ebd16950db5e942edaf8b07c8020 WatchSource:0}: Error finding container e925f6aa51314ab3eb15bfb8fcd8ebd37db1ebd16950db5e942edaf8b07c8020: Status 404 returned error can't find the container with id e925f6aa51314ab3eb15bfb8fcd8ebd37db1ebd16950db5e942edaf8b07c8020 Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.310257 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5vrjw" Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.443334 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b56dea-8d10-4b11-b437-fc38320417c9-cert\") pod \"openstack-baremetal-operator-controller-manager-5956dffb7bwrr99\" (UID: \"20b56dea-8d10-4b11-b437-fc38320417c9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bwrr99" Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.475385 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b56dea-8d10-4b11-b437-fc38320417c9-cert\") pod \"openstack-baremetal-operator-controller-manager-5956dffb7bwrr99\" (UID: \"20b56dea-8d10-4b11-b437-fc38320417c9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bwrr99" Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.557074 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bwrr99" Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.558227 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-858f76bbdd-2fq48"] Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.620259 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-85d5d9dd78-8vdcg"] Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.747525 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f89b4da4-a74f-4f12-b056-05f201bedabd-cert\") pod \"openstack-operator-controller-manager-5b95c8954b-46gk6\" (UID: \"f89b4da4-a74f-4f12-b056-05f201bedabd\") " pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-46gk6" Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.760011 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f89b4da4-a74f-4f12-b056-05f201bedabd-cert\") pod \"openstack-operator-controller-manager-5b95c8954b-46gk6\" (UID: \"f89b4da4-a74f-4f12-b056-05f201bedabd\") " pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-46gk6" Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.869112 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-vf2tl" event={"ID":"9e6cd80c-4aa5-40de-81fc-10d0329f5481","Type":"ContainerStarted","Data":"dfb461127062892b3aa41b314becddd024275f5012ee19c39d96f5795cb19b99"} Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.870159 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-8vdcg" event={"ID":"f883b7e0-65a4-4cea-9d02-7cdedfaf9ec2","Type":"ContainerStarted","Data":"79dfec76141c9835a41e92ff871719e29fdcf41cc584fe17f24df54e82f3da27"} Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.871061 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-2fq48" event={"ID":"25276148-1b95-4b4d-9f18-ef97020632a7","Type":"ContainerStarted","Data":"58a5795020111f645e2d963395ae0aedc690907ae3e213478c4cade2b76c2bd1"} Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.871831 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-ckctw" event={"ID":"93426c54-3448-421e-aa85-b03c466c7bf8","Type":"ContainerStarted","Data":"a7c4985f452d40e9dc0d3fe4453bad87aad9748ed482a83bfd5224b5bf0dd873"} Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.872553 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-2p7m6" event={"ID":"5a7f54b7-1891-4e3a-a768-e937269bd384","Type":"ContainerStarted","Data":"e925f6aa51314ab3eb15bfb8fcd8ebd37db1ebd16950db5e942edaf8b07c8020"} Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.895321 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-46gk6" Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.964868 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7ffbcb7588-zh959"] Oct 12 05:54:54 crc kubenswrapper[4930]: I1012 05:54:54.977899 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-vn5dq"] Oct 12 05:54:55 crc kubenswrapper[4930]: I1012 05:54:55.004569 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wk6qh"] Oct 12 05:54:55 crc kubenswrapper[4930]: W1012 05:54:55.018045 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe5f36d2_82b4_4bce_a189_7844dae5dc0e.slice/crio-c32521cba3ca90319f5602855d092dd0c564bbcc381001090cb77cc650009dc5 WatchSource:0}: Error finding container c32521cba3ca90319f5602855d092dd0c564bbcc381001090cb77cc650009dc5: Status 404 returned error can't find the container with id c32521cba3ca90319f5602855d092dd0c564bbcc381001090cb77cc650009dc5 Oct 12 05:54:55 crc kubenswrapper[4930]: W1012 05:54:55.027081 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37d1af03_8709_4b4a_8d4c_bda1dbefff59.slice/crio-51b1796e7a77efc42fe6e7fae3a23c20b8028354ec3f88c04a6123059a37d799 WatchSource:0}: Error finding container 51b1796e7a77efc42fe6e7fae3a23c20b8028354ec3f88c04a6123059a37d799: Status 404 returned error can't find the container with id 51b1796e7a77efc42fe6e7fae3a23c20b8028354ec3f88c04a6123059a37d799 Oct 12 05:54:55 crc kubenswrapper[4930]: I1012 05:54:55.029872 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-qcc9w"] Oct 12 05:54:55 crc kubenswrapper[4930]: I1012 05:54:55.038468 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-68b6c87b68-l2x8t"] Oct 12 05:54:55 crc kubenswrapper[4930]: I1012 05:54:55.053815 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5df598886f-hx224"] Oct 12 05:54:55 crc kubenswrapper[4930]: I1012 05:54:55.066124 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5f67fbc655-ql78t"] Oct 12 05:54:55 crc kubenswrapper[4930]: W1012 05:54:55.074509 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe819f44_6224_4b45_a33c_6b6ef8e73b92.slice/crio-88fcd30e5b84298dcd90bb6345cb57de864b9a09ab1180b092141537b87a345b WatchSource:0}: Error finding container 88fcd30e5b84298dcd90bb6345cb57de864b9a09ab1180b092141537b87a345b: Status 404 returned error can't find the container with id 88fcd30e5b84298dcd90bb6345cb57de864b9a09ab1180b092141537b87a345b Oct 12 05:54:55 crc kubenswrapper[4930]: I1012 05:54:55.487939 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f554bff7b-mkzqf"] Oct 12 05:54:55 crc kubenswrapper[4930]: I1012 05:54:55.514271 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9c5c78d49-ksz2s"] Oct 12 05:54:55 crc kubenswrapper[4930]: I1012 
05:54:55.518156 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5vrjw"] Oct 12 05:54:55 crc kubenswrapper[4930]: I1012 05:54:55.523904 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-db6d7f97b-862f6"] Oct 12 05:54:55 crc kubenswrapper[4930]: I1012 05:54:55.528289 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-79df5fb58c-2hx92"] Oct 12 05:54:55 crc kubenswrapper[4930]: I1012 05:54:55.532025 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-656bcbd775-wgp8z"] Oct 12 05:54:55 crc kubenswrapper[4930]: I1012 05:54:55.535841 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-67cfc6749b-wk5xm"] Oct 12 05:54:55 crc kubenswrapper[4930]: I1012 05:54:55.541247 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-79d585cb66-2brdt"] Oct 12 05:54:55 crc kubenswrapper[4930]: I1012 05:54:55.545432 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5458f77c4-52242"] Oct 12 05:54:55 crc kubenswrapper[4930]: W1012 05:54:55.550253 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ac93070_497c_48ca_a58a_fb47657e6c2a.slice/crio-f21bc5c0c3fb762048348e99e1213e4563008337c13dd55d6156a363eded2a5d WatchSource:0}: Error finding container f21bc5c0c3fb762048348e99e1213e4563008337c13dd55d6156a363eded2a5d: Status 404 returned error can't find the container with id f21bc5c0c3fb762048348e99e1213e4563008337c13dd55d6156a363eded2a5d Oct 12 05:54:55 crc kubenswrapper[4930]: E1012 05:54:55.566654 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:315e558023b41ac1aa215082096995a03810c5b42910a33b00427ffcac9c6a14,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5gp4l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-79df5fb58c-2hx92_openstack-operators(8ac93070-497c-48ca-a58a-fb47657e6c2a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 12 05:54:55 crc kubenswrapper[4930]: E1012 05:54:55.567468 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l22m4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-db6d7f97b-862f6_openstack-operators(27d33d74-f1d1-4208-aead-8f6091c524df): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 12 05:54:55 crc kubenswrapper[4930]: E1012 05:54:55.569480 4930 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p48cg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5458f77c4-52242_openstack-operators(18286f44-9f6c-4699-9d4d-afa069c980ed): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 12 05:54:55 crc kubenswrapper[4930]: I1012 05:54:55.586992 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bwrr99"] Oct 12 05:54:55 crc kubenswrapper[4930]: W1012 05:54:55.587175 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7134f9eb_cfa6_41b8_a245_2f1b17669ca4.slice/crio-ff409417edf9eb1e2f0eb03db120447f218a366e2b933d054d733166b05c54bd WatchSource:0}: Error finding container ff409417edf9eb1e2f0eb03db120447f218a366e2b933d054d733166b05c54bd: Status 404 returned error can't find the container with id ff409417edf9eb1e2f0eb03db120447f218a366e2b933d054d733166b05c54bd Oct 12 05:54:55 crc kubenswrapper[4930]: E1012 05:54:55.610219 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:5cfb2ae1092445950b39dd59caa9a8c9367f42fb8353a8c3848d3bc729f24492,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ddgzt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-656bcbd775-wgp8z_openstack-operators(df7a25ba-c240-4d05-a117-0040e24bb33c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 12 05:54:55 crc kubenswrapper[4930]: E1012 05:54:55.623816 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:a17fc270857869fd1efe5020b2a1cb8c2abbd838f08de88f3a6a59e8754ec351,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent@sha256:03b4f3db4b373515f7e4095984b97197c05a14f87b2a0a525eb5d7be1d7bda66,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner@sha256:6722a752fb7cbffbae811f6ad6567120fbd4ebbe8c38a83ec2df02850a3276bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api@sha256:2115452234aedb505ed4efc6cd9b9a4ce3b9809aa7d0128d8fbeeee84dad1a69,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator@sha256:50597a8eaa6c4383f357574dcab8358b698729797b4156d932985a08ab86b7cd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener@sha256:cb4997d62c7b2534233a676cb92e19cf85dda07e2fb9fa642c28aab30489f69a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier@sha256:1ccbf3f6cf24c9ee91bed71467491e22b8cb4b95bce90250f4174fae936b0fa1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:cbe345acb37e57986ecf6685d28c72d0e639bdb493a18e9d3ba947d6c3a16384,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener@sha256:e7dcc3bf23d5e0393ac173e3c43d4ae85f4613a4fd16b3c147dc32ae491d49bf,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker@sha256:2a1a8b582c6e4cc31081bd8b0887acf45e31c1d14596c4e361d27d08fef0debf,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:86daeb9c834bfcedb533086dff59a6b5b6e832b94ce2a9116337f8736bb80032,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi@sha256:6d28de018f6e1672e775a75735e3bc16b63da41acd8fb5196ee0b06856c07133,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter@sha256:7211a617ec657701ca819aa0ba28e1d5750f5bf2c1391b755cc4a48cc360b0fa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification@sha256:c5fc9b72fc593bcf3b569c7ed24a256448eb1afab1504e668a3822e978be1306,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core@sha2
56:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:85c75d60e1bd2f8a9ea0a2bb21a8df64c0a6f7b504cc1a05a355981d4b90e92f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup@sha256:88b99249f15470f359fb554f7f3a56974b743f4655e3f0c982c0260f75a67697,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler@sha256:e861d66785047d39eb68d9bac23e3f57ac84d9bd95593502d9b3b913b99fd1a4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume@sha256:b95f09bf3d259f9eacf3b63931977483f5c3c332f49b95ee8a69d8e3fb71d082,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api@sha256:6fc7801c0d18d41b9f11484b1cdb342de9cebd93072ec2205dbe40945715184f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9@sha256:d4d824b80cbed683543d9e8c7045ac97e080774f45a5067ccbca26404e067821,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central@sha256:182ec75938d8d3fb7d8f916373368add24062fec90489aa57776a81d0b36ea20,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns@sha256:9507ba5ab74cbae902e2dc07f89c7b3b5b76d8079e444365fe0eee6000fd7aaa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer@sha256:17db080dcc4099f8a20aa0f238b6bca5c104672ae46743adeab9d1637725ecaa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound@sha256:fd55cf3d73bfdc518419c9ba0b0cbef275140ae2d3bd0342a7310f81d57c2d78,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker@sha256:d164a9bd383f50df69fc22e7422f4650cd5076c90ed19278fc0f04e54345a63d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr@sha256:6beffe7d0bd75f9d1f495aeb7ab2334a2414af2c581d4833363df8441ed01018,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler@sha256:581b65b646301e0fcb07582150ba63438f1353a85bf9acf1eb2acb4ce71c58bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron@sha256:2308c7b6c3d0aabbadfc9a06d84d67d2243f27fe8eed740ee96b1ce910203f62,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/opensta
ck-neutron-dhcp-agent@sha256:9cf0ca292340f1f978603955ef682effbf24316d6e2376b1c89906d84c3f06d0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent@sha256:58f678016d7f6c8fe579abe886fd138ef853642faa6766ca60639feac12d82ac,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent@sha256:46f92909153aaf03a585374b77d103c536509747e3270558d9a533295c46a7c5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent@sha256:7fe367f51638c5c302fd3f8e66a31b09cb3b11519a7f72ef142b6c6fe8b91694,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:9ebf424d4107275a2e3f21f7a18ef257ff2f97c1298109ac7c802a5a4f4794f2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api@sha256:4fcbe0d9a3c845708ecc32102ad4abbcbd947d87e5cf91f186de75b5d84ec681,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn@sha256:58a4e9a4dea86635c93ce37a2bb3c60ece62b3d656f6ee6a8845347cbb3e90fd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine@sha256:6f2b843bc9f4ceb1ee873972d69e6bae6e1dbd378b486995bc3697d8bcff6339,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon@sha256:03b4bb79b71d5ca7792d19c4c0ee08a5e5a407ad844c087305c42dd909ee7490,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached@sha256:773daada6402d9cad089cdc809d6c0335456d057ac1a25441ab5d82add2f70f4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis@sha256:7323406a63fb3fdbb3eea4da0f7e8ed89c94c9bd0ad5ecd6c18fa4a4c2c550c4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api@sha256:7ae82068011e2d2e5ddc88c943fd32ff4a11902793e7a1df729811b2e27122a0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor@sha256:0c762c15d9d98d39cc9dc3d1f9a70f9188fef58d4e2f3b0c69c896cab8da5e48,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector@sha256:febf65561eeef5b36b70d0d65ee83f6451e43ec97bfab4d826e14215da6ff19b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-ante
lope-centos9/openstack-ironic-neutron-agent@sha256:b8aadfc3d547c5ef1e27fcb573d4760cf8c2f2271eefe1793c35a0d46b640837,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe@sha256:ecc91fd5079ee6d0c6ae1b11e97da790e33864d0e1930e574f959da2bddfa59a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent@sha256:2e981e93f99c929a3f04e5e41c8f645d44d390a9aeee3c5193cce7ec2edcbf3a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone@sha256:1e5714637b6e1a24c2858fe6d9bbb3f00bc61d69ad74a657b1c23682bf4cb2b7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api@sha256:35b8dcf27dc3b67f3840fa0e693ff312f74f7e22c634dff206a5c4d0133c716c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler@sha256:e109e4863e05e803dbfe04917756fd52231c560c65353170a2000be6cc2bb53d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share@sha256:6df0bebd9318ce11624413249e7e9781311638f276f8877668d3b382fe90e62f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:56b75d97f4a48c8cf58b3a7c18c43618efb308bf0188124f6301142e61299b0c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils@sha256:a51ed62767206067aa501142dbf01f20b3d65325d30faf1b4d6424d5b17dfba5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api@sha256:592e3cd32d3cc97a69093ad905b449aa374ffbb1b2644b738bb6c1434476d1f6,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor@sha256:9596452e283febbe08204d0ef0fd1992af3395d0969f7ac76663ed7c8be5b4d4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy@sha256:d61005a10bef1b37762a8a41e6755c1169241e36cc5f92886bca6f4f6b9c381a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler@sha256:e6a4335bcbeed3cd3e73ac879f754e314761e4a417a67539ca88e96a79346328,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api@sha256:97d88fc53421b699fc91983313d7beec4a0f177089e95bdf5ba15c3f521db9a9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-
health-manager@sha256:5365e5c9c3ad2ede1b6945255b2cc6b009d642c39babdf25e0655282cfa646fe,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping@sha256:5b55795d774e0ea160ff8a7fd491ed41cf2d93c7d821694abb3a879eaffcefeb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog@sha256:26e955c46a6063eafcfeb79430bf3d9268dbe95687c00e63a624b3ec5a846f5a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker@sha256:58939baa18ab09e2b24996c5f3665ae52274b781f661ea06a67c991e9a832d5a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:b8bff6857fec93c3c1521f1a8c23de21bcb86fc0f960972e81f6c3f95d4185be,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather@sha256:943eee724277e252795909137538a553ef5284c8103ad01b9be7b0138c66d14d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi@sha256:d97b08fd421065c8c33a523973822ac468500cbe853069aa9214393fbda7a908,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base@sha256:289dea3beea1cd4405895fc42e44372b35e4a941e31c59e102c333471a3ca9b7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server@sha256:9b19894fa67a81bf8ba4159b55b49f38877c670aeb97e2021c341cef2a9294e4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd@sha256:ea164961ad30453ad0301c6b73364e1f1024f689634c88dd98265f9c7048e31d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server@sha256:6f9f2ea45f0271f6da8eb05a5f74cf5ce6769479346f5c2f407ee6f31a9c7ff3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:59448516174fc3bab679b9a8dd62cb9a9d16b5734aadbeb98e960e3b7c79bd22,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:adcdeb8ecd601fb03c3b0901d5b5111af2ca48f7dd443e22224db6daaf08f5d0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account@sha256:2bf32d9b95899d7637dfe19d07cf1ecc9a06593984faff57a3c0dce060012edb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container@sha256:7a452cd18b64d522e8a1e25bdcea543e9fe5f5b76e1c5e044c2b5334e06a326b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/
podified-antelope-centos9/openstack-swift-object@sha256:6a46aa13aa359b8e782a22d67db42db02bbf2bb7e35df4b684ac1daeda38cde3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server@sha256:f6824854bea6b2acbb00c34639799b4744818d4adbdd40e37dc5088f9ae18d58,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all@sha256:a66d2fdc21f25c690f02e643d2666dbe7df43a64cd55086ec33d6755e6d809b9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api@sha256:30701a65382430570f6fb35621f64f1003f727b6da745ce84fb1a90436ee2350,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier@sha256:b9a657c51bbcc236e6c906a6df6c42cd2a28bab69e7ab58b0e9ced12295b2d87,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine@sha256:fd65fb5c9710c46aa1c31e65a51cd5c23ec35cf68c2452d421f919f2aa9b6255,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-98xnw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-5956dffb7bwrr99_openstack-operators(20b56dea-8d10-4b11-b437-fc38320417c9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 12 05:54:55 crc kubenswrapper[4930]: I1012 05:54:55.627251 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b95c8954b-46gk6"] Oct 12 05:54:55 crc kubenswrapper[4930]: W1012 05:54:55.655294 4930 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf89b4da4_a74f_4f12_b056_05f201bedabd.slice/crio-a3d60b6cd93d344af2a27d26c4f47bccba7d888e7a2b207067b47255002c0f34 WatchSource:0}: Error finding container a3d60b6cd93d344af2a27d26c4f47bccba7d888e7a2b207067b47255002c0f34: Status 404 returned error can't find the container with id a3d60b6cd93d344af2a27d26c4f47bccba7d888e7a2b207067b47255002c0f34
Oct 12 05:54:55 crc kubenswrapper[4930]: E1012 05:54:55.761443 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-862f6" podUID="27d33d74-f1d1-4208-aead-8f6091c524df"
Oct 12 05:54:55 crc kubenswrapper[4930]: E1012 05:54:55.765571 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-2hx92" podUID="8ac93070-497c-48ca-a58a-fb47657e6c2a"
Oct 12 05:54:55 crc kubenswrapper[4930]: E1012 05:54:55.800523 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5458f77c4-52242" podUID="18286f44-9f6c-4699-9d4d-afa069c980ed"
Oct 12 05:54:55 crc kubenswrapper[4930]: I1012 05:54:55.895566 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wk6qh" event={"ID":"fe5f36d2-82b4-4bce-a189-7844dae5dc0e","Type":"ContainerStarted","Data":"c32521cba3ca90319f5602855d092dd0c564bbcc381001090cb77cc650009dc5"}
Oct 12 05:54:55 crc kubenswrapper[4930]: E1012 05:54:55.910349 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-wgp8z" podUID="df7a25ba-c240-4d05-a117-0040e24bb33c"
Oct 12 05:54:55 crc kubenswrapper[4930]: E1012 05:54:55.941558 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bwrr99" podUID="20b56dea-8d10-4b11-b437-fc38320417c9"
Oct 12 05:54:55 crc kubenswrapper[4930]: I1012 05:54:55.941959 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-mkzqf" event={"ID":"7498c8de-da98-47fd-8096-58cb4f1c4f87","Type":"ContainerStarted","Data":"5da337fb9a166ac7f08ae8ae19ea1e65dee085c8dbb2ff8bf08517343a80f2be"}
Oct 12 05:54:55 crc kubenswrapper[4930]: I1012 05:54:55.959484 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-46gk6" event={"ID":"f89b4da4-a74f-4f12-b056-05f201bedabd","Type":"ContainerStarted","Data":"a3d60b6cd93d344af2a27d26c4f47bccba7d888e7a2b207067b47255002c0f34"}
Oct 12 05:54:55 crc kubenswrapper[4930]: I1012 05:54:55.961120 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-2brdt" event={"ID":"7134f9eb-cfa6-41b8-a245-2f1b17669ca4","Type":"ContainerStarted","Data":"ff409417edf9eb1e2f0eb03db120447f218a366e2b933d054d733166b05c54bd"}
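Every pod_workers.go failure above carries the same message, ErrImagePull: "pull QPS exceeded". That error is client-side, not a registry rejection: the kubelet throttles how fast it issues image pulls, governed by the KubeletConfiguration fields registryPullQPS and registryBurst (stock defaults are 5 pulls/s with a burst of 10; the log does not print this node's actual values, so treat those numbers as an assumption). Roughly twenty operator deployments asked for images in the same instant here, so the burst bucket empties and the remainder fail fast. A minimal token-bucket sketch of that effect:

    # Token-bucket sketch of the kubelet's image-pull limiter. Assumes the
    # default registryPullQPS=5 and registryBurst=10; the real limiter lives
    # in client-go's flowcontrol package, this only illustrates the shape.
    def simulate_pulls(request_times, qps=5.0, burst=10):
        """Return (granted, rejected) for pull requests at the given times (s)."""
        tokens, last, granted, rejected = float(burst), 0.0, [], []
        for t in sorted(request_times):
            tokens = min(burst, tokens + (t - last) * qps)  # refill at qps/s, capped
            last = t
            if tokens >= 1.0:
                tokens -= 1.0
                granted.append(t)
            else:
                rejected.append(t)  # kubelet surfaces "pull QPS exceeded"
        return granted, rejected

    # Roughly what the log shows: ~20 operator images requested at once.
    granted, rejected = simulate_pulls([0.0] * 20)
    print(len(granted), "granted,", len(rejected), "rejected")  # 10 granted, 10 rejected

The failures are transient by design: each rejected pod is requeued, and where this churn is expected the limit can be raised, or disabled entirely with registryPullQPS: 0, in the kubelet configuration.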
4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-2hx92" event={"ID":"8ac93070-497c-48ca-a58a-fb47657e6c2a","Type":"ContainerStarted","Data":"d6f69238014f355bbfa37a9e67f455b45f0a8f60346966d05b1e6602b9f35941"} Oct 12 05:54:55 crc kubenswrapper[4930]: I1012 05:54:55.966594 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-2hx92" event={"ID":"8ac93070-497c-48ca-a58a-fb47657e6c2a","Type":"ContainerStarted","Data":"f21bc5c0c3fb762048348e99e1213e4563008337c13dd55d6156a363eded2a5d"} Oct 12 05:54:55 crc kubenswrapper[4930]: E1012 05:54:55.973498 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:315e558023b41ac1aa215082096995a03810c5b42910a33b00427ffcac9c6a14\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-2hx92" podUID="8ac93070-497c-48ca-a58a-fb47657e6c2a" Oct 12 05:54:55 crc kubenswrapper[4930]: I1012 05:54:55.977550 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-wgp8z" event={"ID":"df7a25ba-c240-4d05-a117-0040e24bb33c","Type":"ContainerStarted","Data":"2cd5725e288cc19ae22eec20a6c036596a39ec2974ff0a52a4ba3ad26ff8b03e"} Oct 12 05:54:55 crc kubenswrapper[4930]: E1012 05:54:55.981862 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:5cfb2ae1092445950b39dd59caa9a8c9367f42fb8353a8c3848d3bc729f24492\\\"\"" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-wgp8z" podUID="df7a25ba-c240-4d05-a117-0040e24bb33c" Oct 12 05:54:55 crc kubenswrapper[4930]: I1012 05:54:55.988806 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-ql78t" event={"ID":"fe819f44-6224-4b45-a33c-6b6ef8e73b92","Type":"ContainerStarted","Data":"88fcd30e5b84298dcd90bb6345cb57de864b9a09ab1180b092141537b87a345b"} Oct 12 05:54:56 crc kubenswrapper[4930]: I1012 05:54:56.004774 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-l2x8t" event={"ID":"97b9da7d-7c47-46b5-ac4a-f190a92dceef","Type":"ContainerStarted","Data":"e4cb0f9e90cad1445f8cd3565958874587f2e0925cdc9f9eaf935701fc9f11ad"} Oct 12 05:54:56 crc kubenswrapper[4930]: I1012 05:54:56.008880 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5458f77c4-52242" event={"ID":"18286f44-9f6c-4699-9d4d-afa069c980ed","Type":"ContainerStarted","Data":"f23fc015ed2d6325c65a05eff30ee44c7c280102d23903435c7ab45b6df3efd3"} Oct 12 05:54:56 crc kubenswrapper[4930]: I1012 05:54:56.008910 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5458f77c4-52242" event={"ID":"18286f44-9f6c-4699-9d4d-afa069c980ed","Type":"ContainerStarted","Data":"ec0ac56fc407a7859909bd22a4787f8c41090cf231661c931f71e6ffb4793b17"} Oct 12 05:54:56 crc kubenswrapper[4930]: E1012 05:54:56.021866 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" pod="openstack-operators/test-operator-controller-manager-5458f77c4-52242" podUID="18286f44-9f6c-4699-9d4d-afa069c980ed" Oct 12 05:54:56 crc kubenswrapper[4930]: I1012 05:54:56.022631 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-wk5xm" event={"ID":"89d21577-9d93-4003-bae1-3b66e679eeeb","Type":"ContainerStarted","Data":"34d24fb8e979824bbbf37f145c2d53a70f7e4b84ffdb2bca2b433da03fe72489"} Oct 12 05:54:56 crc kubenswrapper[4930]: I1012 05:54:56.024020 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5df598886f-hx224" event={"ID":"510d5f0a-5f67-4171-99e6-1de6734e7bdf","Type":"ContainerStarted","Data":"fb45d70629c0bdacff6c663f6e6c3ee505b6e8ff2cf808112cbbf56f5797ac57"} Oct 12 05:54:56 crc kubenswrapper[4930]: I1012 05:54:56.025768 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bwrr99" event={"ID":"20b56dea-8d10-4b11-b437-fc38320417c9","Type":"ContainerStarted","Data":"61a1c17a78d166929558686c3c8d732db1d734449a1085448310c91378657543"} Oct 12 05:54:56 crc kubenswrapper[4930]: I1012 05:54:56.028598 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5vrjw" event={"ID":"7063a11c-2d85-47f0-85ca-61cf4949e10d","Type":"ContainerStarted","Data":"3a33ba42ee41fc7ad7cd77f766f2c9c0cc7e81fe752c0b4d06988a19b698a296"} Oct 12 05:54:56 crc kubenswrapper[4930]: E1012 05:54:56.030812 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:a17fc270857869fd1efe5020b2a1cb8c2abbd838f08de88f3a6a59e8754ec351\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bwrr99" podUID="20b56dea-8d10-4b11-b437-fc38320417c9" Oct 12 05:54:56 crc kubenswrapper[4930]: I1012 05:54:56.038688 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-vn5dq" event={"ID":"5383069b-8f72-4173-97bf-34ffc36c235e","Type":"ContainerStarted","Data":"38e84927669374871e30e82d1789954f0c2edb8bd7dc6da2bb96244d644991e3"} Oct 12 05:54:56 crc kubenswrapper[4930]: I1012 05:54:56.040926 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-qcc9w" event={"ID":"37d1af03-8709-4b4a-8d4c-bda1dbefff59","Type":"ContainerStarted","Data":"51b1796e7a77efc42fe6e7fae3a23c20b8028354ec3f88c04a6123059a37d799"} Oct 12 05:54:56 crc kubenswrapper[4930]: I1012 05:54:56.042529 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-ksz2s" event={"ID":"adf9f01b-70b6-46b9-acde-c1eedc16f299","Type":"ContainerStarted","Data":"19580a79f1ee1487e143951a2039e10b797843baca75646592a16ee277c23c08"} Oct 12 05:54:56 crc kubenswrapper[4930]: I1012 05:54:56.094329 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-862f6" event={"ID":"27d33d74-f1d1-4208-aead-8f6091c524df","Type":"ContainerStarted","Data":"26e265b29d454b929319685d0d89752ba547c14a801d14f17bc0a73acd37fe65"} Oct 12 
Oct 12 05:54:56 crc kubenswrapper[4930]: I1012 05:54:56.094375 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-862f6" event={"ID":"27d33d74-f1d1-4208-aead-8f6091c524df","Type":"ContainerStarted","Data":"b6281ee04a0ac6cf96fa8a373ef6a2d5e2cbd346b606cae6b339049c6f113190"}
Oct 12 05:54:56 crc kubenswrapper[4930]: E1012 05:54:56.095790 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-862f6" podUID="27d33d74-f1d1-4208-aead-8f6091c524df"
Oct 12 05:54:56 crc kubenswrapper[4930]: I1012 05:54:56.100611 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-zh959" event={"ID":"ee89a0b0-868b-4b2e-a274-c5a4ee40a872","Type":"ContainerStarted","Data":"9bf43b5f1dbf77e40223e3560e9774c5bdb50b78f7526053b257de4dff203879"}
Oct 12 05:54:57 crc kubenswrapper[4930]: I1012 05:54:57.116610 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-46gk6" event={"ID":"f89b4da4-a74f-4f12-b056-05f201bedabd","Type":"ContainerStarted","Data":"2792948bab25927a794490e1a7a4fa9b043d9cb150c0504975df41d80cfd4b12"}
Oct 12 05:54:57 crc kubenswrapper[4930]: I1012 05:54:57.117097 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-46gk6" event={"ID":"f89b4da4-a74f-4f12-b056-05f201bedabd","Type":"ContainerStarted","Data":"8a1c3913c8752e92d2a02a4352225f8430be5dae19106de84ba0cd03e942721e"}
Oct 12 05:54:57 crc kubenswrapper[4930]: I1012 05:54:57.117187 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-46gk6"
Oct 12 05:54:57 crc kubenswrapper[4930]: I1012 05:54:57.143203 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bwrr99" event={"ID":"20b56dea-8d10-4b11-b437-fc38320417c9","Type":"ContainerStarted","Data":"b327b229c48e16fb343212285f82cbc22f3992f1a1351ce8ae5c51270735b28d"}
Oct 12 05:54:57 crc kubenswrapper[4930]: E1012 05:54:57.149517 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:a17fc270857869fd1efe5020b2a1cb8c2abbd838f08de88f3a6a59e8754ec351\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bwrr99" podUID="20b56dea-8d10-4b11-b437-fc38320417c9"
Oct 12 05:54:57 crc kubenswrapper[4930]: I1012 05:54:57.159484 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-46gk6" podStartSLOduration=4.159459955 podStartE2EDuration="4.159459955s" podCreationTimestamp="2025-10-12 05:54:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:54:57.153823422 +0000 UTC m=+829.695925187" watchObservedRunningTime="2025-10-12 05:54:57.159459955 +0000 UTC m=+829.701561720"
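The pod_startup_latency_tracker entry above decodes as: podStartE2EDuration = observedRunningTime - podCreationTimestamp, while podStartSLOduration additionally excludes time spent pulling images. For this pod no pull was needed (firstStartedPulling/lastFinishedPulling hold the zero timestamp 0001-01-01), so the two values coincide. Recomputing from the logged timestamps:

    # Recompute podStartE2EDuration from the tracker entry above
    # (timestamps copied from the log; creation time is second-granular).
    from datetime import datetime, timezone

    created = datetime(2025, 10, 12, 5, 54, 53, tzinfo=timezone.utc)
    # .159459955s truncated to microseconds for datetime:
    running = datetime(2025, 10, 12, 5, 54, 57, 159459, tzinfo=timezone.utc)
    print((running - created).total_seconds())  # 4.159459, matching 4.159459955s

The rabbitmq-cluster-operator entry further below shows the other case: podStartE2EDuration="13.300785688s" with a pull window of about 9.726s (firstStartedPulling 05:54:55.537 to lastFinishedPulling 05:55:05.263), leaving podStartSLOduration≈3.5747.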
kubenswrapper[4930]: I1012 05:54:57.162700 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-wgp8z" event={"ID":"df7a25ba-c240-4d05-a117-0040e24bb33c","Type":"ContainerStarted","Data":"ca08853ca4e85a16a85eb6c36b57b8e34fa258018e63500b1611537150c9ad22"} Oct 12 05:54:57 crc kubenswrapper[4930]: E1012 05:54:57.164976 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:5cfb2ae1092445950b39dd59caa9a8c9367f42fb8353a8c3848d3bc729f24492\\\"\"" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-wgp8z" podUID="df7a25ba-c240-4d05-a117-0040e24bb33c" Oct 12 05:54:57 crc kubenswrapper[4930]: E1012 05:54:57.166232 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" pod="openstack-operators/test-operator-controller-manager-5458f77c4-52242" podUID="18286f44-9f6c-4699-9d4d-afa069c980ed" Oct 12 05:54:57 crc kubenswrapper[4930]: E1012 05:54:57.166639 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:315e558023b41ac1aa215082096995a03810c5b42910a33b00427ffcac9c6a14\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-2hx92" podUID="8ac93070-497c-48ca-a58a-fb47657e6c2a" Oct 12 05:54:57 crc kubenswrapper[4930]: E1012 05:54:57.170639 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-862f6" podUID="27d33d74-f1d1-4208-aead-8f6091c524df" Oct 12 05:54:58 crc kubenswrapper[4930]: E1012 05:54:58.176199 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:5cfb2ae1092445950b39dd59caa9a8c9367f42fb8353a8c3848d3bc729f24492\\\"\"" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-wgp8z" podUID="df7a25ba-c240-4d05-a117-0040e24bb33c" Oct 12 05:54:58 crc kubenswrapper[4930]: E1012 05:54:58.176878 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:a17fc270857869fd1efe5020b2a1cb8c2abbd838f08de88f3a6a59e8754ec351\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bwrr99" podUID="20b56dea-8d10-4b11-b437-fc38320417c9" Oct 12 05:55:04 crc kubenswrapper[4930]: I1012 05:55:04.903758 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5b95c8954b-46gk6" Oct 12 05:55:06 crc kubenswrapper[4930]: I1012 05:55:06.258216 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/glance-operator-controller-manager-84b9b84486-ckctw" event={"ID":"93426c54-3448-421e-aa85-b03c466c7bf8","Type":"ContainerStarted","Data":"c2828afd8ec7ea29bcda373afb5af7d8653d752bf9975ea4d917c8cc9c08b3b2"} Oct 12 05:55:06 crc kubenswrapper[4930]: I1012 05:55:06.263792 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-vf2tl" event={"ID":"9e6cd80c-4aa5-40de-81fc-10d0329f5481","Type":"ContainerStarted","Data":"1e96525d72e66955299c6b1762cb46951eff39290ab9a9523973b2b601e801cb"} Oct 12 05:55:06 crc kubenswrapper[4930]: I1012 05:55:06.270196 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5vrjw" event={"ID":"7063a11c-2d85-47f0-85ca-61cf4949e10d","Type":"ContainerStarted","Data":"44dc732ca0306d4f38c2c511b411631060857a4b110eedffd930626d504120e3"} Oct 12 05:55:06 crc kubenswrapper[4930]: I1012 05:55:06.277265 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-vn5dq" event={"ID":"5383069b-8f72-4173-97bf-34ffc36c235e","Type":"ContainerStarted","Data":"67359b898e3596130b0c0a7888925c945e9b819191eb9749c5fb08403365d3a0"} Oct 12 05:55:06 crc kubenswrapper[4930]: I1012 05:55:06.284444 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-8vdcg" event={"ID":"f883b7e0-65a4-4cea-9d02-7cdedfaf9ec2","Type":"ContainerStarted","Data":"b62723b4a9d529e0002966f8f1154d80324322419f4ecfd8a067ce94cbad9ead"} Oct 12 05:55:06 crc kubenswrapper[4930]: I1012 05:55:06.284594 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-8vdcg" Oct 12 05:55:06 crc kubenswrapper[4930]: I1012 05:55:06.294361 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-2p7m6" event={"ID":"5a7f54b7-1891-4e3a-a768-e937269bd384","Type":"ContainerStarted","Data":"30aad2e45ea82e04b0bbbde13437336bb0100357470bf65da931a6ca66fe2cbf"} Oct 12 05:55:06 crc kubenswrapper[4930]: I1012 05:55:06.300801 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5vrjw" podStartSLOduration=3.574731566 podStartE2EDuration="13.300785688s" podCreationTimestamp="2025-10-12 05:54:53 +0000 UTC" firstStartedPulling="2025-10-12 05:54:55.537143728 +0000 UTC m=+828.079245493" lastFinishedPulling="2025-10-12 05:55:05.26319781 +0000 UTC m=+837.805299615" observedRunningTime="2025-10-12 05:55:06.299472455 +0000 UTC m=+838.841574220" watchObservedRunningTime="2025-10-12 05:55:06.300785688 +0000 UTC m=+838.842887453" Oct 12 05:55:06 crc kubenswrapper[4930]: I1012 05:55:06.303688 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wk6qh" event={"ID":"fe5f36d2-82b4-4bce-a189-7844dae5dc0e","Type":"ContainerStarted","Data":"c6ebfad60d9e603d94c28b385439e44cd55e81e97c62dad50fb00ac46db3de1f"} Oct 12 05:55:06 crc kubenswrapper[4930]: I1012 05:55:06.312301 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-2fq48" event={"ID":"25276148-1b95-4b4d-9f18-ef97020632a7","Type":"ContainerStarted","Data":"a0d0c354c87ae0804a0fabb322ab055991b8b069604bdf74891bb7facbab7f61"} Oct 12 05:55:06 
crc kubenswrapper[4930]: I1012 05:55:06.312344 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-2fq48" event={"ID":"25276148-1b95-4b4d-9f18-ef97020632a7","Type":"ContainerStarted","Data":"fbc1701e7c338c175df1b1bebee7b5b3ab86be1a28286f1dda5dff9236a1adce"} Oct 12 05:55:06 crc kubenswrapper[4930]: I1012 05:55:06.312828 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-2fq48" Oct 12 05:55:06 crc kubenswrapper[4930]: I1012 05:55:06.321039 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-zh959" event={"ID":"ee89a0b0-868b-4b2e-a274-c5a4ee40a872","Type":"ContainerStarted","Data":"0f4dc6a0490425247b09262e23bbd575aebea5292cc988d80a90caee00a4604a"} Oct 12 05:55:06 crc kubenswrapper[4930]: I1012 05:55:06.326207 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-wk5xm" event={"ID":"89d21577-9d93-4003-bae1-3b66e679eeeb","Type":"ContainerStarted","Data":"3b70bc57d6dcd14cf27d0330e66d1b7fdbfa60fb9263a97ffd9d17d638be384e"} Oct 12 05:55:06 crc kubenswrapper[4930]: I1012 05:55:06.334826 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-qcc9w" event={"ID":"37d1af03-8709-4b4a-8d4c-bda1dbefff59","Type":"ContainerStarted","Data":"bec5339120e8db191c1b7fe5eed9c0b645670d2946e703673f4fc70e11b386e6"} Oct 12 05:55:06 crc kubenswrapper[4930]: I1012 05:55:06.371336 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-mkzqf" event={"ID":"7498c8de-da98-47fd-8096-58cb4f1c4f87","Type":"ContainerStarted","Data":"76c3d0ba3662fe6c2fcaedb04c8c66dfc9388aff65b0159715f44f3413ffb8ac"} Oct 12 05:55:06 crc kubenswrapper[4930]: I1012 05:55:06.372279 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-mkzqf" event={"ID":"7498c8de-da98-47fd-8096-58cb4f1c4f87","Type":"ContainerStarted","Data":"c3c96c65413e8483624b09d12f066ea550d4d6a6a3f08ea8536f9b30ee0e5272"} Oct 12 05:55:06 crc kubenswrapper[4930]: I1012 05:55:06.372417 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-mkzqf" Oct 12 05:55:06 crc kubenswrapper[4930]: I1012 05:55:06.403597 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-8vdcg" podStartSLOduration=2.939310329 podStartE2EDuration="13.403581651s" podCreationTimestamp="2025-10-12 05:54:53 +0000 UTC" firstStartedPulling="2025-10-12 05:54:54.654875028 +0000 UTC m=+827.196976793" lastFinishedPulling="2025-10-12 05:55:05.11914631 +0000 UTC m=+837.661248115" observedRunningTime="2025-10-12 05:55:06.381936271 +0000 UTC m=+838.924038036" watchObservedRunningTime="2025-10-12 05:55:06.403581651 +0000 UTC m=+838.945683406" Oct 12 05:55:06 crc kubenswrapper[4930]: I1012 05:55:06.418261 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-l2x8t" event={"ID":"97b9da7d-7c47-46b5-ac4a-f190a92dceef","Type":"ContainerStarted","Data":"eb9f5321760dd2623b916cf96f500b77dc0a08c5647e0e01334c3e225ac03532"} Oct 12 05:55:06 crc kubenswrapper[4930]: I1012 
05:55:06.454110 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-mkzqf" podStartSLOduration=3.723317002 podStartE2EDuration="13.454093574s" podCreationTimestamp="2025-10-12 05:54:53 +0000 UTC" firstStartedPulling="2025-10-12 05:54:55.536886872 +0000 UTC m=+828.078988637" lastFinishedPulling="2025-10-12 05:55:05.267663434 +0000 UTC m=+837.809765209" observedRunningTime="2025-10-12 05:55:06.404383631 +0000 UTC m=+838.946485396" watchObservedRunningTime="2025-10-12 05:55:06.454093574 +0000 UTC m=+838.996195339" Oct 12 05:55:06 crc kubenswrapper[4930]: I1012 05:55:06.456405 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-2fq48" podStartSLOduration=2.893980826 podStartE2EDuration="13.456399653s" podCreationTimestamp="2025-10-12 05:54:53 +0000 UTC" firstStartedPulling="2025-10-12 05:54:54.611979367 +0000 UTC m=+827.154081132" lastFinishedPulling="2025-10-12 05:55:05.174398194 +0000 UTC m=+837.716499959" observedRunningTime="2025-10-12 05:55:06.440489789 +0000 UTC m=+838.982591554" watchObservedRunningTime="2025-10-12 05:55:06.456399653 +0000 UTC m=+838.998501418" Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.427316 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-2p7m6" event={"ID":"5a7f54b7-1891-4e3a-a768-e937269bd384","Type":"ContainerStarted","Data":"4457c65ea5741e0d8384f89dbc8b2084e97b7caa9b4fd8828d83629f6e34bc0f"} Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.428369 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-2p7m6" Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.430391 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-2brdt" event={"ID":"7134f9eb-cfa6-41b8-a245-2f1b17669ca4","Type":"ContainerStarted","Data":"7bf2b3efbfedb9836ff88d43ce0798829cd853828f01321daa26c090a56126dc"} Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.430416 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-2brdt" event={"ID":"7134f9eb-cfa6-41b8-a245-2f1b17669ca4","Type":"ContainerStarted","Data":"717651aeeb9ef169fe2e648d1cb27f36c820ef94f27ac6509b0bfd722b323f4b"} Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.430808 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-2brdt" Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.433444 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-l2x8t" event={"ID":"97b9da7d-7c47-46b5-ac4a-f190a92dceef","Type":"ContainerStarted","Data":"ccf39d9f485302538f2b6868213a2e1c51ee1ddb0183a01bd49fb78e459ae318"} Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.433829 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-l2x8t"
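The pods above all follow the same lifecycle in the log: one or two PLEG ContainerStarted events (the Data IDs are container or sandbox IDs), then a "SyncLoop (probe)" readiness entry whose status stays empty until the probe first succeeds; the matching status="ready" entries appear at 05:55:13 further down. A minimal sketch for extracting time-to-ready per pod from such entries, assuming the journal is read one entry per line as journalctl emits it, and that the year (absent from syslog-style timestamps) is 2025:

```python
import re
from datetime import datetime

# Syslog-style prefix: "Oct 12 05:54:56 ..."
TS_RE = re.compile(r'^(?P<mon>\w{3}) (?P<day>\d+) (?P<time>\d{2}:\d{2}:\d{2})')
STARTED_RE = re.compile(
    r'"SyncLoop \(PLEG\): event for pod" pod="(?P<pod>[^"]+)".*ContainerStarted')
READY_RE = re.compile(
    r'"SyncLoop \(probe\)" probe="readiness" status="ready" pod="(?P<pod>[^"]+)"')

def time_to_ready(lines, year=2025):
    """Seconds from first ContainerStarted event to first readiness 'ready', per pod."""
    started, ready = {}, {}
    for line in lines:
        ts = TS_RE.match(line)
        if not ts:
            continue
        when = datetime.strptime(
            f"{year} {ts['mon']} {ts['day']} {ts['time']}", "%Y %b %d %H:%M:%S")
        if (m := STARTED_RE.search(line)):
            started.setdefault(m["pod"], when)   # first event wins
        elif (m := READY_RE.search(line)):
            ready.setdefault(m["pod"], when)
    return {pod: (ready[pod] - started[pod]).total_seconds()
            for pod in started.keys() & ready.keys()}
```

Second-level resolution is all the syslog prefix offers; the klog timestamps inside each entry (I1012 05:55:07.427316 ...) carry microseconds if finer granularity is needed.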
event={"ID":"9e6cd80c-4aa5-40de-81fc-10d0329f5481","Type":"ContainerStarted","Data":"017877211e9198b93060c929fd12013ccb4d56d6144c313f80af44a5b795786d"} Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.436174 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-vf2tl" Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.438255 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-8vdcg" event={"ID":"f883b7e0-65a4-4cea-9d02-7cdedfaf9ec2","Type":"ContainerStarted","Data":"f3de4633a6ed1f7780d6dbfabe9c4b10afdeca00fe4be772f64b7956f15f9564"} Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.440527 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-wk5xm" event={"ID":"89d21577-9d93-4003-bae1-3b66e679eeeb","Type":"ContainerStarted","Data":"199253080e0c453c57ed561a2bff19ca5b30bf702e1894c02f3d6e8ad4fb0ba2"} Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.440922 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-wk5xm" Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.442959 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-ksz2s" event={"ID":"adf9f01b-70b6-46b9-acde-c1eedc16f299","Type":"ContainerStarted","Data":"79e5dbcba199ed8d2ba9246314957d632fc98a3083ef4cd9a0e46ccd05640ce1"} Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.443026 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-ksz2s" event={"ID":"adf9f01b-70b6-46b9-acde-c1eedc16f299","Type":"ContainerStarted","Data":"dd790baee25653f9382f4b3f3d811ce0607a5b39ac95d41640e1cb89e6d4bdc0"} Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.443486 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-2p7m6" podStartSLOduration=3.488864723 podStartE2EDuration="14.443475606s" podCreationTimestamp="2025-10-12 05:54:53 +0000 UTC" firstStartedPulling="2025-10-12 05:54:54.308890105 +0000 UTC m=+826.850991870" lastFinishedPulling="2025-10-12 05:55:05.263500948 +0000 UTC m=+837.805602753" observedRunningTime="2025-10-12 05:55:07.441757513 +0000 UTC m=+839.983859278" watchObservedRunningTime="2025-10-12 05:55:07.443475606 +0000 UTC m=+839.985577361" Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.444061 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-ksz2s" Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.446390 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-zh959" event={"ID":"ee89a0b0-868b-4b2e-a274-c5a4ee40a872","Type":"ContainerStarted","Data":"0a1478e36e5e8a70693312c3ae4345caa1c61973e25193b902d4c3df9dcdd12d"} Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.446604 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-zh959" Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.449264 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-5df598886f-hx224" event={"ID":"510d5f0a-5f67-4171-99e6-1de6734e7bdf","Type":"ContainerStarted","Data":"0b4418062f47a32924b6b56da3f1091e3488a89ee95673cefd567f8f5715406c"} Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.449341 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5df598886f-hx224" event={"ID":"510d5f0a-5f67-4171-99e6-1de6734e7bdf","Type":"ContainerStarted","Data":"298d2c9e2714597be4baa3ec978fef581a533f951e006b1a84b282bf48f4f3cd"} Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.449484 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5df598886f-hx224" Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.454189 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-ckctw" event={"ID":"93426c54-3448-421e-aa85-b03c466c7bf8","Type":"ContainerStarted","Data":"adf3bd7aaba26fbc44ddb0794329b399fe56ff272cebb691c7d2fd4461cfe92e"} Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.455107 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-ckctw" Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.457359 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-vn5dq" event={"ID":"5383069b-8f72-4173-97bf-34ffc36c235e","Type":"ContainerStarted","Data":"3c150946951f594ae7e66a3f082ef2d8737a582a293ae167282c72877303edcf"} Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.457715 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-vn5dq" Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.460046 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-qcc9w" event={"ID":"37d1af03-8709-4b4a-8d4c-bda1dbefff59","Type":"ContainerStarted","Data":"2bf16ed076aac4b6fcafc1366aa8393db6f1368bcdd12764a23fef4e120be384"} Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.460382 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-qcc9w" Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.468906 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-ql78t" event={"ID":"fe819f44-6224-4b45-a33c-6b6ef8e73b92","Type":"ContainerStarted","Data":"b70752e4f8758c6f421394ec528a3bc38d88612b8db40e0885e08213783e911c"} Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.472134 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wk6qh" event={"ID":"fe5f36d2-82b4-4bce-a189-7844dae5dc0e","Type":"ContainerStarted","Data":"d23efeaa5f03fe495c2126d7cc5a9289013b8f1849f609bb512b6b4dff9ca574"} Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.488586 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-l2x8t" podStartSLOduration=4.302937331 podStartE2EDuration="14.488566482s" podCreationTimestamp="2025-10-12 05:54:53 +0000 UTC" firstStartedPulling="2025-10-12 05:54:55.067646837 +0000 UTC 
m=+827.609748602" lastFinishedPulling="2025-10-12 05:55:05.253275948 +0000 UTC m=+837.795377753" observedRunningTime="2025-10-12 05:55:07.486770936 +0000 UTC m=+840.028872691" watchObservedRunningTime="2025-10-12 05:55:07.488566482 +0000 UTC m=+840.030668247" Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.492163 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-2brdt" podStartSLOduration=4.838476561 podStartE2EDuration="14.492152393s" podCreationTimestamp="2025-10-12 05:54:53 +0000 UTC" firstStartedPulling="2025-10-12 05:54:55.610035291 +0000 UTC m=+828.152137056" lastFinishedPulling="2025-10-12 05:55:05.263711113 +0000 UTC m=+837.805812888" observedRunningTime="2025-10-12 05:55:07.46487602 +0000 UTC m=+840.006977785" watchObservedRunningTime="2025-10-12 05:55:07.492152393 +0000 UTC m=+840.034254158" Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.547411 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-wk5xm" podStartSLOduration=4.812154671 podStartE2EDuration="14.547390657s" podCreationTimestamp="2025-10-12 05:54:53 +0000 UTC" firstStartedPulling="2025-10-12 05:54:55.536396909 +0000 UTC m=+828.078498674" lastFinishedPulling="2025-10-12 05:55:05.271632855 +0000 UTC m=+837.813734660" observedRunningTime="2025-10-12 05:55:07.524337981 +0000 UTC m=+840.066439746" watchObservedRunningTime="2025-10-12 05:55:07.547390657 +0000 UTC m=+840.089492422" Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.550666 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-vf2tl" podStartSLOduration=4.6625417890000005 podStartE2EDuration="14.55065667s" podCreationTimestamp="2025-10-12 05:54:53 +0000 UTC" firstStartedPulling="2025-10-12 05:54:54.208697379 +0000 UTC m=+826.750799144" lastFinishedPulling="2025-10-12 05:55:04.09681226 +0000 UTC m=+836.638914025" observedRunningTime="2025-10-12 05:55:07.548632118 +0000 UTC m=+840.090733873" watchObservedRunningTime="2025-10-12 05:55:07.55065667 +0000 UTC m=+840.092758435" Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.619567 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-ksz2s" podStartSLOduration=4.898208148 podStartE2EDuration="14.619549431s" podCreationTimestamp="2025-10-12 05:54:53 +0000 UTC" firstStartedPulling="2025-10-12 05:54:55.550371964 +0000 UTC m=+828.092473729" lastFinishedPulling="2025-10-12 05:55:05.271713247 +0000 UTC m=+837.813815012" observedRunningTime="2025-10-12 05:55:07.582088559 +0000 UTC m=+840.124190324" watchObservedRunningTime="2025-10-12 05:55:07.619549431 +0000 UTC m=+840.161651196" Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.621425 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-vn5dq" podStartSLOduration=4.377720371 podStartE2EDuration="14.621420138s" podCreationTimestamp="2025-10-12 05:54:53 +0000 UTC" firstStartedPulling="2025-10-12 05:54:55.015184854 +0000 UTC m=+827.557286619" lastFinishedPulling="2025-10-12 05:55:05.258884611 +0000 UTC m=+837.800986386" observedRunningTime="2025-10-12 05:55:07.613138208 +0000 UTC m=+840.155239973" watchObservedRunningTime="2025-10-12 05:55:07.621420138 +0000 UTC m=+840.163521903" Oct 
Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.640943 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-zh959" podStartSLOduration=4.375596838 podStartE2EDuration="14.640926654s" podCreationTimestamp="2025-10-12 05:54:53 +0000 UTC" firstStartedPulling="2025-10-12 05:54:55.006087703 +0000 UTC m=+827.548189468" lastFinishedPulling="2025-10-12 05:55:05.271417509 +0000 UTC m=+837.813519284" observedRunningTime="2025-10-12 05:55:07.635850205 +0000 UTC m=+840.177951970" watchObservedRunningTime="2025-10-12 05:55:07.640926654 +0000 UTC m=+840.183028419" Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.661264 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-ckctw" podStartSLOduration=3.852891374 podStartE2EDuration="14.66124677s" podCreationTimestamp="2025-10-12 05:54:53 +0000 UTC" firstStartedPulling="2025-10-12 05:54:54.310228459 +0000 UTC m=+826.852330224" lastFinishedPulling="2025-10-12 05:55:05.118583855 +0000 UTC m=+837.660685620" observedRunningTime="2025-10-12 05:55:07.656974492 +0000 UTC m=+840.199076257" watchObservedRunningTime="2025-10-12 05:55:07.66124677 +0000 UTC m=+840.203348535" Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.680997 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wk6qh" podStartSLOduration=4.517882104 podStartE2EDuration="14.680978072s" podCreationTimestamp="2025-10-12 05:54:53 +0000 UTC" firstStartedPulling="2025-10-12 05:54:55.023481615 +0000 UTC m=+827.565583380" lastFinishedPulling="2025-10-12 05:55:05.186577583 +0000 UTC m=+837.728679348" observedRunningTime="2025-10-12 05:55:07.676074887 +0000 UTC m=+840.218176652" watchObservedRunningTime="2025-10-12 05:55:07.680978072 +0000 UTC m=+840.223079837" Oct 12 05:55:07 crc kubenswrapper[4930]: I1012 05:55:07.699985 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5df598886f-hx224" podStartSLOduration=4.5004712300000005 podStartE2EDuration="14.699969044s" podCreationTimestamp="2025-10-12 05:54:53 +0000 UTC" firstStartedPulling="2025-10-12 05:54:55.069999797 +0000 UTC m=+827.612101562" lastFinishedPulling="2025-10-12 05:55:05.269497571 +0000 UTC m=+837.811599376" observedRunningTime="2025-10-12 05:55:07.696129697 +0000 UTC m=+840.238231462" watchObservedRunningTime="2025-10-12 05:55:07.699969044 +0000 UTC m=+840.242070809" Oct 12 05:55:08 crc kubenswrapper[4930]: I1012 05:55:08.147416 4930 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 05:55:08 crc kubenswrapper[4930]: I1012 05:55:08.487143 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-ql78t" event={"ID":"fe819f44-6224-4b45-a33c-6b6ef8e73b92","Type":"ContainerStarted","Data":"8bd42dab89735fb25841722a1c8589436b823009ed4b3b1cd7ae73dfcc289dc7"} Oct 12 05:55:08 crc kubenswrapper[4930]: I1012 05:55:08.488062 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wk6qh" Oct 12 05:55:08 crc kubenswrapper[4930]: I1012 05:55:08.518091 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-ql78t" podStartSLOduration=5.334547587 podStartE2EDuration="15.518075924s" podCreationTimestamp="2025-10-12 05:54:53 +0000 UTC" firstStartedPulling="2025-10-12 05:54:55.084507086 +0000 UTC m=+827.626608851" lastFinishedPulling="2025-10-12 05:55:05.268035383 +0000 UTC m=+837.810137188" observedRunningTime="2025-10-12 05:55:08.516156776 +0000 UTC m=+841.058258571" watchObservedRunningTime="2025-10-12 05:55:08.518075924 +0000 UTC m=+841.060177699" Oct 12 05:55:08 crc kubenswrapper[4930]: I1012 05:55:08.518234 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-qcc9w" podStartSLOduration=5.27960382 podStartE2EDuration="15.518229728s" podCreationTimestamp="2025-10-12 05:54:53 +0000 UTC" firstStartedPulling="2025-10-12 05:54:55.031007566 +0000 UTC m=+827.573109321" lastFinishedPulling="2025-10-12 05:55:05.269633464 +0000 UTC m=+837.811735229" observedRunningTime="2025-10-12 05:55:07.723551954 +0000 UTC m=+840.265653719" watchObservedRunningTime="2025-10-12 05:55:08.518229728 +0000 UTC m=+841.060331503" Oct 12 05:55:09 crc kubenswrapper[4930]: I1012 05:55:09.498298 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-ql78t" Oct 12 05:55:10 crc kubenswrapper[4930]: I1012 05:55:10.520561 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5458f77c4-52242" event={"ID":"18286f44-9f6c-4699-9d4d-afa069c980ed","Type":"ContainerStarted","Data":"db447ec03a5e9abfde324b71d640ca33ec83e3c7d72d9ec8f7b1e13ae9c0549a"} Oct 12 05:55:10 crc kubenswrapper[4930]: I1012 05:55:10.522118 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5458f77c4-52242" Oct 12 05:55:13 crc kubenswrapper[4930]: I1012 05:55:13.514506 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-vf2tl" Oct 12 05:55:13 crc kubenswrapper[4930]: I1012 05:55:13.520375 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-2p7m6" Oct 12 05:55:13 crc kubenswrapper[4930]: I1012 05:55:13.542363 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5458f77c4-52242" podStartSLOduration=6.27118256 podStartE2EDuration="20.542334144s" podCreationTimestamp="2025-10-12 05:54:53 +0000 UTC" firstStartedPulling="2025-10-12 05:54:55.569403368 +0000 UTC m=+828.111505133" lastFinishedPulling="2025-10-12 05:55:09.840554952 +0000 UTC m=+842.382656717" observedRunningTime="2025-10-12 05:55:10.547265181 +0000 UTC m=+843.089366946" watchObservedRunningTime="2025-10-12 05:55:13.542334144 +0000 UTC m=+846.084435949" Oct 12 05:55:13 crc kubenswrapper[4930]: I1012 05:55:13.549220 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-8vdcg" Oct 12 05:55:13 crc kubenswrapper[4930]: I1012 05:55:13.613496 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-2fq48" Oct 12 05:55:13 crc kubenswrapper[4930]: I1012 05:55:13.613696 4930 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-ckctw" Oct 12 05:55:13 crc kubenswrapper[4930]: I1012 05:55:13.727581 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-zh959" Oct 12 05:55:13 crc kubenswrapper[4930]: I1012 05:55:13.821202 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-wk6qh" Oct 12 05:55:13 crc kubenswrapper[4930]: I1012 05:55:13.882893 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-ql78t" Oct 12 05:55:13 crc kubenswrapper[4930]: I1012 05:55:13.903698 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-qcc9w" Oct 12 05:55:13 crc kubenswrapper[4930]: I1012 05:55:13.927724 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5df598886f-hx224" Oct 12 05:55:13 crc kubenswrapper[4930]: I1012 05:55:13.931236 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-2brdt" Oct 12 05:55:13 crc kubenswrapper[4930]: I1012 05:55:13.963903 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-vn5dq" Oct 12 05:55:13 crc kubenswrapper[4930]: I1012 05:55:13.980182 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-l2x8t" Oct 12 05:55:14 crc kubenswrapper[4930]: I1012 05:55:14.012230 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-wk5xm" Oct 12 05:55:14 crc kubenswrapper[4930]: I1012 05:55:14.047556 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-ksz2s" Oct 12 05:55:14 crc kubenswrapper[4930]: I1012 05:55:14.250814 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-mkzqf" Oct 12 05:55:18 crc kubenswrapper[4930]: I1012 05:55:18.600496 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bwrr99" event={"ID":"20b56dea-8d10-4b11-b437-fc38320417c9","Type":"ContainerStarted","Data":"258c495a33196b1b479d8480b4ee0cea84cae7de0a2bfd50e9a64d67fdf29048"} Oct 12 05:55:18 crc kubenswrapper[4930]: I1012 05:55:18.601219 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bwrr99" Oct 12 05:55:18 crc kubenswrapper[4930]: I1012 05:55:18.602182 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-wgp8z" event={"ID":"df7a25ba-c240-4d05-a117-0040e24bb33c","Type":"ContainerStarted","Data":"e4d6dd0a8a2c3db6ea4e8d402eb21fac3be094092a1dd76947094514896b9501"} Oct 12 05:55:18 crc kubenswrapper[4930]: I1012 05:55:18.602462 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/infra-operator-controller-manager-656bcbd775-wgp8z" Oct 12 05:55:18 crc kubenswrapper[4930]: I1012 05:55:18.642095 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bwrr99" podStartSLOduration=3.537373615 podStartE2EDuration="25.642080696s" podCreationTimestamp="2025-10-12 05:54:53 +0000 UTC" firstStartedPulling="2025-10-12 05:54:55.623266517 +0000 UTC m=+828.165368282" lastFinishedPulling="2025-10-12 05:55:17.727973608 +0000 UTC m=+850.270075363" observedRunningTime="2025-10-12 05:55:18.635782068 +0000 UTC m=+851.177883833" watchObservedRunningTime="2025-10-12 05:55:18.642080696 +0000 UTC m=+851.184182461" Oct 12 05:55:18 crc kubenswrapper[4930]: I1012 05:55:18.652112 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-wgp8z" podStartSLOduration=3.534081768 podStartE2EDuration="25.652100758s" podCreationTimestamp="2025-10-12 05:54:53 +0000 UTC" firstStartedPulling="2025-10-12 05:54:55.610061381 +0000 UTC m=+828.152163146" lastFinishedPulling="2025-10-12 05:55:17.728080331 +0000 UTC m=+850.270182136" observedRunningTime="2025-10-12 05:55:18.650277542 +0000 UTC m=+851.192379307" watchObservedRunningTime="2025-10-12 05:55:18.652100758 +0000 UTC m=+851.194202513" Oct 12 05:55:19 crc kubenswrapper[4930]: I1012 05:55:19.628427 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-2hx92" event={"ID":"8ac93070-497c-48ca-a58a-fb47657e6c2a","Type":"ContainerStarted","Data":"4f09fa3347b368497f449b3d488fad07364bfa2d21788510169fb6607f5f9287"} Oct 12 05:55:19 crc kubenswrapper[4930]: I1012 05:55:19.630074 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-2hx92" Oct 12 05:55:19 crc kubenswrapper[4930]: I1012 05:55:19.634185 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-862f6" event={"ID":"27d33d74-f1d1-4208-aead-8f6091c524df","Type":"ContainerStarted","Data":"c83ccce18327199587322d02dfc8d45c3cfacdfe84cbb949d5cf5bfe6354c874"} Oct 12 05:55:19 crc kubenswrapper[4930]: I1012 05:55:19.635331 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-862f6" Oct 12 05:55:19 crc kubenswrapper[4930]: I1012 05:55:19.660922 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-2hx92" podStartSLOduration=2.969973601 podStartE2EDuration="26.660900481s" podCreationTimestamp="2025-10-12 05:54:53 +0000 UTC" firstStartedPulling="2025-10-12 05:54:55.566525655 +0000 UTC m=+828.108627420" lastFinishedPulling="2025-10-12 05:55:19.257452505 +0000 UTC m=+851.799554300" observedRunningTime="2025-10-12 05:55:19.659300371 +0000 UTC m=+852.201402166" watchObservedRunningTime="2025-10-12 05:55:19.660900481 +0000 UTC m=+852.203002286" Oct 12 05:55:19 crc kubenswrapper[4930]: I1012 05:55:19.699121 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-862f6" podStartSLOduration=2.9939093 podStartE2EDuration="26.699088588s" podCreationTimestamp="2025-10-12 05:54:53 +0000 UTC" firstStartedPulling="2025-10-12 05:54:55.567327955 +0000 UTC 
m=+828.109429720" lastFinishedPulling="2025-10-12 05:55:19.272507233 +0000 UTC m=+851.814609008" observedRunningTime="2025-10-12 05:55:19.69037308 +0000 UTC m=+852.232474885" watchObservedRunningTime="2025-10-12 05:55:19.699088588 +0000 UTC m=+852.241190383" Oct 12 05:55:24 crc kubenswrapper[4930]: I1012 05:55:24.039273 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-wgp8z" Oct 12 05:55:24 crc kubenswrapper[4930]: I1012 05:55:24.153045 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5458f77c4-52242" Oct 12 05:55:24 crc kubenswrapper[4930]: I1012 05:55:24.566508 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5956dffb7bwrr99" Oct 12 05:55:33 crc kubenswrapper[4930]: I1012 05:55:33.961488 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-2hx92" Oct 12 05:55:33 crc kubenswrapper[4930]: I1012 05:55:33.990462 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-862f6" Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.083515 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8468885bfc-zhxs4"] Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.085215 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8468885bfc-zhxs4" Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.095476 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.095624 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.095729 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-cxzn4" Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.101290 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.125399 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8468885bfc-zhxs4"] Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.227190 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd9d9\" (UniqueName: \"kubernetes.io/projected/f9e22aa8-e04a-4743-a638-6835e95f5945-kube-api-access-xd9d9\") pod \"dnsmasq-dns-8468885bfc-zhxs4\" (UID: \"f9e22aa8-e04a-4743-a638-6835e95f5945\") " pod="openstack/dnsmasq-dns-8468885bfc-zhxs4" Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.227313 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9e22aa8-e04a-4743-a638-6835e95f5945-config\") pod \"dnsmasq-dns-8468885bfc-zhxs4\" (UID: \"f9e22aa8-e04a-4743-a638-6835e95f5945\") " pod="openstack/dnsmasq-dns-8468885bfc-zhxs4" Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.235221 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-545d49fd5c-78lqk"] Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.236426 4930 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-545d49fd5c-78lqk" Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.239260 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.277846 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-545d49fd5c-78lqk"] Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.328711 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9e22aa8-e04a-4743-a638-6835e95f5945-config\") pod \"dnsmasq-dns-8468885bfc-zhxs4\" (UID: \"f9e22aa8-e04a-4743-a638-6835e95f5945\") " pod="openstack/dnsmasq-dns-8468885bfc-zhxs4" Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.328809 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd9d9\" (UniqueName: \"kubernetes.io/projected/f9e22aa8-e04a-4743-a638-6835e95f5945-kube-api-access-xd9d9\") pod \"dnsmasq-dns-8468885bfc-zhxs4\" (UID: \"f9e22aa8-e04a-4743-a638-6835e95f5945\") " pod="openstack/dnsmasq-dns-8468885bfc-zhxs4" Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.328891 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8qtc\" (UniqueName: \"kubernetes.io/projected/4b4dd6b0-8862-4c13-83df-abb6c3752260-kube-api-access-l8qtc\") pod \"dnsmasq-dns-545d49fd5c-78lqk\" (UID: \"4b4dd6b0-8862-4c13-83df-abb6c3752260\") " pod="openstack/dnsmasq-dns-545d49fd5c-78lqk" Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.328942 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b4dd6b0-8862-4c13-83df-abb6c3752260-dns-svc\") pod \"dnsmasq-dns-545d49fd5c-78lqk\" (UID: \"4b4dd6b0-8862-4c13-83df-abb6c3752260\") " pod="openstack/dnsmasq-dns-545d49fd5c-78lqk" Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.328960 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b4dd6b0-8862-4c13-83df-abb6c3752260-config\") pod \"dnsmasq-dns-545d49fd5c-78lqk\" (UID: \"4b4dd6b0-8862-4c13-83df-abb6c3752260\") " pod="openstack/dnsmasq-dns-545d49fd5c-78lqk" Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.329750 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9e22aa8-e04a-4743-a638-6835e95f5945-config\") pod \"dnsmasq-dns-8468885bfc-zhxs4\" (UID: \"f9e22aa8-e04a-4743-a638-6835e95f5945\") " pod="openstack/dnsmasq-dns-8468885bfc-zhxs4" Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.360635 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd9d9\" (UniqueName: \"kubernetes.io/projected/f9e22aa8-e04a-4743-a638-6835e95f5945-kube-api-access-xd9d9\") pod \"dnsmasq-dns-8468885bfc-zhxs4\" (UID: \"f9e22aa8-e04a-4743-a638-6835e95f5945\") " pod="openstack/dnsmasq-dns-8468885bfc-zhxs4" Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.429971 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8468885bfc-zhxs4" Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.430321 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b4dd6b0-8862-4c13-83df-abb6c3752260-dns-svc\") pod \"dnsmasq-dns-545d49fd5c-78lqk\" (UID: \"4b4dd6b0-8862-4c13-83df-abb6c3752260\") " pod="openstack/dnsmasq-dns-545d49fd5c-78lqk" Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.430362 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b4dd6b0-8862-4c13-83df-abb6c3752260-config\") pod \"dnsmasq-dns-545d49fd5c-78lqk\" (UID: \"4b4dd6b0-8862-4c13-83df-abb6c3752260\") " pod="openstack/dnsmasq-dns-545d49fd5c-78lqk" Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.430557 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8qtc\" (UniqueName: \"kubernetes.io/projected/4b4dd6b0-8862-4c13-83df-abb6c3752260-kube-api-access-l8qtc\") pod \"dnsmasq-dns-545d49fd5c-78lqk\" (UID: \"4b4dd6b0-8862-4c13-83df-abb6c3752260\") " pod="openstack/dnsmasq-dns-545d49fd5c-78lqk" Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.431254 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b4dd6b0-8862-4c13-83df-abb6c3752260-config\") pod \"dnsmasq-dns-545d49fd5c-78lqk\" (UID: \"4b4dd6b0-8862-4c13-83df-abb6c3752260\") " pod="openstack/dnsmasq-dns-545d49fd5c-78lqk" Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.431452 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b4dd6b0-8862-4c13-83df-abb6c3752260-dns-svc\") pod \"dnsmasq-dns-545d49fd5c-78lqk\" (UID: \"4b4dd6b0-8862-4c13-83df-abb6c3752260\") " pod="openstack/dnsmasq-dns-545d49fd5c-78lqk" Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.451782 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8qtc\" (UniqueName: \"kubernetes.io/projected/4b4dd6b0-8862-4c13-83df-abb6c3752260-kube-api-access-l8qtc\") pod \"dnsmasq-dns-545d49fd5c-78lqk\" (UID: \"4b4dd6b0-8862-4c13-83df-abb6c3752260\") " pod="openstack/dnsmasq-dns-545d49fd5c-78lqk" Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.551331 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-545d49fd5c-78lqk" Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.803078 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-545d49fd5c-78lqk"] Oct 12 05:56:05 crc kubenswrapper[4930]: I1012 05:56:05.870392 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8468885bfc-zhxs4"] Oct 12 05:56:05 crc kubenswrapper[4930]: W1012 05:56:05.876614 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9e22aa8_e04a_4743_a638_6835e95f5945.slice/crio-8dc6f450223300e68b84e073f702d9f1e6b8dead0ac5d0ce1d810c2a15bdb277 WatchSource:0}: Error finding container 8dc6f450223300e68b84e073f702d9f1e6b8dead0ac5d0ce1d810c2a15bdb277: Status 404 returned error can't find the container with id 8dc6f450223300e68b84e073f702d9f1e6b8dead0ac5d0ce1d810c2a15bdb277 Oct 12 05:56:06 crc kubenswrapper[4930]: I1012 05:56:06.107452 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-545d49fd5c-78lqk" event={"ID":"4b4dd6b0-8862-4c13-83df-abb6c3752260","Type":"ContainerStarted","Data":"7d4d0185a16ee99c19787cfa2a7ecd71ed9c2b3e01b98e42fc1511db3d2cad24"} Oct 12 05:56:06 crc kubenswrapper[4930]: I1012 05:56:06.109673 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8468885bfc-zhxs4" event={"ID":"f9e22aa8-e04a-4743-a638-6835e95f5945","Type":"ContainerStarted","Data":"8dc6f450223300e68b84e073f702d9f1e6b8dead0ac5d0ce1d810c2a15bdb277"} Oct 12 05:56:07 crc kubenswrapper[4930]: I1012 05:56:07.703361 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-545d49fd5c-78lqk"] Oct 12 05:56:07 crc kubenswrapper[4930]: I1012 05:56:07.740312 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589dfdf7-fq2v4"] Oct 12 05:56:07 crc kubenswrapper[4930]: I1012 05:56:07.741957 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589dfdf7-fq2v4" Oct 12 05:56:07 crc kubenswrapper[4930]: I1012 05:56:07.757641 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589dfdf7-fq2v4"] Oct 12 05:56:07 crc kubenswrapper[4930]: I1012 05:56:07.877717 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrrs7\" (UniqueName: \"kubernetes.io/projected/cbf6abbf-41bf-4616-84ec-9bc01291d120-kube-api-access-mrrs7\") pod \"dnsmasq-dns-589dfdf7-fq2v4\" (UID: \"cbf6abbf-41bf-4616-84ec-9bc01291d120\") " pod="openstack/dnsmasq-dns-589dfdf7-fq2v4" Oct 12 05:56:07 crc kubenswrapper[4930]: I1012 05:56:07.877927 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbf6abbf-41bf-4616-84ec-9bc01291d120-dns-svc\") pod \"dnsmasq-dns-589dfdf7-fq2v4\" (UID: \"cbf6abbf-41bf-4616-84ec-9bc01291d120\") " pod="openstack/dnsmasq-dns-589dfdf7-fq2v4" Oct 12 05:56:07 crc kubenswrapper[4930]: I1012 05:56:07.877957 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbf6abbf-41bf-4616-84ec-9bc01291d120-config\") pod \"dnsmasq-dns-589dfdf7-fq2v4\" (UID: \"cbf6abbf-41bf-4616-84ec-9bc01291d120\") " pod="openstack/dnsmasq-dns-589dfdf7-fq2v4" Oct 12 05:56:07 crc kubenswrapper[4930]: I1012 05:56:07.982141 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrrs7\" (UniqueName: \"kubernetes.io/projected/cbf6abbf-41bf-4616-84ec-9bc01291d120-kube-api-access-mrrs7\") pod \"dnsmasq-dns-589dfdf7-fq2v4\" (UID: \"cbf6abbf-41bf-4616-84ec-9bc01291d120\") " pod="openstack/dnsmasq-dns-589dfdf7-fq2v4" Oct 12 05:56:07 crc kubenswrapper[4930]: I1012 05:56:07.982293 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbf6abbf-41bf-4616-84ec-9bc01291d120-dns-svc\") pod \"dnsmasq-dns-589dfdf7-fq2v4\" (UID: \"cbf6abbf-41bf-4616-84ec-9bc01291d120\") " pod="openstack/dnsmasq-dns-589dfdf7-fq2v4" Oct 12 05:56:07 crc kubenswrapper[4930]: I1012 05:56:07.982321 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbf6abbf-41bf-4616-84ec-9bc01291d120-config\") pod \"dnsmasq-dns-589dfdf7-fq2v4\" (UID: \"cbf6abbf-41bf-4616-84ec-9bc01291d120\") " pod="openstack/dnsmasq-dns-589dfdf7-fq2v4" Oct 12 05:56:07 crc kubenswrapper[4930]: I1012 05:56:07.983422 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbf6abbf-41bf-4616-84ec-9bc01291d120-dns-svc\") pod \"dnsmasq-dns-589dfdf7-fq2v4\" (UID: \"cbf6abbf-41bf-4616-84ec-9bc01291d120\") " pod="openstack/dnsmasq-dns-589dfdf7-fq2v4" Oct 12 05:56:07 crc kubenswrapper[4930]: I1012 05:56:07.983474 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbf6abbf-41bf-4616-84ec-9bc01291d120-config\") pod \"dnsmasq-dns-589dfdf7-fq2v4\" (UID: \"cbf6abbf-41bf-4616-84ec-9bc01291d120\") " pod="openstack/dnsmasq-dns-589dfdf7-fq2v4" Oct 12 05:56:07 crc kubenswrapper[4930]: I1012 05:56:07.999591 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8468885bfc-zhxs4"] Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.011457 4930 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mrrs7\" (UniqueName: \"kubernetes.io/projected/cbf6abbf-41bf-4616-84ec-9bc01291d120-kube-api-access-mrrs7\") pod \"dnsmasq-dns-589dfdf7-fq2v4\" (UID: \"cbf6abbf-41bf-4616-84ec-9bc01291d120\") " pod="openstack/dnsmasq-dns-589dfdf7-fq2v4" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.038724 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6665b8cd9-rr4j9"] Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.041861 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6665b8cd9-rr4j9" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.063059 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6665b8cd9-rr4j9"] Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.071468 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589dfdf7-fq2v4" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.194352 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wwqb\" (UniqueName: \"kubernetes.io/projected/e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b-kube-api-access-6wwqb\") pod \"dnsmasq-dns-6665b8cd9-rr4j9\" (UID: \"e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b\") " pod="openstack/dnsmasq-dns-6665b8cd9-rr4j9" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.194916 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b-dns-svc\") pod \"dnsmasq-dns-6665b8cd9-rr4j9\" (UID: \"e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b\") " pod="openstack/dnsmasq-dns-6665b8cd9-rr4j9" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.195005 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b-config\") pod \"dnsmasq-dns-6665b8cd9-rr4j9\" (UID: \"e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b\") " pod="openstack/dnsmasq-dns-6665b8cd9-rr4j9" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.298380 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b-config\") pod \"dnsmasq-dns-6665b8cd9-rr4j9\" (UID: \"e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b\") " pod="openstack/dnsmasq-dns-6665b8cd9-rr4j9" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.298469 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wwqb\" (UniqueName: \"kubernetes.io/projected/e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b-kube-api-access-6wwqb\") pod \"dnsmasq-dns-6665b8cd9-rr4j9\" (UID: \"e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b\") " pod="openstack/dnsmasq-dns-6665b8cd9-rr4j9" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.298497 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b-dns-svc\") pod \"dnsmasq-dns-6665b8cd9-rr4j9\" (UID: \"e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b\") " pod="openstack/dnsmasq-dns-6665b8cd9-rr4j9" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.299339 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b-dns-svc\") pod \"dnsmasq-dns-6665b8cd9-rr4j9\" (UID: \"e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b\") " pod="openstack/dnsmasq-dns-6665b8cd9-rr4j9" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.300353 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b-config\") pod \"dnsmasq-dns-6665b8cd9-rr4j9\" (UID: \"e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b\") " pod="openstack/dnsmasq-dns-6665b8cd9-rr4j9" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.337251 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wwqb\" (UniqueName: \"kubernetes.io/projected/e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b-kube-api-access-6wwqb\") pod \"dnsmasq-dns-6665b8cd9-rr4j9\" (UID: \"e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b\") " pod="openstack/dnsmasq-dns-6665b8cd9-rr4j9" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.363906 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6665b8cd9-rr4j9" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.434862 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589dfdf7-fq2v4"] Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.465343 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5449989c59-cnw2z"] Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.472192 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5449989c59-cnw2z" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.497465 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5449989c59-cnw2z"] Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.507114 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4pkc\" (UniqueName: \"kubernetes.io/projected/c3424140-ca19-4ff4-b19a-c236e8868d38-kube-api-access-z4pkc\") pod \"dnsmasq-dns-5449989c59-cnw2z\" (UID: \"c3424140-ca19-4ff4-b19a-c236e8868d38\") " pod="openstack/dnsmasq-dns-5449989c59-cnw2z" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.507224 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3424140-ca19-4ff4-b19a-c236e8868d38-dns-svc\") pod \"dnsmasq-dns-5449989c59-cnw2z\" (UID: \"c3424140-ca19-4ff4-b19a-c236e8868d38\") " pod="openstack/dnsmasq-dns-5449989c59-cnw2z" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.507271 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3424140-ca19-4ff4-b19a-c236e8868d38-config\") pod \"dnsmasq-dns-5449989c59-cnw2z\" (UID: \"c3424140-ca19-4ff4-b19a-c236e8868d38\") " pod="openstack/dnsmasq-dns-5449989c59-cnw2z" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.507941 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589dfdf7-fq2v4"] Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.608871 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3424140-ca19-4ff4-b19a-c236e8868d38-dns-svc\") pod \"dnsmasq-dns-5449989c59-cnw2z\" (UID: \"c3424140-ca19-4ff4-b19a-c236e8868d38\") " 
pod="openstack/dnsmasq-dns-5449989c59-cnw2z" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.608947 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3424140-ca19-4ff4-b19a-c236e8868d38-config\") pod \"dnsmasq-dns-5449989c59-cnw2z\" (UID: \"c3424140-ca19-4ff4-b19a-c236e8868d38\") " pod="openstack/dnsmasq-dns-5449989c59-cnw2z" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.608978 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4pkc\" (UniqueName: \"kubernetes.io/projected/c3424140-ca19-4ff4-b19a-c236e8868d38-kube-api-access-z4pkc\") pod \"dnsmasq-dns-5449989c59-cnw2z\" (UID: \"c3424140-ca19-4ff4-b19a-c236e8868d38\") " pod="openstack/dnsmasq-dns-5449989c59-cnw2z" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.611107 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3424140-ca19-4ff4-b19a-c236e8868d38-dns-svc\") pod \"dnsmasq-dns-5449989c59-cnw2z\" (UID: \"c3424140-ca19-4ff4-b19a-c236e8868d38\") " pod="openstack/dnsmasq-dns-5449989c59-cnw2z" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.611299 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3424140-ca19-4ff4-b19a-c236e8868d38-config\") pod \"dnsmasq-dns-5449989c59-cnw2z\" (UID: \"c3424140-ca19-4ff4-b19a-c236e8868d38\") " pod="openstack/dnsmasq-dns-5449989c59-cnw2z" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.686761 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4pkc\" (UniqueName: \"kubernetes.io/projected/c3424140-ca19-4ff4-b19a-c236e8868d38-kube-api-access-z4pkc\") pod \"dnsmasq-dns-5449989c59-cnw2z\" (UID: \"c3424140-ca19-4ff4-b19a-c236e8868d38\") " pod="openstack/dnsmasq-dns-5449989c59-cnw2z" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.818110 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5449989c59-cnw2z" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.895810 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.901963 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.904992 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-notifications-svc" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.905179 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-config-data" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.905356 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-default-user" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.905378 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-erlang-cookie" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.905361 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-plugins-conf" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.906787 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-server-dockercfg-q48nm" Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.913558 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Oct 12 05:56:08 crc kubenswrapper[4930]: I1012 05:56:08.921180 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-server-conf" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.022193 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.022270 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1594846a-5c2f-49f8-9bea-22661720c5a6-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.022304 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1594846a-5c2f-49f8-9bea-22661720c5a6-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.022332 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1594846a-5c2f-49f8-9bea-22661720c5a6-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.023499 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1594846a-5c2f-49f8-9bea-22661720c5a6-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 
crc kubenswrapper[4930]: I1012 05:56:09.023572 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1594846a-5c2f-49f8-9bea-22661720c5a6-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.023635 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1594846a-5c2f-49f8-9bea-22661720c5a6-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.023677 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1594846a-5c2f-49f8-9bea-22661720c5a6-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.023724 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1594846a-5c2f-49f8-9bea-22661720c5a6-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.023850 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1594846a-5c2f-49f8-9bea-22661720c5a6-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.023885 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7v8w\" (UniqueName: \"kubernetes.io/projected/1594846a-5c2f-49f8-9bea-22661720c5a6-kube-api-access-m7v8w\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.051350 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6665b8cd9-rr4j9"] Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.124971 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1594846a-5c2f-49f8-9bea-22661720c5a6-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.125861 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1594846a-5c2f-49f8-9bea-22661720c5a6-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.125852 4930 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m7v8w\" (UniqueName: \"kubernetes.io/projected/1594846a-5c2f-49f8-9bea-22661720c5a6-kube-api-access-m7v8w\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.125942 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.125977 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1594846a-5c2f-49f8-9bea-22661720c5a6-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.126004 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1594846a-5c2f-49f8-9bea-22661720c5a6-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.126027 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1594846a-5c2f-49f8-9bea-22661720c5a6-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.126051 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1594846a-5c2f-49f8-9bea-22661720c5a6-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.126072 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1594846a-5c2f-49f8-9bea-22661720c5a6-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.126096 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1594846a-5c2f-49f8-9bea-22661720c5a6-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.126117 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1594846a-5c2f-49f8-9bea-22661720c5a6-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.126138 4930 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1594846a-5c2f-49f8-9bea-22661720c5a6-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.126835 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1594846a-5c2f-49f8-9bea-22661720c5a6-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.127040 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1594846a-5c2f-49f8-9bea-22661720c5a6-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.127053 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1594846a-5c2f-49f8-9bea-22661720c5a6-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.127493 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1594846a-5c2f-49f8-9bea-22661720c5a6-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.127542 4930 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.130477 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1594846a-5c2f-49f8-9bea-22661720c5a6-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.130615 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1594846a-5c2f-49f8-9bea-22661720c5a6-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.131315 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1594846a-5c2f-49f8-9bea-22661720c5a6-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.138478 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6665b8cd9-rr4j9" event={"ID":"e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b","Type":"ContainerStarted","Data":"0e2a981eb4f13758764cf8208237a9ab6ec705314bc5fe9aa0254a66bcb636b4"} Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.141419 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589dfdf7-fq2v4" event={"ID":"cbf6abbf-41bf-4616-84ec-9bc01291d120","Type":"ContainerStarted","Data":"a9708018d46ef09d1279bb4e404f629c61461aca504053e877fec7dc5531f946"} Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.145479 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1594846a-5c2f-49f8-9bea-22661720c5a6-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.154147 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7v8w\" (UniqueName: \"kubernetes.io/projected/1594846a-5c2f-49f8-9bea-22661720c5a6-kube-api-access-m7v8w\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.162432 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.168295 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.170423 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.172138 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.172596 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.172758 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.173094 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.173334 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.173473 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.173593 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-54fs6" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.197107 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"1594846a-5c2f-49f8-9bea-22661720c5a6\") " pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.232409 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.334853 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bad3587e-d515-4add-9edd-da341fe519b7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.334903 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bad3587e-d515-4add-9edd-da341fe519b7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.334939 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bad3587e-d515-4add-9edd-da341fe519b7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.334987 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bad3587e-d515-4add-9edd-da341fe519b7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.335052 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bad3587e-d515-4add-9edd-da341fe519b7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.335067 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhrgr\" (UniqueName: \"kubernetes.io/projected/bad3587e-d515-4add-9edd-da341fe519b7-kube-api-access-jhrgr\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.335122 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bad3587e-d515-4add-9edd-da341fe519b7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.335144 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bad3587e-d515-4add-9edd-da341fe519b7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.335168 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc 
kubenswrapper[4930]: I1012 05:56:09.335220 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bad3587e-d515-4add-9edd-da341fe519b7-config-data\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.335244 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bad3587e-d515-4add-9edd-da341fe519b7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.369815 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5449989c59-cnw2z"] Oct 12 05:56:09 crc kubenswrapper[4930]: W1012 05:56:09.397014 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3424140_ca19_4ff4_b19a_c236e8868d38.slice/crio-78b3feacd68cbc1fabafebaaee6f788b3292e824999b36e5f86bdadce04fd873 WatchSource:0}: Error finding container 78b3feacd68cbc1fabafebaaee6f788b3292e824999b36e5f86bdadce04fd873: Status 404 returned error can't find the container with id 78b3feacd68cbc1fabafebaaee6f788b3292e824999b36e5f86bdadce04fd873 Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.436730 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bad3587e-d515-4add-9edd-da341fe519b7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.436777 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhrgr\" (UniqueName: \"kubernetes.io/projected/bad3587e-d515-4add-9edd-da341fe519b7-kube-api-access-jhrgr\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.436817 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bad3587e-d515-4add-9edd-da341fe519b7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.436845 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bad3587e-d515-4add-9edd-da341fe519b7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.436871 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.436908 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bad3587e-d515-4add-9edd-da341fe519b7-config-data\") pod \"rabbitmq-server-0\" 
(UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.436937 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bad3587e-d515-4add-9edd-da341fe519b7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.436974 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bad3587e-d515-4add-9edd-da341fe519b7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.436989 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bad3587e-d515-4add-9edd-da341fe519b7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.437016 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bad3587e-d515-4add-9edd-da341fe519b7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.437059 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bad3587e-d515-4add-9edd-da341fe519b7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.437493 4930 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.437731 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bad3587e-d515-4add-9edd-da341fe519b7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.438139 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bad3587e-d515-4add-9edd-da341fe519b7-config-data\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.438617 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bad3587e-d515-4add-9edd-da341fe519b7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.439266 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/bad3587e-d515-4add-9edd-da341fe519b7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.439284 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bad3587e-d515-4add-9edd-da341fe519b7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.451150 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bad3587e-d515-4add-9edd-da341fe519b7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.451452 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bad3587e-d515-4add-9edd-da341fe519b7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.451608 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bad3587e-d515-4add-9edd-da341fe519b7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.452102 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bad3587e-d515-4add-9edd-da341fe519b7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.456673 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhrgr\" (UniqueName: \"kubernetes.io/projected/bad3587e-d515-4add-9edd-da341fe519b7-kube-api-access-jhrgr\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.477987 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.543496 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.606265 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.608225 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.610876 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.611082 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.611392 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.611505 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.613977 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.614327 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-ltsgl" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.615111 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.622810 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.678503 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.742027 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fe886795-2501-4474-bfbc-9febcc5113f3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.742084 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fe886795-2501-4474-bfbc-9febcc5113f3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.742112 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.742406 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fe886795-2501-4474-bfbc-9febcc5113f3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.742785 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fe886795-2501-4474-bfbc-9febcc5113f3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.742998 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzflr\" (UniqueName: \"kubernetes.io/projected/fe886795-2501-4474-bfbc-9febcc5113f3-kube-api-access-zzflr\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.743037 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe886795-2501-4474-bfbc-9febcc5113f3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.743085 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fe886795-2501-4474-bfbc-9febcc5113f3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.746150 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fe886795-2501-4474-bfbc-9febcc5113f3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.746267 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fe886795-2501-4474-bfbc-9febcc5113f3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.746340 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fe886795-2501-4474-bfbc-9febcc5113f3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.847942 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fe886795-2501-4474-bfbc-9febcc5113f3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.848021 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fe886795-2501-4474-bfbc-9febcc5113f3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.848116 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fe886795-2501-4474-bfbc-9febcc5113f3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.848151 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fe886795-2501-4474-bfbc-9febcc5113f3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.848201 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fe886795-2501-4474-bfbc-9febcc5113f3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.848280 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.848305 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fe886795-2501-4474-bfbc-9febcc5113f3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.848358 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fe886795-2501-4474-bfbc-9febcc5113f3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.848423 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzflr\" (UniqueName: \"kubernetes.io/projected/fe886795-2501-4474-bfbc-9febcc5113f3-kube-api-access-zzflr\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.848446 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe886795-2501-4474-bfbc-9febcc5113f3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.848563 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fe886795-2501-4474-bfbc-9febcc5113f3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.848616 4930 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.848703 4930 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fe886795-2501-4474-bfbc-9febcc5113f3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.850596 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fe886795-2501-4474-bfbc-9febcc5113f3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.851182 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fe886795-2501-4474-bfbc-9febcc5113f3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.851608 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe886795-2501-4474-bfbc-9febcc5113f3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.851201 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fe886795-2501-4474-bfbc-9febcc5113f3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.851963 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fe886795-2501-4474-bfbc-9febcc5113f3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.852562 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fe886795-2501-4474-bfbc-9febcc5113f3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.852581 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fe886795-2501-4474-bfbc-9febcc5113f3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.854557 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fe886795-2501-4474-bfbc-9febcc5113f3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.865135 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzflr\" (UniqueName: \"kubernetes.io/projected/fe886795-2501-4474-bfbc-9febcc5113f3-kube-api-access-zzflr\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.870902 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:09 crc kubenswrapper[4930]: I1012 05:56:09.934852 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:56:10 crc kubenswrapper[4930]: I1012 05:56:10.018608 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 05:56:10 crc kubenswrapper[4930]: W1012 05:56:10.053836 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbad3587e_d515_4add_9edd_da341fe519b7.slice/crio-8fedf8e7d958f38b23fceca5dd59d3b7c1d5d56ff5df608c2df7deae4e7103ec WatchSource:0}: Error finding container 8fedf8e7d958f38b23fceca5dd59d3b7c1d5d56ff5df608c2df7deae4e7103ec: Status 404 returned error can't find the container with id 8fedf8e7d958f38b23fceca5dd59d3b7c1d5d56ff5df608c2df7deae4e7103ec Oct 12 05:56:10 crc kubenswrapper[4930]: I1012 05:56:10.159613 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bad3587e-d515-4add-9edd-da341fe519b7","Type":"ContainerStarted","Data":"8fedf8e7d958f38b23fceca5dd59d3b7c1d5d56ff5df608c2df7deae4e7103ec"} Oct 12 05:56:10 crc kubenswrapper[4930]: I1012 05:56:10.160923 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5449989c59-cnw2z" event={"ID":"c3424140-ca19-4ff4-b19a-c236e8868d38","Type":"ContainerStarted","Data":"78b3feacd68cbc1fabafebaaee6f788b3292e824999b36e5f86bdadce04fd873"} Oct 12 05:56:10 crc kubenswrapper[4930]: I1012 05:56:10.166675 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"1594846a-5c2f-49f8-9bea-22661720c5a6","Type":"ContainerStarted","Data":"d8eba6404d02b1faaa7539fcba9aaf566c0a7cf8cffd5c953d5fea17c8a8b1d4"} Oct 12 05:56:10 crc kubenswrapper[4930]: I1012 05:56:10.458270 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.183599 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.186081 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.188819 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.192173 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fe886795-2501-4474-bfbc-9febcc5113f3","Type":"ContainerStarted","Data":"f9945bf75945e3910574098df1060de9d6a3b6bc351c9cb1ab39a8c4ef341187"} Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.200481 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.200834 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-qgtfd" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.201178 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.202101 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.203716 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.214940 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.271961 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e52fde5b-22df-4fea-ae39-2bb2ef6fa033-config-data-default\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.272050 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e52fde5b-22df-4fea-ae39-2bb2ef6fa033-secrets\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.272148 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5r9t\" (UniqueName: \"kubernetes.io/projected/e52fde5b-22df-4fea-ae39-2bb2ef6fa033-kube-api-access-r5r9t\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.272302 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e52fde5b-22df-4fea-ae39-2bb2ef6fa033-kolla-config\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.272373 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e52fde5b-22df-4fea-ae39-2bb2ef6fa033-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.272394 4930 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e52fde5b-22df-4fea-ae39-2bb2ef6fa033-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.272536 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.272563 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e52fde5b-22df-4fea-ae39-2bb2ef6fa033-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.272655 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e52fde5b-22df-4fea-ae39-2bb2ef6fa033-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.374238 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e52fde5b-22df-4fea-ae39-2bb2ef6fa033-secrets\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.374288 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5r9t\" (UniqueName: \"kubernetes.io/projected/e52fde5b-22df-4fea-ae39-2bb2ef6fa033-kube-api-access-r5r9t\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.374354 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e52fde5b-22df-4fea-ae39-2bb2ef6fa033-kolla-config\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.374391 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e52fde5b-22df-4fea-ae39-2bb2ef6fa033-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.374409 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e52fde5b-22df-4fea-ae39-2bb2ef6fa033-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.374445 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.374470 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e52fde5b-22df-4fea-ae39-2bb2ef6fa033-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.374497 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e52fde5b-22df-4fea-ae39-2bb2ef6fa033-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.374521 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e52fde5b-22df-4fea-ae39-2bb2ef6fa033-config-data-default\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.375632 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e52fde5b-22df-4fea-ae39-2bb2ef6fa033-config-data-default\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.375935 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e52fde5b-22df-4fea-ae39-2bb2ef6fa033-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.375987 4930 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.376937 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e52fde5b-22df-4fea-ae39-2bb2ef6fa033-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.377068 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e52fde5b-22df-4fea-ae39-2bb2ef6fa033-kolla-config\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.385568 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e52fde5b-22df-4fea-ae39-2bb2ef6fa033-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc 
kubenswrapper[4930]: I1012 05:56:11.386017 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e52fde5b-22df-4fea-ae39-2bb2ef6fa033-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.391868 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e52fde5b-22df-4fea-ae39-2bb2ef6fa033-secrets\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.392887 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5r9t\" (UniqueName: \"kubernetes.io/projected/e52fde5b-22df-4fea-ae39-2bb2ef6fa033-kube-api-access-r5r9t\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.400135 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"e52fde5b-22df-4fea-ae39-2bb2ef6fa033\") " pod="openstack/openstack-galera-0" Oct 12 05:56:11 crc kubenswrapper[4930]: I1012 05:56:11.522439 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 12 05:56:12 crc kubenswrapper[4930]: I1012 05:56:12.398228 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 12 05:56:12 crc kubenswrapper[4930]: W1012 05:56:12.423714 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode52fde5b_22df_4fea_ae39_2bb2ef6fa033.slice/crio-229d3954cc36f02991d0b4e3fbe794779cf2cb86931275210c4759582add60e8 WatchSource:0}: Error finding container 229d3954cc36f02991d0b4e3fbe794779cf2cb86931275210c4759582add60e8: Status 404 returned error can't find the container with id 229d3954cc36f02991d0b4e3fbe794779cf2cb86931275210c4759582add60e8 Oct 12 05:56:12 crc kubenswrapper[4930]: I1012 05:56:12.826456 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 12 05:56:12 crc kubenswrapper[4930]: I1012 05:56:12.828682 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:12 crc kubenswrapper[4930]: I1012 05:56:12.833855 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-zm59j" Oct 12 05:56:12 crc kubenswrapper[4930]: I1012 05:56:12.834425 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 12 05:56:12 crc kubenswrapper[4930]: I1012 05:56:12.836943 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 12 05:56:12 crc kubenswrapper[4930]: I1012 05:56:12.837408 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 12 05:56:12 crc kubenswrapper[4930]: I1012 05:56:12.848481 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 12 05:56:12 crc kubenswrapper[4930]: I1012 05:56:12.918342 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:12 crc kubenswrapper[4930]: I1012 05:56:12.920652 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:12 crc kubenswrapper[4930]: I1012 05:56:12.920680 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:12 crc kubenswrapper[4930]: I1012 05:56:12.920695 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stj9p\" (UniqueName: \"kubernetes.io/projected/46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5-kube-api-access-stj9p\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:12 crc kubenswrapper[4930]: I1012 05:56:12.925656 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:12 crc kubenswrapper[4930]: I1012 05:56:12.926493 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:12 crc kubenswrapper[4930]: I1012 05:56:12.926617 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:12 crc kubenswrapper[4930]: I1012 05:56:12.926673 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:12 crc kubenswrapper[4930]: I1012 05:56:12.926696 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:12 crc kubenswrapper[4930]: I1012 05:56:12.978357 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 12 05:56:12 crc kubenswrapper[4930]: I1012 05:56:12.980092 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 12 05:56:12 crc kubenswrapper[4930]: I1012 05:56:12.983279 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 12 05:56:12 crc kubenswrapper[4930]: I1012 05:56:12.983797 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 12 05:56:12 crc kubenswrapper[4930]: I1012 05:56:12.984328 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-7cs78" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.005587 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.032924 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqffh\" (UniqueName: \"kubernetes.io/projected/76348a63-90b8-46b5-8856-da5c983b6d72-kube-api-access-jqffh\") pod \"memcached-0\" (UID: \"76348a63-90b8-46b5-8856-da5c983b6d72\") " pod="openstack/memcached-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.033027 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.033078 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76348a63-90b8-46b5-8856-da5c983b6d72-combined-ca-bundle\") pod \"memcached-0\" (UID: \"76348a63-90b8-46b5-8856-da5c983b6d72\") " pod="openstack/memcached-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.033112 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:13 
crc kubenswrapper[4930]: I1012 05:56:13.033145 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.033179 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.033203 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/76348a63-90b8-46b5-8856-da5c983b6d72-kolla-config\") pod \"memcached-0\" (UID: \"76348a63-90b8-46b5-8856-da5c983b6d72\") " pod="openstack/memcached-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.033253 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76348a63-90b8-46b5-8856-da5c983b6d72-config-data\") pod \"memcached-0\" (UID: \"76348a63-90b8-46b5-8856-da5c983b6d72\") " pod="openstack/memcached-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.033294 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.033311 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.033335 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stj9p\" (UniqueName: \"kubernetes.io/projected/46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5-kube-api-access-stj9p\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.033414 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.033513 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.033593 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/76348a63-90b8-46b5-8856-da5c983b6d72-memcached-tls-certs\") pod \"memcached-0\" (UID: \"76348a63-90b8-46b5-8856-da5c983b6d72\") " pod="openstack/memcached-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.033655 4930 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.033895 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.034962 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.035334 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.035805 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.039966 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.040159 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.052413 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.068421 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stj9p\" (UniqueName: \"kubernetes.io/projected/46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5-kube-api-access-stj9p\") pod \"openstack-cell1-galera-0\" (UID: 
\"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.079038 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5\") " pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.135323 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76348a63-90b8-46b5-8856-da5c983b6d72-combined-ca-bundle\") pod \"memcached-0\" (UID: \"76348a63-90b8-46b5-8856-da5c983b6d72\") " pod="openstack/memcached-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.135397 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/76348a63-90b8-46b5-8856-da5c983b6d72-kolla-config\") pod \"memcached-0\" (UID: \"76348a63-90b8-46b5-8856-da5c983b6d72\") " pod="openstack/memcached-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.135436 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76348a63-90b8-46b5-8856-da5c983b6d72-config-data\") pod \"memcached-0\" (UID: \"76348a63-90b8-46b5-8856-da5c983b6d72\") " pod="openstack/memcached-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.135497 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/76348a63-90b8-46b5-8856-da5c983b6d72-memcached-tls-certs\") pod \"memcached-0\" (UID: \"76348a63-90b8-46b5-8856-da5c983b6d72\") " pod="openstack/memcached-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.135525 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqffh\" (UniqueName: \"kubernetes.io/projected/76348a63-90b8-46b5-8856-da5c983b6d72-kube-api-access-jqffh\") pod \"memcached-0\" (UID: \"76348a63-90b8-46b5-8856-da5c983b6d72\") " pod="openstack/memcached-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.137364 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/76348a63-90b8-46b5-8856-da5c983b6d72-kolla-config\") pod \"memcached-0\" (UID: \"76348a63-90b8-46b5-8856-da5c983b6d72\") " pod="openstack/memcached-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.137885 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76348a63-90b8-46b5-8856-da5c983b6d72-config-data\") pod \"memcached-0\" (UID: \"76348a63-90b8-46b5-8856-da5c983b6d72\") " pod="openstack/memcached-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.146189 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76348a63-90b8-46b5-8856-da5c983b6d72-combined-ca-bundle\") pod \"memcached-0\" (UID: \"76348a63-90b8-46b5-8856-da5c983b6d72\") " pod="openstack/memcached-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.153792 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/76348a63-90b8-46b5-8856-da5c983b6d72-memcached-tls-certs\") pod \"memcached-0\" (UID: 
\"76348a63-90b8-46b5-8856-da5c983b6d72\") " pod="openstack/memcached-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.165650 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqffh\" (UniqueName: \"kubernetes.io/projected/76348a63-90b8-46b5-8856-da5c983b6d72-kube-api-access-jqffh\") pod \"memcached-0\" (UID: \"76348a63-90b8-46b5-8856-da5c983b6d72\") " pod="openstack/memcached-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.186250 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.233300 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e52fde5b-22df-4fea-ae39-2bb2ef6fa033","Type":"ContainerStarted","Data":"229d3954cc36f02991d0b4e3fbe794779cf2cb86931275210c4759582add60e8"} Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.328768 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 12 05:56:13 crc kubenswrapper[4930]: W1012 05:56:13.670449 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dfc5e5_80f4_49c8_bd9d_885e5dcb3fe5.slice/crio-a754a3c5a2207e6fd4f0917be892de11942e844dd49093067d1a66ae9389d0dc WatchSource:0}: Error finding container a754a3c5a2207e6fd4f0917be892de11942e844dd49093067d1a66ae9389d0dc: Status 404 returned error can't find the container with id a754a3c5a2207e6fd4f0917be892de11942e844dd49093067d1a66ae9389d0dc Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.681124 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 12 05:56:13 crc kubenswrapper[4930]: I1012 05:56:13.837110 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 12 05:56:13 crc kubenswrapper[4930]: W1012 05:56:13.852395 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76348a63_90b8_46b5_8856_da5c983b6d72.slice/crio-f603d9de9d9f03ebe82e4ae5c9a79ccebc42059e2d64356b674e3770163e8d4f WatchSource:0}: Error finding container f603d9de9d9f03ebe82e4ae5c9a79ccebc42059e2d64356b674e3770163e8d4f: Status 404 returned error can't find the container with id f603d9de9d9f03ebe82e4ae5c9a79ccebc42059e2d64356b674e3770163e8d4f Oct 12 05:56:14 crc kubenswrapper[4930]: I1012 05:56:14.290233 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"76348a63-90b8-46b5-8856-da5c983b6d72","Type":"ContainerStarted","Data":"f603d9de9d9f03ebe82e4ae5c9a79ccebc42059e2d64356b674e3770163e8d4f"} Oct 12 05:56:14 crc kubenswrapper[4930]: I1012 05:56:14.301916 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5","Type":"ContainerStarted","Data":"a754a3c5a2207e6fd4f0917be892de11942e844dd49093067d1a66ae9389d0dc"} Oct 12 05:56:14 crc kubenswrapper[4930]: I1012 05:56:14.880014 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 12 05:56:14 crc kubenswrapper[4930]: I1012 05:56:14.881257 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 12 05:56:14 crc kubenswrapper[4930]: I1012 05:56:14.885685 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-7qjsm" Oct 12 05:56:14 crc kubenswrapper[4930]: I1012 05:56:14.895228 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 12 05:56:14 crc kubenswrapper[4930]: I1012 05:56:14.986435 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87rq6\" (UniqueName: \"kubernetes.io/projected/86e4fb08-5a1f-4f1e-8cf0-7ef59386d66b-kube-api-access-87rq6\") pod \"kube-state-metrics-0\" (UID: \"86e4fb08-5a1f-4f1e-8cf0-7ef59386d66b\") " pod="openstack/kube-state-metrics-0" Oct 12 05:56:15 crc kubenswrapper[4930]: I1012 05:56:15.091267 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87rq6\" (UniqueName: \"kubernetes.io/projected/86e4fb08-5a1f-4f1e-8cf0-7ef59386d66b-kube-api-access-87rq6\") pod \"kube-state-metrics-0\" (UID: \"86e4fb08-5a1f-4f1e-8cf0-7ef59386d66b\") " pod="openstack/kube-state-metrics-0" Oct 12 05:56:15 crc kubenswrapper[4930]: I1012 05:56:15.158089 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87rq6\" (UniqueName: \"kubernetes.io/projected/86e4fb08-5a1f-4f1e-8cf0-7ef59386d66b-kube-api-access-87rq6\") pod \"kube-state-metrics-0\" (UID: \"86e4fb08-5a1f-4f1e-8cf0-7ef59386d66b\") " pod="openstack/kube-state-metrics-0" Oct 12 05:56:15 crc kubenswrapper[4930]: I1012 05:56:15.223292 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 12 05:56:15 crc kubenswrapper[4930]: I1012 05:56:15.893517 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 12 05:56:15 crc kubenswrapper[4930]: W1012 05:56:15.905518 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86e4fb08_5a1f_4f1e_8cf0_7ef59386d66b.slice/crio-3f900e0e86e6104fce7c6e34d04508a5e51f581c9c394ebd4e4884c908ec1ac2 WatchSource:0}: Error finding container 3f900e0e86e6104fce7c6e34d04508a5e51f581c9c394ebd4e4884c908ec1ac2: Status 404 returned error can't find the container with id 3f900e0e86e6104fce7c6e34d04508a5e51f581c9c394ebd4e4884c908ec1ac2 Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.189312 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.191202 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.194231 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-jt8z4" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.194262 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.194532 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.195144 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.195706 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.202708 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.212914 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.323848 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"86e4fb08-5a1f-4f1e-8cf0-7ef59386d66b","Type":"ContainerStarted","Data":"3f900e0e86e6104fce7c6e34d04508a5e51f581c9c394ebd4e4884c908ec1ac2"} Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.326447 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.326527 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-22a7639f-0965-4497-b4ff-8d976229c443\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22a7639f-0965-4497-b4ff-8d976229c443\") pod \"prometheus-metric-storage-0\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.326613 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.326667 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.326896 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qg2n4\" (UniqueName: \"kubernetes.io/projected/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-kube-api-access-qg2n4\") pod \"prometheus-metric-storage-0\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.326963 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.327028 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-config\") pod \"prometheus-metric-storage-0\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.327159 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.428948 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.429042 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg2n4\" (UniqueName: \"kubernetes.io/projected/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-kube-api-access-qg2n4\") pod \"prometheus-metric-storage-0\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.429070 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.429094 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-config\") pod \"prometheus-metric-storage-0\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.429128 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.429220 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.429247 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-22a7639f-0965-4497-b4ff-8d976229c443\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22a7639f-0965-4497-b4ff-8d976229c443\") pod \"prometheus-metric-storage-0\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.429296 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.431817 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.434776 4930 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.434821 4930 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-22a7639f-0965-4497-b4ff-8d976229c443\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22a7639f-0965-4497-b4ff-8d976229c443\") pod \"prometheus-metric-storage-0\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6e15b61105eae1f7086da32ef53b808da7b93145612971cfb218d121d2d8a399/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.446481 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-config\") pod \"prometheus-metric-storage-0\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.446890 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.447043 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.447360 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.449132 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.465585 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg2n4\" (UniqueName: \"kubernetes.io/projected/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-kube-api-access-qg2n4\") pod \"prometheus-metric-storage-0\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.492006 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-22a7639f-0965-4497-b4ff-8d976229c443\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22a7639f-0965-4497-b4ff-8d976229c443\") pod \"prometheus-metric-storage-0\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:16 crc kubenswrapper[4930]: I1012 05:56:16.523107 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.349067 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.464375 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nw5dm"] Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.467450 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nw5dm" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.473694 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-f7cmb" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.474059 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.474178 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.481044 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nw5dm"] Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.505889 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-hjqfl"] Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.507653 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-hjqfl" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.512752 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hjqfl"] Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.566509 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ddccae59-8916-4bd7-bffa-041cf574e89e-var-log-ovn\") pod \"ovn-controller-nw5dm\" (UID: \"ddccae59-8916-4bd7-bffa-041cf574e89e\") " pod="openstack/ovn-controller-nw5dm" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.566580 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ddccae59-8916-4bd7-bffa-041cf574e89e-var-run\") pod \"ovn-controller-nw5dm\" (UID: \"ddccae59-8916-4bd7-bffa-041cf574e89e\") " pod="openstack/ovn-controller-nw5dm" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.566644 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddccae59-8916-4bd7-bffa-041cf574e89e-ovn-controller-tls-certs\") pod \"ovn-controller-nw5dm\" (UID: \"ddccae59-8916-4bd7-bffa-041cf574e89e\") " pod="openstack/ovn-controller-nw5dm" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.566692 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ddccae59-8916-4bd7-bffa-041cf574e89e-var-run-ovn\") pod \"ovn-controller-nw5dm\" (UID: \"ddccae59-8916-4bd7-bffa-041cf574e89e\") " pod="openstack/ovn-controller-nw5dm" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.566717 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddccae59-8916-4bd7-bffa-041cf574e89e-scripts\") pod \"ovn-controller-nw5dm\" (UID: \"ddccae59-8916-4bd7-bffa-041cf574e89e\") " pod="openstack/ovn-controller-nw5dm" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.566852 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddccae59-8916-4bd7-bffa-041cf574e89e-combined-ca-bundle\") pod \"ovn-controller-nw5dm\" (UID: \"ddccae59-8916-4bd7-bffa-041cf574e89e\") " pod="openstack/ovn-controller-nw5dm" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.566898 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2khz\" (UniqueName: \"kubernetes.io/projected/ddccae59-8916-4bd7-bffa-041cf574e89e-kube-api-access-j2khz\") pod \"ovn-controller-nw5dm\" (UID: \"ddccae59-8916-4bd7-bffa-041cf574e89e\") " pod="openstack/ovn-controller-nw5dm" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.667751 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/382036ff-8896-494d-9670-ec527019676f-var-lib\") pod \"ovn-controller-ovs-hjqfl\" (UID: \"382036ff-8896-494d-9670-ec527019676f\") " pod="openstack/ovn-controller-ovs-hjqfl" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.668128 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/ddccae59-8916-4bd7-bffa-041cf574e89e-var-log-ovn\") pod \"ovn-controller-nw5dm\" (UID: \"ddccae59-8916-4bd7-bffa-041cf574e89e\") " pod="openstack/ovn-controller-nw5dm" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.668156 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ddccae59-8916-4bd7-bffa-041cf574e89e-var-run\") pod \"ovn-controller-nw5dm\" (UID: \"ddccae59-8916-4bd7-bffa-041cf574e89e\") " pod="openstack/ovn-controller-nw5dm" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.668175 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/382036ff-8896-494d-9670-ec527019676f-scripts\") pod \"ovn-controller-ovs-hjqfl\" (UID: \"382036ff-8896-494d-9670-ec527019676f\") " pod="openstack/ovn-controller-ovs-hjqfl" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.668585 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ddccae59-8916-4bd7-bffa-041cf574e89e-var-run\") pod \"ovn-controller-nw5dm\" (UID: \"ddccae59-8916-4bd7-bffa-041cf574e89e\") " pod="openstack/ovn-controller-nw5dm" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.668634 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l4qf\" (UniqueName: \"kubernetes.io/projected/382036ff-8896-494d-9670-ec527019676f-kube-api-access-5l4qf\") pod \"ovn-controller-ovs-hjqfl\" (UID: \"382036ff-8896-494d-9670-ec527019676f\") " pod="openstack/ovn-controller-ovs-hjqfl" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.668662 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ddccae59-8916-4bd7-bffa-041cf574e89e-var-log-ovn\") pod \"ovn-controller-nw5dm\" (UID: \"ddccae59-8916-4bd7-bffa-041cf574e89e\") " pod="openstack/ovn-controller-nw5dm" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.668680 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddccae59-8916-4bd7-bffa-041cf574e89e-ovn-controller-tls-certs\") pod \"ovn-controller-nw5dm\" (UID: \"ddccae59-8916-4bd7-bffa-041cf574e89e\") " pod="openstack/ovn-controller-nw5dm" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.668716 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ddccae59-8916-4bd7-bffa-041cf574e89e-var-run-ovn\") pod \"ovn-controller-nw5dm\" (UID: \"ddccae59-8916-4bd7-bffa-041cf574e89e\") " pod="openstack/ovn-controller-nw5dm" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.668748 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddccae59-8916-4bd7-bffa-041cf574e89e-scripts\") pod \"ovn-controller-nw5dm\" (UID: \"ddccae59-8916-4bd7-bffa-041cf574e89e\") " pod="openstack/ovn-controller-nw5dm" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.668774 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/382036ff-8896-494d-9670-ec527019676f-var-run\") pod \"ovn-controller-ovs-hjqfl\" (UID: \"382036ff-8896-494d-9670-ec527019676f\") " 
pod="openstack/ovn-controller-ovs-hjqfl" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.668797 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddccae59-8916-4bd7-bffa-041cf574e89e-combined-ca-bundle\") pod \"ovn-controller-nw5dm\" (UID: \"ddccae59-8916-4bd7-bffa-041cf574e89e\") " pod="openstack/ovn-controller-nw5dm" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.668820 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/382036ff-8896-494d-9670-ec527019676f-etc-ovs\") pod \"ovn-controller-ovs-hjqfl\" (UID: \"382036ff-8896-494d-9670-ec527019676f\") " pod="openstack/ovn-controller-ovs-hjqfl" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.668842 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2khz\" (UniqueName: \"kubernetes.io/projected/ddccae59-8916-4bd7-bffa-041cf574e89e-kube-api-access-j2khz\") pod \"ovn-controller-nw5dm\" (UID: \"ddccae59-8916-4bd7-bffa-041cf574e89e\") " pod="openstack/ovn-controller-nw5dm" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.668868 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/382036ff-8896-494d-9670-ec527019676f-var-log\") pod \"ovn-controller-ovs-hjqfl\" (UID: \"382036ff-8896-494d-9670-ec527019676f\") " pod="openstack/ovn-controller-ovs-hjqfl" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.669008 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ddccae59-8916-4bd7-bffa-041cf574e89e-var-run-ovn\") pod \"ovn-controller-nw5dm\" (UID: \"ddccae59-8916-4bd7-bffa-041cf574e89e\") " pod="openstack/ovn-controller-nw5dm" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.671178 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddccae59-8916-4bd7-bffa-041cf574e89e-scripts\") pod \"ovn-controller-nw5dm\" (UID: \"ddccae59-8916-4bd7-bffa-041cf574e89e\") " pod="openstack/ovn-controller-nw5dm" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.676782 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddccae59-8916-4bd7-bffa-041cf574e89e-ovn-controller-tls-certs\") pod \"ovn-controller-nw5dm\" (UID: \"ddccae59-8916-4bd7-bffa-041cf574e89e\") " pod="openstack/ovn-controller-nw5dm" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.686034 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddccae59-8916-4bd7-bffa-041cf574e89e-combined-ca-bundle\") pod \"ovn-controller-nw5dm\" (UID: \"ddccae59-8916-4bd7-bffa-041cf574e89e\") " pod="openstack/ovn-controller-nw5dm" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.686425 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2khz\" (UniqueName: \"kubernetes.io/projected/ddccae59-8916-4bd7-bffa-041cf574e89e-kube-api-access-j2khz\") pod \"ovn-controller-nw5dm\" (UID: \"ddccae59-8916-4bd7-bffa-041cf574e89e\") " pod="openstack/ovn-controller-nw5dm" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.770701 4930 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/382036ff-8896-494d-9670-ec527019676f-var-run\") pod \"ovn-controller-ovs-hjqfl\" (UID: \"382036ff-8896-494d-9670-ec527019676f\") " pod="openstack/ovn-controller-ovs-hjqfl" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.770786 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/382036ff-8896-494d-9670-ec527019676f-etc-ovs\") pod \"ovn-controller-ovs-hjqfl\" (UID: \"382036ff-8896-494d-9670-ec527019676f\") " pod="openstack/ovn-controller-ovs-hjqfl" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.770830 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/382036ff-8896-494d-9670-ec527019676f-var-log\") pod \"ovn-controller-ovs-hjqfl\" (UID: \"382036ff-8896-494d-9670-ec527019676f\") " pod="openstack/ovn-controller-ovs-hjqfl" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.770855 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/382036ff-8896-494d-9670-ec527019676f-var-lib\") pod \"ovn-controller-ovs-hjqfl\" (UID: \"382036ff-8896-494d-9670-ec527019676f\") " pod="openstack/ovn-controller-ovs-hjqfl" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.770883 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/382036ff-8896-494d-9670-ec527019676f-scripts\") pod \"ovn-controller-ovs-hjqfl\" (UID: \"382036ff-8896-494d-9670-ec527019676f\") " pod="openstack/ovn-controller-ovs-hjqfl" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.770888 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/382036ff-8896-494d-9670-ec527019676f-var-run\") pod \"ovn-controller-ovs-hjqfl\" (UID: \"382036ff-8896-494d-9670-ec527019676f\") " pod="openstack/ovn-controller-ovs-hjqfl" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.770911 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l4qf\" (UniqueName: \"kubernetes.io/projected/382036ff-8896-494d-9670-ec527019676f-kube-api-access-5l4qf\") pod \"ovn-controller-ovs-hjqfl\" (UID: \"382036ff-8896-494d-9670-ec527019676f\") " pod="openstack/ovn-controller-ovs-hjqfl" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.771259 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/382036ff-8896-494d-9670-ec527019676f-var-log\") pod \"ovn-controller-ovs-hjqfl\" (UID: \"382036ff-8896-494d-9670-ec527019676f\") " pod="openstack/ovn-controller-ovs-hjqfl" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.771357 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/382036ff-8896-494d-9670-ec527019676f-var-lib\") pod \"ovn-controller-ovs-hjqfl\" (UID: \"382036ff-8896-494d-9670-ec527019676f\") " pod="openstack/ovn-controller-ovs-hjqfl" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.771431 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/382036ff-8896-494d-9670-ec527019676f-etc-ovs\") pod \"ovn-controller-ovs-hjqfl\" (UID: \"382036ff-8896-494d-9670-ec527019676f\") " pod="openstack/ovn-controller-ovs-hjqfl" Oct 12 05:56:18 crc 
kubenswrapper[4930]: I1012 05:56:18.774148 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/382036ff-8896-494d-9670-ec527019676f-scripts\") pod \"ovn-controller-ovs-hjqfl\" (UID: \"382036ff-8896-494d-9670-ec527019676f\") " pod="openstack/ovn-controller-ovs-hjqfl" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.788081 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l4qf\" (UniqueName: \"kubernetes.io/projected/382036ff-8896-494d-9670-ec527019676f-kube-api-access-5l4qf\") pod \"ovn-controller-ovs-hjqfl\" (UID: \"382036ff-8896-494d-9670-ec527019676f\") " pod="openstack/ovn-controller-ovs-hjqfl" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.800024 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nw5dm" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.827488 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-hjqfl" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.980969 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.985259 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.988406 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-vq74j" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.988845 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.989170 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.989540 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 12 05:56:18 crc kubenswrapper[4930]: I1012 05:56:18.989652 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.034374 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.082917 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/94bfaf3d-7abe-446f-b5ca-a359c65039b9-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"94bfaf3d-7abe-446f-b5ca-a359c65039b9\") " pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.082971 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/94bfaf3d-7abe-446f-b5ca-a359c65039b9-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"94bfaf3d-7abe-446f-b5ca-a359c65039b9\") " pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.083186 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94bfaf3d-7abe-446f-b5ca-a359c65039b9-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"94bfaf3d-7abe-446f-b5ca-a359c65039b9\") " 
pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.083209 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94bfaf3d-7abe-446f-b5ca-a359c65039b9-config\") pod \"ovsdbserver-nb-0\" (UID: \"94bfaf3d-7abe-446f-b5ca-a359c65039b9\") " pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.083229 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg25j\" (UniqueName: \"kubernetes.io/projected/94bfaf3d-7abe-446f-b5ca-a359c65039b9-kube-api-access-hg25j\") pod \"ovsdbserver-nb-0\" (UID: \"94bfaf3d-7abe-446f-b5ca-a359c65039b9\") " pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.083252 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/94bfaf3d-7abe-446f-b5ca-a359c65039b9-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"94bfaf3d-7abe-446f-b5ca-a359c65039b9\") " pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.083268 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94bfaf3d-7abe-446f-b5ca-a359c65039b9-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"94bfaf3d-7abe-446f-b5ca-a359c65039b9\") " pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.083316 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"94bfaf3d-7abe-446f-b5ca-a359c65039b9\") " pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.191433 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/94bfaf3d-7abe-446f-b5ca-a359c65039b9-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"94bfaf3d-7abe-446f-b5ca-a359c65039b9\") " pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.191509 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/94bfaf3d-7abe-446f-b5ca-a359c65039b9-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"94bfaf3d-7abe-446f-b5ca-a359c65039b9\") " pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.191539 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94bfaf3d-7abe-446f-b5ca-a359c65039b9-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"94bfaf3d-7abe-446f-b5ca-a359c65039b9\") " pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.191581 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94bfaf3d-7abe-446f-b5ca-a359c65039b9-config\") pod \"ovsdbserver-nb-0\" (UID: \"94bfaf3d-7abe-446f-b5ca-a359c65039b9\") " pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.191601 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hg25j\" (UniqueName: \"kubernetes.io/projected/94bfaf3d-7abe-446f-b5ca-a359c65039b9-kube-api-access-hg25j\") pod \"ovsdbserver-nb-0\" (UID: \"94bfaf3d-7abe-446f-b5ca-a359c65039b9\") " pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.191641 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/94bfaf3d-7abe-446f-b5ca-a359c65039b9-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"94bfaf3d-7abe-446f-b5ca-a359c65039b9\") " pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.191662 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94bfaf3d-7abe-446f-b5ca-a359c65039b9-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"94bfaf3d-7abe-446f-b5ca-a359c65039b9\") " pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.191745 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"94bfaf3d-7abe-446f-b5ca-a359c65039b9\") " pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.192104 4930 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"94bfaf3d-7abe-446f-b5ca-a359c65039b9\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.192172 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/94bfaf3d-7abe-446f-b5ca-a359c65039b9-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"94bfaf3d-7abe-446f-b5ca-a359c65039b9\") " pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.193126 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94bfaf3d-7abe-446f-b5ca-a359c65039b9-config\") pod \"ovsdbserver-nb-0\" (UID: \"94bfaf3d-7abe-446f-b5ca-a359c65039b9\") " pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.193246 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94bfaf3d-7abe-446f-b5ca-a359c65039b9-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"94bfaf3d-7abe-446f-b5ca-a359c65039b9\") " pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.200836 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94bfaf3d-7abe-446f-b5ca-a359c65039b9-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"94bfaf3d-7abe-446f-b5ca-a359c65039b9\") " pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.201028 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/94bfaf3d-7abe-446f-b5ca-a359c65039b9-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"94bfaf3d-7abe-446f-b5ca-a359c65039b9\") " pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.201028 4930 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/94bfaf3d-7abe-446f-b5ca-a359c65039b9-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"94bfaf3d-7abe-446f-b5ca-a359c65039b9\") " pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.209981 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg25j\" (UniqueName: \"kubernetes.io/projected/94bfaf3d-7abe-446f-b5ca-a359c65039b9-kube-api-access-hg25j\") pod \"ovsdbserver-nb-0\" (UID: \"94bfaf3d-7abe-446f-b5ca-a359c65039b9\") " pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.234777 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"94bfaf3d-7abe-446f-b5ca-a359c65039b9\") " pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.346375 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.360975 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf","Type":"ContainerStarted","Data":"9535f058afb1ea279023bc65ddbe2acf5bd26c9bdb7ca2ecbdc376000e22aaa1"} Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.391520 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nw5dm"] Oct 12 05:56:19 crc kubenswrapper[4930]: I1012 05:56:19.890919 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hjqfl"] Oct 12 05:56:20 crc kubenswrapper[4930]: W1012 05:56:20.033478 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddccae59_8916_4bd7_bffa_041cf574e89e.slice/crio-38fe1f351d2b92d41f48114abda9326992537fb96077224fae95277ced55c019 WatchSource:0}: Error finding container 38fe1f351d2b92d41f48114abda9326992537fb96077224fae95277ced55c019: Status 404 returned error can't find the container with id 38fe1f351d2b92d41f48114abda9326992537fb96077224fae95277ced55c019 Oct 12 05:56:20 crc kubenswrapper[4930]: W1012 05:56:20.222400 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod382036ff_8896_494d_9670_ec527019676f.slice/crio-d3ed8c026a27daea36c607b06c9dd947fea7f4993f44123ffc3918b4b7afc301 WatchSource:0}: Error finding container d3ed8c026a27daea36c607b06c9dd947fea7f4993f44123ffc3918b4b7afc301: Status 404 returned error can't find the container with id d3ed8c026a27daea36c607b06c9dd947fea7f4993f44123ffc3918b4b7afc301 Oct 12 05:56:20 crc kubenswrapper[4930]: I1012 05:56:20.399441 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nw5dm" event={"ID":"ddccae59-8916-4bd7-bffa-041cf574e89e","Type":"ContainerStarted","Data":"38fe1f351d2b92d41f48114abda9326992537fb96077224fae95277ced55c019"} Oct 12 05:56:20 crc kubenswrapper[4930]: I1012 05:56:20.405048 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hjqfl" event={"ID":"382036ff-8896-494d-9670-ec527019676f","Type":"ContainerStarted","Data":"d3ed8c026a27daea36c607b06c9dd947fea7f4993f44123ffc3918b4b7afc301"} Oct 12 05:56:21 crc kubenswrapper[4930]: 
I1012 05:56:21.009558 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.030963 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-kpwjv"] Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.039230 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-kpwjv" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.041852 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.043788 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-kpwjv"] Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.140268 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78fdc19a-5689-461a-89da-3054932b88c3-config\") pod \"ovn-controller-metrics-kpwjv\" (UID: \"78fdc19a-5689-461a-89da-3054932b88c3\") " pod="openstack/ovn-controller-metrics-kpwjv" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.140338 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/78fdc19a-5689-461a-89da-3054932b88c3-ovn-rundir\") pod \"ovn-controller-metrics-kpwjv\" (UID: \"78fdc19a-5689-461a-89da-3054932b88c3\") " pod="openstack/ovn-controller-metrics-kpwjv" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.140386 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw968\" (UniqueName: \"kubernetes.io/projected/78fdc19a-5689-461a-89da-3054932b88c3-kube-api-access-tw968\") pod \"ovn-controller-metrics-kpwjv\" (UID: \"78fdc19a-5689-461a-89da-3054932b88c3\") " pod="openstack/ovn-controller-metrics-kpwjv" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.140416 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78fdc19a-5689-461a-89da-3054932b88c3-combined-ca-bundle\") pod \"ovn-controller-metrics-kpwjv\" (UID: \"78fdc19a-5689-461a-89da-3054932b88c3\") " pod="openstack/ovn-controller-metrics-kpwjv" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.140438 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/78fdc19a-5689-461a-89da-3054932b88c3-ovs-rundir\") pod \"ovn-controller-metrics-kpwjv\" (UID: \"78fdc19a-5689-461a-89da-3054932b88c3\") " pod="openstack/ovn-controller-metrics-kpwjv" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.140471 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/78fdc19a-5689-461a-89da-3054932b88c3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-kpwjv\" (UID: \"78fdc19a-5689-461a-89da-3054932b88c3\") " pod="openstack/ovn-controller-metrics-kpwjv" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.226078 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6665b8cd9-rr4j9"] Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.242448 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/78fdc19a-5689-461a-89da-3054932b88c3-ovs-rundir\") pod \"ovn-controller-metrics-kpwjv\" (UID: \"78fdc19a-5689-461a-89da-3054932b88c3\") " pod="openstack/ovn-controller-metrics-kpwjv" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.242523 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/78fdc19a-5689-461a-89da-3054932b88c3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-kpwjv\" (UID: \"78fdc19a-5689-461a-89da-3054932b88c3\") " pod="openstack/ovn-controller-metrics-kpwjv" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.242591 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78fdc19a-5689-461a-89da-3054932b88c3-config\") pod \"ovn-controller-metrics-kpwjv\" (UID: \"78fdc19a-5689-461a-89da-3054932b88c3\") " pod="openstack/ovn-controller-metrics-kpwjv" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.242652 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/78fdc19a-5689-461a-89da-3054932b88c3-ovn-rundir\") pod \"ovn-controller-metrics-kpwjv\" (UID: \"78fdc19a-5689-461a-89da-3054932b88c3\") " pod="openstack/ovn-controller-metrics-kpwjv" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.242707 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw968\" (UniqueName: \"kubernetes.io/projected/78fdc19a-5689-461a-89da-3054932b88c3-kube-api-access-tw968\") pod \"ovn-controller-metrics-kpwjv\" (UID: \"78fdc19a-5689-461a-89da-3054932b88c3\") " pod="openstack/ovn-controller-metrics-kpwjv" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.242729 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78fdc19a-5689-461a-89da-3054932b88c3-combined-ca-bundle\") pod \"ovn-controller-metrics-kpwjv\" (UID: \"78fdc19a-5689-461a-89da-3054932b88c3\") " pod="openstack/ovn-controller-metrics-kpwjv" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.243054 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/78fdc19a-5689-461a-89da-3054932b88c3-ovs-rundir\") pod \"ovn-controller-metrics-kpwjv\" (UID: \"78fdc19a-5689-461a-89da-3054932b88c3\") " pod="openstack/ovn-controller-metrics-kpwjv" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.243567 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/78fdc19a-5689-461a-89da-3054932b88c3-ovn-rundir\") pod \"ovn-controller-metrics-kpwjv\" (UID: \"78fdc19a-5689-461a-89da-3054932b88c3\") " pod="openstack/ovn-controller-metrics-kpwjv" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.259451 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78fdc19a-5689-461a-89da-3054932b88c3-config\") pod \"ovn-controller-metrics-kpwjv\" (UID: \"78fdc19a-5689-461a-89da-3054932b88c3\") " pod="openstack/ovn-controller-metrics-kpwjv" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.260220 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fb75c485f-sk8d6"] Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.263281 4930 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fb75c485f-sk8d6" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.265004 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78fdc19a-5689-461a-89da-3054932b88c3-combined-ca-bundle\") pod \"ovn-controller-metrics-kpwjv\" (UID: \"78fdc19a-5689-461a-89da-3054932b88c3\") " pod="openstack/ovn-controller-metrics-kpwjv" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.265832 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/78fdc19a-5689-461a-89da-3054932b88c3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-kpwjv\" (UID: \"78fdc19a-5689-461a-89da-3054932b88c3\") " pod="openstack/ovn-controller-metrics-kpwjv" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.269875 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.276664 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fb75c485f-sk8d6"] Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.318951 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw968\" (UniqueName: \"kubernetes.io/projected/78fdc19a-5689-461a-89da-3054932b88c3-kube-api-access-tw968\") pod \"ovn-controller-metrics-kpwjv\" (UID: \"78fdc19a-5689-461a-89da-3054932b88c3\") " pod="openstack/ovn-controller-metrics-kpwjv" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.373013 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-kpwjv" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.450832 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tgjv\" (UniqueName: \"kubernetes.io/projected/f510e0b4-31e9-4050-a739-3c14f2b61603-kube-api-access-6tgjv\") pod \"dnsmasq-dns-6fb75c485f-sk8d6\" (UID: \"f510e0b4-31e9-4050-a739-3c14f2b61603\") " pod="openstack/dnsmasq-dns-6fb75c485f-sk8d6" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.450926 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f510e0b4-31e9-4050-a739-3c14f2b61603-config\") pod \"dnsmasq-dns-6fb75c485f-sk8d6\" (UID: \"f510e0b4-31e9-4050-a739-3c14f2b61603\") " pod="openstack/dnsmasq-dns-6fb75c485f-sk8d6" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.450995 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f510e0b4-31e9-4050-a739-3c14f2b61603-ovsdbserver-nb\") pod \"dnsmasq-dns-6fb75c485f-sk8d6\" (UID: \"f510e0b4-31e9-4050-a739-3c14f2b61603\") " pod="openstack/dnsmasq-dns-6fb75c485f-sk8d6" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.451061 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f510e0b4-31e9-4050-a739-3c14f2b61603-dns-svc\") pod \"dnsmasq-dns-6fb75c485f-sk8d6\" (UID: \"f510e0b4-31e9-4050-a739-3c14f2b61603\") " pod="openstack/dnsmasq-dns-6fb75c485f-sk8d6" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.554178 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f510e0b4-31e9-4050-a739-3c14f2b61603-dns-svc\") pod \"dnsmasq-dns-6fb75c485f-sk8d6\" (UID: \"f510e0b4-31e9-4050-a739-3c14f2b61603\") " pod="openstack/dnsmasq-dns-6fb75c485f-sk8d6" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.554274 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tgjv\" (UniqueName: \"kubernetes.io/projected/f510e0b4-31e9-4050-a739-3c14f2b61603-kube-api-access-6tgjv\") pod \"dnsmasq-dns-6fb75c485f-sk8d6\" (UID: \"f510e0b4-31e9-4050-a739-3c14f2b61603\") " pod="openstack/dnsmasq-dns-6fb75c485f-sk8d6" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.554316 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f510e0b4-31e9-4050-a739-3c14f2b61603-config\") pod \"dnsmasq-dns-6fb75c485f-sk8d6\" (UID: \"f510e0b4-31e9-4050-a739-3c14f2b61603\") " pod="openstack/dnsmasq-dns-6fb75c485f-sk8d6" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.554401 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f510e0b4-31e9-4050-a739-3c14f2b61603-ovsdbserver-nb\") pod \"dnsmasq-dns-6fb75c485f-sk8d6\" (UID: \"f510e0b4-31e9-4050-a739-3c14f2b61603\") " pod="openstack/dnsmasq-dns-6fb75c485f-sk8d6" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.556637 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f510e0b4-31e9-4050-a739-3c14f2b61603-config\") pod \"dnsmasq-dns-6fb75c485f-sk8d6\" (UID: \"f510e0b4-31e9-4050-a739-3c14f2b61603\") " pod="openstack/dnsmasq-dns-6fb75c485f-sk8d6" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.556980 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f510e0b4-31e9-4050-a739-3c14f2b61603-dns-svc\") pod \"dnsmasq-dns-6fb75c485f-sk8d6\" (UID: \"f510e0b4-31e9-4050-a739-3c14f2b61603\") " pod="openstack/dnsmasq-dns-6fb75c485f-sk8d6" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.561063 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f510e0b4-31e9-4050-a739-3c14f2b61603-ovsdbserver-nb\") pod \"dnsmasq-dns-6fb75c485f-sk8d6\" (UID: \"f510e0b4-31e9-4050-a739-3c14f2b61603\") " pod="openstack/dnsmasq-dns-6fb75c485f-sk8d6" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.584088 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tgjv\" (UniqueName: \"kubernetes.io/projected/f510e0b4-31e9-4050-a739-3c14f2b61603-kube-api-access-6tgjv\") pod \"dnsmasq-dns-6fb75c485f-sk8d6\" (UID: \"f510e0b4-31e9-4050-a739-3c14f2b61603\") " pod="openstack/dnsmasq-dns-6fb75c485f-sk8d6" Oct 12 05:56:21 crc kubenswrapper[4930]: I1012 05:56:21.666192 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fb75c485f-sk8d6" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.584361 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.586050 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.590245 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-9kb59" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.590483 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.595244 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.595574 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.605575 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.694171 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"acd687f2-88b8-4750-9f1a-ba8fa345e290\") " pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.694245 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/acd687f2-88b8-4750-9f1a-ba8fa345e290-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"acd687f2-88b8-4750-9f1a-ba8fa345e290\") " pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.694276 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdtbn\" (UniqueName: \"kubernetes.io/projected/acd687f2-88b8-4750-9f1a-ba8fa345e290-kube-api-access-tdtbn\") pod \"ovsdbserver-sb-0\" (UID: \"acd687f2-88b8-4750-9f1a-ba8fa345e290\") " pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.694296 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acd687f2-88b8-4750-9f1a-ba8fa345e290-config\") pod \"ovsdbserver-sb-0\" (UID: \"acd687f2-88b8-4750-9f1a-ba8fa345e290\") " pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.694376 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/acd687f2-88b8-4750-9f1a-ba8fa345e290-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"acd687f2-88b8-4750-9f1a-ba8fa345e290\") " pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.694408 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd687f2-88b8-4750-9f1a-ba8fa345e290-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"acd687f2-88b8-4750-9f1a-ba8fa345e290\") " pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.694434 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/acd687f2-88b8-4750-9f1a-ba8fa345e290-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"acd687f2-88b8-4750-9f1a-ba8fa345e290\") " pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.694470 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acd687f2-88b8-4750-9f1a-ba8fa345e290-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"acd687f2-88b8-4750-9f1a-ba8fa345e290\") " pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.798959 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/acd687f2-88b8-4750-9f1a-ba8fa345e290-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"acd687f2-88b8-4750-9f1a-ba8fa345e290\") " pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.799050 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd687f2-88b8-4750-9f1a-ba8fa345e290-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"acd687f2-88b8-4750-9f1a-ba8fa345e290\") " pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.799082 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/acd687f2-88b8-4750-9f1a-ba8fa345e290-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"acd687f2-88b8-4750-9f1a-ba8fa345e290\") " pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.799139 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acd687f2-88b8-4750-9f1a-ba8fa345e290-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"acd687f2-88b8-4750-9f1a-ba8fa345e290\") " pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.799179 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"acd687f2-88b8-4750-9f1a-ba8fa345e290\") " pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.799235 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/acd687f2-88b8-4750-9f1a-ba8fa345e290-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"acd687f2-88b8-4750-9f1a-ba8fa345e290\") " pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.799266 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdtbn\" (UniqueName: \"kubernetes.io/projected/acd687f2-88b8-4750-9f1a-ba8fa345e290-kube-api-access-tdtbn\") pod \"ovsdbserver-sb-0\" (UID: \"acd687f2-88b8-4750-9f1a-ba8fa345e290\") " pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.799283 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acd687f2-88b8-4750-9f1a-ba8fa345e290-config\") pod \"ovsdbserver-sb-0\" (UID: \"acd687f2-88b8-4750-9f1a-ba8fa345e290\") " pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.802274 4930 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"acd687f2-88b8-4750-9f1a-ba8fa345e290\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.803446 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acd687f2-88b8-4750-9f1a-ba8fa345e290-config\") pod \"ovsdbserver-sb-0\" (UID: \"acd687f2-88b8-4750-9f1a-ba8fa345e290\") " pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.807524 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd687f2-88b8-4750-9f1a-ba8fa345e290-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"acd687f2-88b8-4750-9f1a-ba8fa345e290\") " pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.807701 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/acd687f2-88b8-4750-9f1a-ba8fa345e290-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"acd687f2-88b8-4750-9f1a-ba8fa345e290\") " pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.808439 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acd687f2-88b8-4750-9f1a-ba8fa345e290-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"acd687f2-88b8-4750-9f1a-ba8fa345e290\") " pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.812095 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/acd687f2-88b8-4750-9f1a-ba8fa345e290-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"acd687f2-88b8-4750-9f1a-ba8fa345e290\") " pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.837062 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/acd687f2-88b8-4750-9f1a-ba8fa345e290-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"acd687f2-88b8-4750-9f1a-ba8fa345e290\") " pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.837380 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdtbn\" (UniqueName: \"kubernetes.io/projected/acd687f2-88b8-4750-9f1a-ba8fa345e290-kube-api-access-tdtbn\") pod \"ovsdbserver-sb-0\" (UID: \"acd687f2-88b8-4750-9f1a-ba8fa345e290\") " pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.899864 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"acd687f2-88b8-4750-9f1a-ba8fa345e290\") " pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:22 crc kubenswrapper[4930]: I1012 05:56:22.914793 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 12 05:56:23 crc kubenswrapper[4930]: W1012 05:56:23.813305 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94bfaf3d_7abe_446f_b5ca_a359c65039b9.slice/crio-81a80db934937d75d81198bf66ac42efc65ccc904add674bde12c7e57507c2a7 WatchSource:0}: Error finding container 81a80db934937d75d81198bf66ac42efc65ccc904add674bde12c7e57507c2a7: Status 404 returned error can't find the container with id 81a80db934937d75d81198bf66ac42efc65ccc904add674bde12c7e57507c2a7 Oct 12 05:56:24 crc kubenswrapper[4930]: I1012 05:56:24.486269 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"86e4fb08-5a1f-4f1e-8cf0-7ef59386d66b","Type":"ContainerStarted","Data":"3fa6d184fb5c0aa4eade4a55080e02d334c5e40b8c240dea9baf23110864fd79"} Oct 12 05:56:24 crc kubenswrapper[4930]: I1012 05:56:24.486976 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 12 05:56:24 crc kubenswrapper[4930]: I1012 05:56:24.493538 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"94bfaf3d-7abe-446f-b5ca-a359c65039b9","Type":"ContainerStarted","Data":"81a80db934937d75d81198bf66ac42efc65ccc904add674bde12c7e57507c2a7"} Oct 12 05:56:24 crc kubenswrapper[4930]: I1012 05:56:24.507236 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.42617167 podStartE2EDuration="10.507221581s" podCreationTimestamp="2025-10-12 05:56:14 +0000 UTC" firstStartedPulling="2025-10-12 05:56:15.908773867 +0000 UTC m=+908.450875622" lastFinishedPulling="2025-10-12 05:56:23.989823768 +0000 UTC m=+916.531925533" observedRunningTime="2025-10-12 05:56:24.506259887 +0000 UTC m=+917.048361652" watchObservedRunningTime="2025-10-12 05:56:24.507221581 +0000 UTC m=+917.049323346" Oct 12 05:56:24 crc kubenswrapper[4930]: I1012 05:56:24.621345 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-kpwjv"] Oct 12 05:56:24 crc kubenswrapper[4930]: W1012 05:56:24.632320 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78fdc19a_5689_461a_89da_3054932b88c3.slice/crio-d735a5f5653cc98ed91beff17ea1a8e15d326e45377a82d84ab035c54858249a WatchSource:0}: Error finding container d735a5f5653cc98ed91beff17ea1a8e15d326e45377a82d84ab035c54858249a: Status 404 returned error can't find the container with id d735a5f5653cc98ed91beff17ea1a8e15d326e45377a82d84ab035c54858249a Oct 12 05:56:24 crc kubenswrapper[4930]: I1012 05:56:24.735986 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fb75c485f-sk8d6"] Oct 12 05:56:24 crc kubenswrapper[4930]: W1012 05:56:24.743427 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf510e0b4_31e9_4050_a739_3c14f2b61603.slice/crio-fd5f21eeb5962b65b4e776dc71aac372a98a2bccafc781c453a07231fbe8cf81 WatchSource:0}: Error finding container fd5f21eeb5962b65b4e776dc71aac372a98a2bccafc781c453a07231fbe8cf81: Status 404 returned error can't find the container with id fd5f21eeb5962b65b4e776dc71aac372a98a2bccafc781c453a07231fbe8cf81 Oct 12 05:56:24 crc kubenswrapper[4930]: I1012 05:56:24.798408 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] 
Oct 12 05:56:24 crc kubenswrapper[4930]: W1012 05:56:24.804140 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacd687f2_88b8_4750_9f1a_ba8fa345e290.slice/crio-815f99e934eeaf2426ac8a3006b1d1801bcb55e2b8ca2841100d78c0f1a09d88 WatchSource:0}: Error finding container 815f99e934eeaf2426ac8a3006b1d1801bcb55e2b8ca2841100d78c0f1a09d88: Status 404 returned error can't find the container with id 815f99e934eeaf2426ac8a3006b1d1801bcb55e2b8ca2841100d78c0f1a09d88 Oct 12 05:56:25 crc kubenswrapper[4930]: I1012 05:56:25.503659 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-kpwjv" event={"ID":"78fdc19a-5689-461a-89da-3054932b88c3","Type":"ContainerStarted","Data":"d735a5f5653cc98ed91beff17ea1a8e15d326e45377a82d84ab035c54858249a"} Oct 12 05:56:25 crc kubenswrapper[4930]: I1012 05:56:25.504833 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"acd687f2-88b8-4750-9f1a-ba8fa345e290","Type":"ContainerStarted","Data":"815f99e934eeaf2426ac8a3006b1d1801bcb55e2b8ca2841100d78c0f1a09d88"} Oct 12 05:56:25 crc kubenswrapper[4930]: I1012 05:56:25.506837 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fb75c485f-sk8d6" event={"ID":"f510e0b4-31e9-4050-a739-3c14f2b61603","Type":"ContainerStarted","Data":"fd5f21eeb5962b65b4e776dc71aac372a98a2bccafc781c453a07231fbe8cf81"} Oct 12 05:56:26 crc kubenswrapper[4930]: I1012 05:56:26.432035 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5449989c59-cnw2z"] Oct 12 05:56:26 crc kubenswrapper[4930]: I1012 05:56:26.473400 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dbf544cc9-5qcnc"] Oct 12 05:56:26 crc kubenswrapper[4930]: I1012 05:56:26.478634 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" Oct 12 05:56:26 crc kubenswrapper[4930]: I1012 05:56:26.484560 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 12 05:56:26 crc kubenswrapper[4930]: I1012 05:56:26.492853 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dbf544cc9-5qcnc"] Oct 12 05:56:26 crc kubenswrapper[4930]: I1012 05:56:26.561579 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf","Type":"ContainerStarted","Data":"1a7049180e067d43b47cdadecbf3e6ea6274e9fb255e153a5e74c52eeade26e4"} Oct 12 05:56:26 crc kubenswrapper[4930]: I1012 05:56:26.579243 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-ovsdbserver-nb\") pod \"dnsmasq-dns-6dbf544cc9-5qcnc\" (UID: \"cb57aef8-53ef-4894-8cfc-1ef708aedd9b\") " pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" Oct 12 05:56:26 crc kubenswrapper[4930]: I1012 05:56:26.579343 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g9pb\" (UniqueName: \"kubernetes.io/projected/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-kube-api-access-4g9pb\") pod \"dnsmasq-dns-6dbf544cc9-5qcnc\" (UID: \"cb57aef8-53ef-4894-8cfc-1ef708aedd9b\") " pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" Oct 12 05:56:26 crc kubenswrapper[4930]: I1012 05:56:26.579386 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-dns-svc\") pod \"dnsmasq-dns-6dbf544cc9-5qcnc\" (UID: \"cb57aef8-53ef-4894-8cfc-1ef708aedd9b\") " pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" Oct 12 05:56:26 crc kubenswrapper[4930]: I1012 05:56:26.579556 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-ovsdbserver-sb\") pod \"dnsmasq-dns-6dbf544cc9-5qcnc\" (UID: \"cb57aef8-53ef-4894-8cfc-1ef708aedd9b\") " pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" Oct 12 05:56:26 crc kubenswrapper[4930]: I1012 05:56:26.579926 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-config\") pod \"dnsmasq-dns-6dbf544cc9-5qcnc\" (UID: \"cb57aef8-53ef-4894-8cfc-1ef708aedd9b\") " pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" Oct 12 05:56:26 crc kubenswrapper[4930]: I1012 05:56:26.681209 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-config\") pod \"dnsmasq-dns-6dbf544cc9-5qcnc\" (UID: \"cb57aef8-53ef-4894-8cfc-1ef708aedd9b\") " pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" Oct 12 05:56:26 crc kubenswrapper[4930]: I1012 05:56:26.681277 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-ovsdbserver-nb\") pod \"dnsmasq-dns-6dbf544cc9-5qcnc\" (UID: \"cb57aef8-53ef-4894-8cfc-1ef708aedd9b\") " pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" Oct 12 05:56:26 crc kubenswrapper[4930]: I1012 
05:56:26.681323 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g9pb\" (UniqueName: \"kubernetes.io/projected/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-kube-api-access-4g9pb\") pod \"dnsmasq-dns-6dbf544cc9-5qcnc\" (UID: \"cb57aef8-53ef-4894-8cfc-1ef708aedd9b\") " pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" Oct 12 05:56:26 crc kubenswrapper[4930]: I1012 05:56:26.681350 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-dns-svc\") pod \"dnsmasq-dns-6dbf544cc9-5qcnc\" (UID: \"cb57aef8-53ef-4894-8cfc-1ef708aedd9b\") " pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" Oct 12 05:56:26 crc kubenswrapper[4930]: I1012 05:56:26.681388 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-ovsdbserver-sb\") pod \"dnsmasq-dns-6dbf544cc9-5qcnc\" (UID: \"cb57aef8-53ef-4894-8cfc-1ef708aedd9b\") " pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" Oct 12 05:56:26 crc kubenswrapper[4930]: I1012 05:56:26.682605 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-ovsdbserver-sb\") pod \"dnsmasq-dns-6dbf544cc9-5qcnc\" (UID: \"cb57aef8-53ef-4894-8cfc-1ef708aedd9b\") " pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" Oct 12 05:56:26 crc kubenswrapper[4930]: I1012 05:56:26.683127 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-config\") pod \"dnsmasq-dns-6dbf544cc9-5qcnc\" (UID: \"cb57aef8-53ef-4894-8cfc-1ef708aedd9b\") " pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" Oct 12 05:56:26 crc kubenswrapper[4930]: I1012 05:56:26.683663 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-ovsdbserver-nb\") pod \"dnsmasq-dns-6dbf544cc9-5qcnc\" (UID: \"cb57aef8-53ef-4894-8cfc-1ef708aedd9b\") " pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" Oct 12 05:56:26 crc kubenswrapper[4930]: I1012 05:56:26.684002 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-dns-svc\") pod \"dnsmasq-dns-6dbf544cc9-5qcnc\" (UID: \"cb57aef8-53ef-4894-8cfc-1ef708aedd9b\") " pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" Oct 12 05:56:26 crc kubenswrapper[4930]: I1012 05:56:26.704014 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g9pb\" (UniqueName: \"kubernetes.io/projected/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-kube-api-access-4g9pb\") pod \"dnsmasq-dns-6dbf544cc9-5qcnc\" (UID: \"cb57aef8-53ef-4894-8cfc-1ef708aedd9b\") " pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" Oct 12 05:56:26 crc kubenswrapper[4930]: I1012 05:56:26.808292 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" Oct 12 05:56:29 crc kubenswrapper[4930]: I1012 05:56:29.352596 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dbf544cc9-5qcnc"] Oct 12 05:56:29 crc kubenswrapper[4930]: W1012 05:56:29.357731 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb57aef8_53ef_4894_8cfc_1ef708aedd9b.slice/crio-397bfe9ba29e11a01d9a2fb68299d0e39ac4c1c46b39490e2e99799f0c74e5aa WatchSource:0}: Error finding container 397bfe9ba29e11a01d9a2fb68299d0e39ac4c1c46b39490e2e99799f0c74e5aa: Status 404 returned error can't find the container with id 397bfe9ba29e11a01d9a2fb68299d0e39ac4c1c46b39490e2e99799f0c74e5aa Oct 12 05:56:29 crc kubenswrapper[4930]: I1012 05:56:29.595699 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" event={"ID":"cb57aef8-53ef-4894-8cfc-1ef708aedd9b","Type":"ContainerStarted","Data":"397bfe9ba29e11a01d9a2fb68299d0e39ac4c1c46b39490e2e99799f0c74e5aa"} Oct 12 05:56:29 crc kubenswrapper[4930]: I1012 05:56:29.598524 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-kpwjv" event={"ID":"78fdc19a-5689-461a-89da-3054932b88c3","Type":"ContainerStarted","Data":"8b03cd18444c28a91e7afd739c432d9a72e23bad0fd51e115d2d3b5662bd599a"} Oct 12 05:56:29 crc kubenswrapper[4930]: I1012 05:56:29.628965 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-kpwjv" podStartSLOduration=4.33126059 podStartE2EDuration="8.628937203s" podCreationTimestamp="2025-10-12 05:56:21 +0000 UTC" firstStartedPulling="2025-10-12 05:56:24.63523549 +0000 UTC m=+917.177337245" lastFinishedPulling="2025-10-12 05:56:28.932912093 +0000 UTC m=+921.475013858" observedRunningTime="2025-10-12 05:56:29.616784489 +0000 UTC m=+922.158886254" watchObservedRunningTime="2025-10-12 05:56:29.628937203 +0000 UTC m=+922.171038968" Oct 12 05:56:33 crc kubenswrapper[4930]: I1012 05:56:33.667672 4930 generic.go:334] "Generic (PLEG): container finished" podID="80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" containerID="1a7049180e067d43b47cdadecbf3e6ea6274e9fb255e153a5e74c52eeade26e4" exitCode=0 Oct 12 05:56:33 crc kubenswrapper[4930]: I1012 05:56:33.667846 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf","Type":"ContainerDied","Data":"1a7049180e067d43b47cdadecbf3e6ea6274e9fb255e153a5e74c52eeade26e4"} Oct 12 05:56:33 crc kubenswrapper[4930]: I1012 05:56:33.669389 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 05:56:33 crc kubenswrapper[4930]: I1012 05:56:33.669476 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 05:56:35 crc kubenswrapper[4930]: I1012 05:56:35.230578 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 12 05:56:47 crc kubenswrapper[4930]: 
E1012 05:56:47.284607 4930 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current" Oct 12 05:56:47 crc kubenswrapper[4930]: E1012 05:56:47.285128 4930 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current" Oct 12 05:56:47 crc kubenswrapper[4930]: E1012 05:56:47.285246 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m7v8w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-notifications-server-0_openstack(1594846a-5c2f-49f8-9bea-22661720c5a6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 05:56:47 crc kubenswrapper[4930]: E1012 05:56:47.286529 4930 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-notifications-server-0" podUID="1594846a-5c2f-49f8-9bea-22661720c5a6" Oct 12 05:56:47 crc kubenswrapper[4930]: E1012 05:56:47.295074 4930 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current" Oct 12 05:56:47 crc kubenswrapper[4930]: E1012 05:56:47.295112 4930 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current" Oct 12 05:56:47 crc kubenswrapper[4930]: E1012 05:56:47.295235 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jhrgr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
rabbitmq-server-0_openstack(bad3587e-d515-4add-9edd-da341fe519b7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 05:56:47 crc kubenswrapper[4930]: E1012 05:56:47.296447 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="bad3587e-d515-4add-9edd-da341fe519b7" Oct 12 05:56:47 crc kubenswrapper[4930]: E1012 05:56:47.802315 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current\\\"\"" pod="openstack/rabbitmq-notifications-server-0" podUID="1594846a-5c2f-49f8-9bea-22661720c5a6" Oct 12 05:56:47 crc kubenswrapper[4930]: E1012 05:56:47.802391 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-rabbitmq:current\\\"\"" pod="openstack/rabbitmq-server-0" podUID="bad3587e-d515-4add-9edd-da341fe519b7" Oct 12 05:56:57 crc kubenswrapper[4930]: E1012 05:56:57.778678 4930 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-mariadb:current" Oct 12 05:56:57 crc kubenswrapper[4930]: E1012 05:56:57.779384 4930 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-mariadb:current" Oct 12 05:56:57 crc kubenswrapper[4930]: E1012 05:56:57.779589 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.rdoproject.org/podified-master-centos10/openstack-mariadb:current,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:DB_ROOT_PASSWORD,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:DbRootPassword,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:secrets,ReadOnly:true,MountPath:/var/lib/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-stj9p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 05:56:57 crc kubenswrapper[4930]: E1012 05:56:57.780939 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5" Oct 12 05:56:57 crc kubenswrapper[4930]: E1012 05:56:57.784605 4930 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Oct 12 05:56:57 crc kubenswrapper[4930]: E1012 05:56:57.784653 4930 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Oct 12 05:56:57 crc kubenswrapper[4930]: E1012 05:56:57.784848 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:init,Image:quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xd9d9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-8468885bfc-zhxs4_openstack(f9e22aa8-e04a-4743-a638-6835e95f5945): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 05:56:57 crc kubenswrapper[4930]: E1012 05:56:57.786379 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-8468885bfc-zhxs4" podUID="f9e22aa8-e04a-4743-a638-6835e95f5945" Oct 12 05:56:57 crc kubenswrapper[4930]: E1012 05:56:57.835926 4930 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Oct 12 05:56:57 crc kubenswrapper[4930]: E1012 05:56:57.835997 4930 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Oct 12 05:56:57 crc kubenswrapper[4930]: E1012 05:56:57.836097 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l8qtc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-545d49fd5c-78lqk_openstack(4b4dd6b0-8862-4c13-83df-abb6c3752260): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 05:56:57 crc kubenswrapper[4930]: E1012 05:56:57.837359 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-545d49fd5c-78lqk" podUID="4b4dd6b0-8862-4c13-83df-abb6c3752260" Oct 12 05:56:57 crc kubenswrapper[4930]: E1012 05:56:57.891370 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-mariadb:current\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5" Oct 12 05:56:59 crc kubenswrapper[4930]: E1012 05:56:59.039846 4930 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Oct 12 05:56:59 crc kubenswrapper[4930]: E1012 05:56:59.040188 4930 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Oct 12 05:56:59 crc kubenswrapper[4930]: E1012 05:56:59.040391 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts 
--keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5c7h56dh5cfh8bh54fhbbhf4h5b9hdch67fhd7h55fh55fh6ch9h548h54ch665h647h6h8fhd6h5dfh5cdh58bh577h66fh695h5fbh55h77h5fcq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z4pkc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5449989c59-cnw2z_openstack(c3424140-ca19-4ff4-b19a-c236e8868d38): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 05:56:59 crc kubenswrapper[4930]: E1012 05:56:59.041661 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5449989c59-cnw2z" podUID="c3424140-ca19-4ff4-b19a-c236e8868d38" Oct 12 05:56:59 crc kubenswrapper[4930]: E1012 05:56:59.729148 4930 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-memcached:current" Oct 12 05:56:59 crc kubenswrapper[4930]: E1012 05:56:59.729213 4930 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-memcached:current" Oct 12 05:56:59 crc kubenswrapper[4930]: E1012 05:56:59.729434 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.rdoproject.org/podified-master-centos10/openstack-memcached:current,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n594hb4hf7h678hf4hfh5c6hbdh5ddh5cdh544h576h576h5d8hd9h655h557h8ch5c6h594h87h74h584h74h5d6hdbh556h579h644h5dfh574h5c5q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jqffh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(76348a63-90b8-46b5-8856-da5c983b6d72): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 05:56:59 crc kubenswrapper[4930]: E1012 05:56:59.730543 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="76348a63-90b8-46b5-8856-da5c983b6d72" Oct 12 05:56:59 crc kubenswrapper[4930]: E1012 05:56:59.774901 4930 log.go:32] 
"PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Oct 12 05:56:59 crc kubenswrapper[4930]: E1012 05:56:59.774945 4930 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Oct 12 05:56:59 crc kubenswrapper[4930]: E1012 05:56:59.775039 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5c6h54h5b5h8fh59chb8h657h5c6hfbhfh4h68fh5f7h9fhc7h594h8hc7h5bfh56ch5fbh688h5bch699h5f6h55fh564h64h5dfh5dch586h75q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6wwqb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6665b8cd9-rr4j9_openstack(e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 05:56:59 crc kubenswrapper[4930]: E1012 05:56:59.776352 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6665b8cd9-rr4j9" podUID="e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b" Oct 12 05:56:59 crc kubenswrapper[4930]: E1012 05:56:59.776945 4930 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ovn-base:current" Oct 12 05:56:59 crc kubenswrapper[4930]: E1012 
05:56:59.776962 4930 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ovn-base:current" Oct 12 05:56:59 crc kubenswrapper[4930]: E1012 05:56:59.777026 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:ovsdb-server-init,Image:quay.rdoproject.org/podified-master-centos10/openstack-ovn-base:current,Command:[/usr/local/bin/container-scripts/init-ovsdb-server.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n98h5d5hcdh87hf5h5h664h94h585h77h588h648h578h559h6h56dhffh655h5f5h665hcdh557h565hb7h679h5b6h689h5fh594hbh8fh5f8q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-ovs,ReadOnly:false,MountPath:/etc/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log,ReadOnly:false,MountPath:/var/log/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-lib,ReadOnly:false,MountPath:/var/lib/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5l4qf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-ovs-hjqfl_openstack(382036ff-8896-494d-9670-ec527019676f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 05:56:59 crc kubenswrapper[4930]: E1012 05:56:59.778151 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-ovs-hjqfl" podUID="382036ff-8896-494d-9670-ec527019676f" Oct 12 05:56:59 crc kubenswrapper[4930]: E1012 05:56:59.810969 4930 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Oct 12 05:56:59 crc kubenswrapper[4930]: E1012 05:56:59.811013 4930 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current" Oct 12 05:56:59 crc kubenswrapper[4930]: E1012 
05:56:59.811219 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-master-centos10/openstack-neutron-server:current,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nf8h67chbdh56bh684h64h69h64fh5ddh67fh5f6h56bhch5c9h595h55ch95h99hcdh584h9dhb7h656h645h649h574h569hb9h5b4h5cdh666hc5q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mrrs7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-589dfdf7-fq2v4_openstack(cbf6abbf-41bf-4616-84ec-9bc01291d120): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 05:56:59 crc kubenswrapper[4930]: E1012 05:56:59.812445 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-589dfdf7-fq2v4" podUID="cbf6abbf-41bf-4616-84ec-9bc01291d120" Oct 12 05:56:59 crc kubenswrapper[4930]: E1012 05:56:59.907447 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-memcached:current\\\"\"" pod="openstack/memcached-0" podUID="76348a63-90b8-46b5-8856-da5c983b6d72" Oct 12 05:56:59 crc kubenswrapper[4930]: E1012 05:56:59.907968 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ovn-base:current\\\"\"" pod="openstack/ovn-controller-ovs-hjqfl" podUID="382036ff-8896-494d-9670-ec527019676f" Oct 12 05:57:01 crc kubenswrapper[4930]: E1012 05:57:01.855239 
4930 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:a0a1d0e39de54c5b2786c2b82d0104f358b479135c069075ddd4f7cd76826c00" Oct 12 05:57:01 crc kubenswrapper[4930]: E1012 05:57:01.855451 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus,Image:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:a0a1d0e39de54c5b2786c2b82d0104f358b479135c069075ddd4f7cd76826c00,Command:[],Args:[--web.console.templates=/etc/prometheus/consoles --web.console.libraries=/etc/prometheus/console_libraries --config.file=/etc/prometheus/config_out/prometheus.env.yaml --web.enable-lifecycle --web.enable-remote-write-receiver --web.route-prefix=/ --storage.tsdb.retention.time=24h --storage.tsdb.path=/prometheus --web.config.file=/etc/prometheus/web_config/web-config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:web,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-out,ReadOnly:true,MountPath:/etc/prometheus/config_out,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-assets,ReadOnly:true,MountPath:/etc/prometheus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-db,ReadOnly:false,MountPath:/prometheus,SubPath:prometheus-db,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-0,ReadOnly:false,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-0,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:web-config,ReadOnly:true,MountPath:/etc/prometheus/web_config/web-config.yaml,SubPath:web-config.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qg2n4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/healthy,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/ready,Port:{1 0 
web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/ready,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:15,SuccessThreshold:1,FailureThreshold:60,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod prometheus-metric-storage-0_openstack(80c83b5b-7221-4d68-b4e1-b8c622bfa7cf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 12 05:57:01 crc kubenswrapper[4930]: I1012 05:57:01.921015 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-545d49fd5c-78lqk" event={"ID":"4b4dd6b0-8862-4c13-83df-abb6c3752260","Type":"ContainerDied","Data":"7d4d0185a16ee99c19787cfa2a7ecd71ed9c2b3e01b98e42fc1511db3d2cad24"} Oct 12 05:57:01 crc kubenswrapper[4930]: I1012 05:57:01.921388 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d4d0185a16ee99c19787cfa2a7ecd71ed9c2b3e01b98e42fc1511db3d2cad24" Oct 12 05:57:01 crc kubenswrapper[4930]: I1012 05:57:01.923119 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8468885bfc-zhxs4" event={"ID":"f9e22aa8-e04a-4743-a638-6835e95f5945","Type":"ContainerDied","Data":"8dc6f450223300e68b84e073f702d9f1e6b8dead0ac5d0ce1d810c2a15bdb277"} Oct 12 05:57:01 crc kubenswrapper[4930]: I1012 05:57:01.923144 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dc6f450223300e68b84e073f702d9f1e6b8dead0ac5d0ce1d810c2a15bdb277" Oct 12 05:57:01 crc kubenswrapper[4930]: I1012 05:57:01.930177 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8468885bfc-zhxs4" Oct 12 05:57:01 crc kubenswrapper[4930]: I1012 05:57:01.940580 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-545d49fd5c-78lqk" Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.053292 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9e22aa8-e04a-4743-a638-6835e95f5945-config\") pod \"f9e22aa8-e04a-4743-a638-6835e95f5945\" (UID: \"f9e22aa8-e04a-4743-a638-6835e95f5945\") " Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.053431 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8qtc\" (UniqueName: \"kubernetes.io/projected/4b4dd6b0-8862-4c13-83df-abb6c3752260-kube-api-access-l8qtc\") pod \"4b4dd6b0-8862-4c13-83df-abb6c3752260\" (UID: \"4b4dd6b0-8862-4c13-83df-abb6c3752260\") " Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.053517 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd9d9\" (UniqueName: \"kubernetes.io/projected/f9e22aa8-e04a-4743-a638-6835e95f5945-kube-api-access-xd9d9\") pod \"f9e22aa8-e04a-4743-a638-6835e95f5945\" (UID: \"f9e22aa8-e04a-4743-a638-6835e95f5945\") " Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.053618 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b4dd6b0-8862-4c13-83df-abb6c3752260-config\") pod \"4b4dd6b0-8862-4c13-83df-abb6c3752260\" (UID: \"4b4dd6b0-8862-4c13-83df-abb6c3752260\") " Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.053894 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b4dd6b0-8862-4c13-83df-abb6c3752260-dns-svc\") pod \"4b4dd6b0-8862-4c13-83df-abb6c3752260\" (UID: \"4b4dd6b0-8862-4c13-83df-abb6c3752260\") " Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.054251 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b4dd6b0-8862-4c13-83df-abb6c3752260-config" (OuterVolumeSpecName: "config") pod "4b4dd6b0-8862-4c13-83df-abb6c3752260" (UID: "4b4dd6b0-8862-4c13-83df-abb6c3752260"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.054495 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b4dd6b0-8862-4c13-83df-abb6c3752260-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4b4dd6b0-8862-4c13-83df-abb6c3752260" (UID: "4b4dd6b0-8862-4c13-83df-abb6c3752260"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.054518 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9e22aa8-e04a-4743-a638-6835e95f5945-config" (OuterVolumeSpecName: "config") pod "f9e22aa8-e04a-4743-a638-6835e95f5945" (UID: "f9e22aa8-e04a-4743-a638-6835e95f5945"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.054867 4930 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b4dd6b0-8862-4c13-83df-abb6c3752260-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.054882 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9e22aa8-e04a-4743-a638-6835e95f5945-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.054890 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b4dd6b0-8862-4c13-83df-abb6c3752260-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.058360 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9e22aa8-e04a-4743-a638-6835e95f5945-kube-api-access-xd9d9" (OuterVolumeSpecName: "kube-api-access-xd9d9") pod "f9e22aa8-e04a-4743-a638-6835e95f5945" (UID: "f9e22aa8-e04a-4743-a638-6835e95f5945"). InnerVolumeSpecName "kube-api-access-xd9d9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.059412 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b4dd6b0-8862-4c13-83df-abb6c3752260-kube-api-access-l8qtc" (OuterVolumeSpecName: "kube-api-access-l8qtc") pod "4b4dd6b0-8862-4c13-83df-abb6c3752260" (UID: "4b4dd6b0-8862-4c13-83df-abb6c3752260"). InnerVolumeSpecName "kube-api-access-l8qtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:57:02 crc kubenswrapper[4930]: E1012 05:57:02.150354 4930 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ovn-sb-db-server:current" Oct 12 05:57:02 crc kubenswrapper[4930]: E1012 05:57:02.150394 4930 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ovn-sb-db-server:current" Oct 12 05:57:02 crc kubenswrapper[4930]: E1012 05:57:02.150565 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ovsdbserver-sb,Image:quay.rdoproject.org/podified-master-centos10/openstack-ovn-sb-db-server:current,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5c9hc7h84h7h684h5c4h99hfbh547h565h6fh7h66dhb6h67ch658h7bh555h5b8h8dh554h675h5bbh65fh696hdh589h65fh664h54fh68fh64fq,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-sb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tdtbn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
ovsdbserver-sb-0_openstack(acd687f2-88b8-4750-9f1a-ba8fa345e290): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.158936 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8qtc\" (UniqueName: \"kubernetes.io/projected/4b4dd6b0-8862-4c13-83df-abb6c3752260-kube-api-access-l8qtc\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.158974 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd9d9\" (UniqueName: \"kubernetes.io/projected/f9e22aa8-e04a-4743-a638-6835e95f5945-kube-api-access-xd9d9\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:02 crc kubenswrapper[4930]: E1012 05:57:02.329853 4930 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current" Oct 12 05:57:02 crc kubenswrapper[4930]: E1012 05:57:02.329903 4930 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current" Oct 12 05:57:02 crc kubenswrapper[4930]: E1012 05:57:02.330034 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-controller,Image:quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key --ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n98h5d5hcdh87hf5h5h664h94h585h77h588h648h578h559h6h56dhffh655h5f5h665hcdh557h565hb7h679h5b6h689h5fh594hbh8fh5f8q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j2khz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHand
ler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-nw5dm_openstack(ddccae59-8916-4bd7-bffa-041cf574e89e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 05:57:02 crc kubenswrapper[4930]: E1012 05:57:02.331235 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-nw5dm" podUID="ddccae59-8916-4bd7-bffa-041cf574e89e" Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.363441 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5449989c59-cnw2z" Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.369323 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6665b8cd9-rr4j9" Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.379974 4930 util.go:48] "No ready sandbox for pod can be found. 
Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.469423 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4pkc\" (UniqueName: \"kubernetes.io/projected/c3424140-ca19-4ff4-b19a-c236e8868d38-kube-api-access-z4pkc\") pod \"c3424140-ca19-4ff4-b19a-c236e8868d38\" (UID: \"c3424140-ca19-4ff4-b19a-c236e8868d38\") "
Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.469583 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b-config\") pod \"e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b\" (UID: \"e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b\") "
Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.469783 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wwqb\" (UniqueName: \"kubernetes.io/projected/e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b-kube-api-access-6wwqb\") pod \"e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b\" (UID: \"e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b\") "
Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.469807 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3424140-ca19-4ff4-b19a-c236e8868d38-config\") pod \"c3424140-ca19-4ff4-b19a-c236e8868d38\" (UID: \"c3424140-ca19-4ff4-b19a-c236e8868d38\") "
Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.469835 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b-dns-svc\") pod \"e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b\" (UID: \"e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b\") "
Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.470183 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b-config" (OuterVolumeSpecName: "config") pod "e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b" (UID: "e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.470195 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3424140-ca19-4ff4-b19a-c236e8868d38-config" (OuterVolumeSpecName: "config") pod "c3424140-ca19-4ff4-b19a-c236e8868d38" (UID: "c3424140-ca19-4ff4-b19a-c236e8868d38"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.470301 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3424140-ca19-4ff4-b19a-c236e8868d38-dns-svc\") pod \"c3424140-ca19-4ff4-b19a-c236e8868d38\" (UID: \"c3424140-ca19-4ff4-b19a-c236e8868d38\") "
Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.470624 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b" (UID: "e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.470915 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3424140-ca19-4ff4-b19a-c236e8868d38-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c3424140-ca19-4ff4-b19a-c236e8868d38" (UID: "c3424140-ca19-4ff4-b19a-c236e8868d38"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.471077 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3424140-ca19-4ff4-b19a-c236e8868d38-config\") on node \"crc\" DevicePath \"\""
Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.471097 4930 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.471106 4930 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3424140-ca19-4ff4-b19a-c236e8868d38-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.471115 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b-config\") on node \"crc\" DevicePath \"\""
Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.472110 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3424140-ca19-4ff4-b19a-c236e8868d38-kube-api-access-z4pkc" (OuterVolumeSpecName: "kube-api-access-z4pkc") pod "c3424140-ca19-4ff4-b19a-c236e8868d38" (UID: "c3424140-ca19-4ff4-b19a-c236e8868d38"). InnerVolumeSpecName "kube-api-access-z4pkc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.472530 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b-kube-api-access-6wwqb" (OuterVolumeSpecName: "kube-api-access-6wwqb") pod "e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b" (UID: "e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b"). InnerVolumeSpecName "kube-api-access-6wwqb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.572791 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrrs7\" (UniqueName: \"kubernetes.io/projected/cbf6abbf-41bf-4616-84ec-9bc01291d120-kube-api-access-mrrs7\") pod \"cbf6abbf-41bf-4616-84ec-9bc01291d120\" (UID: \"cbf6abbf-41bf-4616-84ec-9bc01291d120\") "
Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.572998 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbf6abbf-41bf-4616-84ec-9bc01291d120-dns-svc\") pod \"cbf6abbf-41bf-4616-84ec-9bc01291d120\" (UID: \"cbf6abbf-41bf-4616-84ec-9bc01291d120\") "
Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.573054 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbf6abbf-41bf-4616-84ec-9bc01291d120-config\") pod \"cbf6abbf-41bf-4616-84ec-9bc01291d120\" (UID: \"cbf6abbf-41bf-4616-84ec-9bc01291d120\") "
Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.573531 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4pkc\" (UniqueName: \"kubernetes.io/projected/c3424140-ca19-4ff4-b19a-c236e8868d38-kube-api-access-z4pkc\") on node \"crc\" DevicePath \"\""
Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.573564 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wwqb\" (UniqueName: \"kubernetes.io/projected/e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b-kube-api-access-6wwqb\") on node \"crc\" DevicePath \"\""
Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.574192 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbf6abbf-41bf-4616-84ec-9bc01291d120-config" (OuterVolumeSpecName: "config") pod "cbf6abbf-41bf-4616-84ec-9bc01291d120" (UID: "cbf6abbf-41bf-4616-84ec-9bc01291d120"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.574225 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbf6abbf-41bf-4616-84ec-9bc01291d120-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cbf6abbf-41bf-4616-84ec-9bc01291d120" (UID: "cbf6abbf-41bf-4616-84ec-9bc01291d120"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.577883 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbf6abbf-41bf-4616-84ec-9bc01291d120-kube-api-access-mrrs7" (OuterVolumeSpecName: "kube-api-access-mrrs7") pod "cbf6abbf-41bf-4616-84ec-9bc01291d120" (UID: "cbf6abbf-41bf-4616-84ec-9bc01291d120"). InnerVolumeSpecName "kube-api-access-mrrs7". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:57:02 crc kubenswrapper[4930]: E1012 05:57:02.588663 4930 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ovn-nb-db-server:current" Oct 12 05:57:02 crc kubenswrapper[4930]: E1012 05:57:02.588696 4930 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ovn-nb-db-server:current" Oct 12 05:57:02 crc kubenswrapper[4930]: E1012 05:57:02.588824 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovsdbserver-nb,Image:quay.rdoproject.org/podified-master-centos10/openstack-ovn-nb-db-server:current,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5h5dfhffh587h568hb4h648h5c5h667hb4h95h6chc4h5bbh68bh8bh658h697hd8h688h654h567h67hb5h54dh5c8h6ch75h5fdh655hd8h588q,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-nb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hg25j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof 
ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-nb-0_openstack(94bfaf3d-7abe-446f-b5ca-a359c65039b9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.675664 4930 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbf6abbf-41bf-4616-84ec-9bc01291d120-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.675689 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbf6abbf-41bf-4616-84ec-9bc01291d120-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.675699 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrrs7\" (UniqueName: \"kubernetes.io/projected/cbf6abbf-41bf-4616-84ec-9bc01291d120-kube-api-access-mrrs7\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:02 crc kubenswrapper[4930]: E1012 05:57:02.902055 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-sb-0" podUID="acd687f2-88b8-4750-9f1a-ba8fa345e290" Oct 12 05:57:02 crc kubenswrapper[4930]: E1012 05:57:02.923440 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-nb-0" podUID="94bfaf3d-7abe-446f-b5ca-a359c65039b9" Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.948408 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e52fde5b-22df-4fea-ae39-2bb2ef6fa033","Type":"ContainerStarted","Data":"31167ee21fb29ddf06f80988f5e68bd8384f8e8e88abc1c66f1e041fc53e1ea2"} Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.950294 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589dfdf7-fq2v4" event={"ID":"cbf6abbf-41bf-4616-84ec-9bc01291d120","Type":"ContainerDied","Data":"a9708018d46ef09d1279bb4e404f629c61461aca504053e877fec7dc5531f946"} Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.950333 
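Annotation (added; not part of the journal): once a pull fails, a pod alternates between ErrImagePull (an attempt just failed) and ImagePullBackOff (the kubelet is deliberately waiting before retrying), as the records around here show for ovsdbserver-sb-0, ovsdbserver-nb-0 and ovn-controller-nw5dm. To the best of my knowledge the image-pull back-off doubles from a 10s base up to a 5m cap; treat those numbers as assumptions, not documented fact. A sketch of the doubling schedule:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Assumed kubelet defaults: 10s initial back-off, doubling, capped at 5m.
        delay, cap := 10*time.Second, 5*time.Minute
        for attempt := 1; attempt <= 7; attempt++ {
            fmt.Printf("retry %d after %v\n", attempt, delay)
            if delay *= 2; delay > cap {
                delay = cap
            }
        }
    }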
4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589dfdf7-fq2v4" Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.965563 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"acd687f2-88b8-4750-9f1a-ba8fa345e290","Type":"ContainerStarted","Data":"ee2c63fb9206def47030c74b782757934880650bdc888669a1e7cf55ae4f7ba6"} Oct 12 05:57:02 crc kubenswrapper[4930]: E1012 05:57:02.966959 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ovn-sb-db-server:current\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="acd687f2-88b8-4750-9f1a-ba8fa345e290" Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.967322 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5449989c59-cnw2z" Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.966718 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5449989c59-cnw2z" event={"ID":"c3424140-ca19-4ff4-b19a-c236e8868d38","Type":"ContainerDied","Data":"78b3feacd68cbc1fabafebaaee6f788b3292e824999b36e5f86bdadce04fd873"} Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.970424 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"94bfaf3d-7abe-446f-b5ca-a359c65039b9","Type":"ContainerStarted","Data":"d633286ae35e0509b8694aa3a66843d8786bddf8b0cbaf5366a04ad575a5533b"} Oct 12 05:57:02 crc kubenswrapper[4930]: E1012 05:57:02.970592 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ovn-nb-db-server:current\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="94bfaf3d-7abe-446f-b5ca-a359c65039b9" Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.971678 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-545d49fd5c-78lqk" Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.973278 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6665b8cd9-rr4j9" event={"ID":"e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b","Type":"ContainerDied","Data":"0e2a981eb4f13758764cf8208237a9ab6ec705314bc5fe9aa0254a66bcb636b4"} Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.973435 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8468885bfc-zhxs4" Oct 12 05:57:02 crc kubenswrapper[4930]: I1012 05:57:02.973581 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6665b8cd9-rr4j9" Oct 12 05:57:02 crc kubenswrapper[4930]: E1012 05:57:02.974376 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current\\\"\"" pod="openstack/ovn-controller-nw5dm" podUID="ddccae59-8916-4bd7-bffa-041cf574e89e" Oct 12 05:57:03 crc kubenswrapper[4930]: I1012 05:57:03.146894 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589dfdf7-fq2v4"] Oct 12 05:57:03 crc kubenswrapper[4930]: I1012 05:57:03.149048 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589dfdf7-fq2v4"] Oct 12 05:57:03 crc kubenswrapper[4930]: I1012 05:57:03.164075 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5449989c59-cnw2z"] Oct 12 05:57:03 crc kubenswrapper[4930]: I1012 05:57:03.170036 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5449989c59-cnw2z"] Oct 12 05:57:03 crc kubenswrapper[4930]: I1012 05:57:03.197126 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8468885bfc-zhxs4"] Oct 12 05:57:03 crc kubenswrapper[4930]: I1012 05:57:03.201960 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8468885bfc-zhxs4"] Oct 12 05:57:03 crc kubenswrapper[4930]: I1012 05:57:03.227974 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6665b8cd9-rr4j9"] Oct 12 05:57:03 crc kubenswrapper[4930]: I1012 05:57:03.234692 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6665b8cd9-rr4j9"] Oct 12 05:57:03 crc kubenswrapper[4930]: I1012 05:57:03.254514 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-545d49fd5c-78lqk"] Oct 12 05:57:03 crc kubenswrapper[4930]: I1012 05:57:03.259237 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-545d49fd5c-78lqk"] Oct 12 05:57:03 crc kubenswrapper[4930]: I1012 05:57:03.669856 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 05:57:03 crc kubenswrapper[4930]: I1012 05:57:03.669927 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 05:57:03 crc kubenswrapper[4930]: I1012 05:57:03.979141 4930 generic.go:334] "Generic (PLEG): container finished" podID="f510e0b4-31e9-4050-a739-3c14f2b61603" containerID="e856cb52bfb532f468433b2ccabb65a1caeeeb1b16747c67e6cb9aafd15b9d50" exitCode=0 Oct 12 05:57:03 crc kubenswrapper[4930]: I1012 05:57:03.979204 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fb75c485f-sk8d6" event={"ID":"f510e0b4-31e9-4050-a739-3c14f2b61603","Type":"ContainerDied","Data":"e856cb52bfb532f468433b2ccabb65a1caeeeb1b16747c67e6cb9aafd15b9d50"} Oct 12 05:57:03 crc kubenswrapper[4930]: I1012 05:57:03.981425 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fe886795-2501-4474-bfbc-9febcc5113f3","Type":"ContainerStarted","Data":"65c472105bc514ff6ba9f1922a44a2d105ac01e044a675fd9c167f35ca166f27"} Oct 12 05:57:03 crc kubenswrapper[4930]: I1012 05:57:03.982906 4930 generic.go:334] "Generic (PLEG): container finished" podID="cb57aef8-53ef-4894-8cfc-1ef708aedd9b" containerID="e0e1140a5cba334180f40ebc8f97714083b477e5eed97b3ed477c0ddf80a4da3" exitCode=0 Oct 12 05:57:03 crc kubenswrapper[4930]: I1012 05:57:03.983017 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" event={"ID":"cb57aef8-53ef-4894-8cfc-1ef708aedd9b","Type":"ContainerDied","Data":"e0e1140a5cba334180f40ebc8f97714083b477e5eed97b3ed477c0ddf80a4da3"} Oct 12 05:57:03 crc kubenswrapper[4930]: E1012 05:57:03.983767 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ovn-sb-db-server:current\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="acd687f2-88b8-4750-9f1a-ba8fa345e290" Oct 12 05:57:03 crc kubenswrapper[4930]: E1012 05:57:03.988239 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ovn-nb-db-server:current\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="94bfaf3d-7abe-446f-b5ca-a359c65039b9" Oct 12 05:57:04 crc kubenswrapper[4930]: I1012 05:57:04.143369 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b4dd6b0-8862-4c13-83df-abb6c3752260" path="/var/lib/kubelet/pods/4b4dd6b0-8862-4c13-83df-abb6c3752260/volumes" Oct 12 05:57:04 crc kubenswrapper[4930]: I1012 05:57:04.144018 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3424140-ca19-4ff4-b19a-c236e8868d38" path="/var/lib/kubelet/pods/c3424140-ca19-4ff4-b19a-c236e8868d38/volumes" Oct 12 05:57:04 crc kubenswrapper[4930]: I1012 05:57:04.144846 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbf6abbf-41bf-4616-84ec-9bc01291d120" path="/var/lib/kubelet/pods/cbf6abbf-41bf-4616-84ec-9bc01291d120/volumes" Oct 12 05:57:04 crc kubenswrapper[4930]: I1012 05:57:04.145256 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b" path="/var/lib/kubelet/pods/e314b90f-0ecb-4ab1-b7da-86f2f6d1c36b/volumes" Oct 12 05:57:04 crc kubenswrapper[4930]: I1012 05:57:04.145648 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9e22aa8-e04a-4743-a638-6835e95f5945" path="/var/lib/kubelet/pods/f9e22aa8-e04a-4743-a638-6835e95f5945/volumes" Oct 12 05:57:04 crc kubenswrapper[4930]: I1012 05:57:04.995225 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fb75c485f-sk8d6" event={"ID":"f510e0b4-31e9-4050-a739-3c14f2b61603","Type":"ContainerStarted","Data":"8d2786f7eb8bfaa38e48e68e67442c01381f24e4b10b0244bd990b2877d6213b"} Oct 12 05:57:04 crc kubenswrapper[4930]: I1012 05:57:04.995822 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6fb75c485f-sk8d6" Oct 12 05:57:05 crc kubenswrapper[4930]: I1012 05:57:05.001652 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" 
event={"ID":"cb57aef8-53ef-4894-8cfc-1ef708aedd9b","Type":"ContainerStarted","Data":"3b9bbb5e4f685880b9051a8d13dd6db431ac050309db653872f1d0aa3e037de9"} Oct 12 05:57:05 crc kubenswrapper[4930]: I1012 05:57:05.001887 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" Oct 12 05:57:05 crc kubenswrapper[4930]: I1012 05:57:05.004770 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"1594846a-5c2f-49f8-9bea-22661720c5a6","Type":"ContainerStarted","Data":"2c4c847aa56622d93cb41315862b13e7089f36b84899c67e01fbcca66b8a6997"} Oct 12 05:57:05 crc kubenswrapper[4930]: I1012 05:57:05.008950 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bad3587e-d515-4add-9edd-da341fe519b7","Type":"ContainerStarted","Data":"600c87b339b6336ad0bdb7f69f659fd2b7a9dc3ae9650c061969bc4d027d0a09"} Oct 12 05:57:05 crc kubenswrapper[4930]: I1012 05:57:05.032891 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6fb75c485f-sk8d6" podStartSLOduration=6.226167231 podStartE2EDuration="44.032863054s" podCreationTimestamp="2025-10-12 05:56:21 +0000 UTC" firstStartedPulling="2025-10-12 05:56:24.750466019 +0000 UTC m=+917.292567784" lastFinishedPulling="2025-10-12 05:57:02.557161842 +0000 UTC m=+955.099263607" observedRunningTime="2025-10-12 05:57:05.022935805 +0000 UTC m=+957.565037630" watchObservedRunningTime="2025-10-12 05:57:05.032863054 +0000 UTC m=+957.574964859" Oct 12 05:57:05 crc kubenswrapper[4930]: I1012 05:57:05.098204 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" podStartSLOduration=5.832252466 podStartE2EDuration="39.098175691s" podCreationTimestamp="2025-10-12 05:56:26 +0000 UTC" firstStartedPulling="2025-10-12 05:56:29.360797431 +0000 UTC m=+921.902899196" lastFinishedPulling="2025-10-12 05:57:02.626720656 +0000 UTC m=+955.168822421" observedRunningTime="2025-10-12 05:57:05.093299379 +0000 UTC m=+957.635401184" watchObservedRunningTime="2025-10-12 05:57:05.098175691 +0000 UTC m=+957.640277466" Oct 12 05:57:06 crc kubenswrapper[4930]: I1012 05:57:06.019641 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf","Type":"ContainerStarted","Data":"d411ed8b138f62ed0e0dca34c8730b2c8aff7bd99556f08a347d77ea720596dd"} Oct 12 05:57:08 crc kubenswrapper[4930]: I1012 05:57:08.040211 4930 generic.go:334] "Generic (PLEG): container finished" podID="e52fde5b-22df-4fea-ae39-2bb2ef6fa033" containerID="31167ee21fb29ddf06f80988f5e68bd8384f8e8e88abc1c66f1e041fc53e1ea2" exitCode=0 Oct 12 05:57:08 crc kubenswrapper[4930]: I1012 05:57:08.040349 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e52fde5b-22df-4fea-ae39-2bb2ef6fa033","Type":"ContainerDied","Data":"31167ee21fb29ddf06f80988f5e68bd8384f8e8e88abc1c66f1e041fc53e1ea2"} Oct 12 05:57:08 crc kubenswrapper[4930]: E1012 05:57:08.385896 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/prometheus-metric-storage-0" podUID="80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" Oct 12 05:57:09 crc kubenswrapper[4930]: I1012 05:57:09.057407 4930 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf","Type":"ContainerStarted","Data":"7211811e0ae74c9852f66da818742cd8b77f9649efedc77809964b20d1e9d3c1"} Oct 12 05:57:09 crc kubenswrapper[4930]: E1012 05:57:09.060141 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:a0a1d0e39de54c5b2786c2b82d0104f358b479135c069075ddd4f7cd76826c00\\\"\"" pod="openstack/prometheus-metric-storage-0" podUID="80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" Oct 12 05:57:09 crc kubenswrapper[4930]: I1012 05:57:09.062509 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e52fde5b-22df-4fea-ae39-2bb2ef6fa033","Type":"ContainerStarted","Data":"d4a62d9e2adbf63f3ad024e7c4b1f8dad0b2238b730f7558459b0bac87e59236"} Oct 12 05:57:09 crc kubenswrapper[4930]: I1012 05:57:09.148324 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=9.278855602 podStartE2EDuration="59.148291986s" podCreationTimestamp="2025-10-12 05:56:10 +0000 UTC" firstStartedPulling="2025-10-12 05:56:12.430822487 +0000 UTC m=+904.972924252" lastFinishedPulling="2025-10-12 05:57:02.300258871 +0000 UTC m=+954.842360636" observedRunningTime="2025-10-12 05:57:09.136313626 +0000 UTC m=+961.678415461" watchObservedRunningTime="2025-10-12 05:57:09.148291986 +0000 UTC m=+961.690393801" Oct 12 05:57:10 crc kubenswrapper[4930]: E1012 05:57:10.075138 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:a0a1d0e39de54c5b2786c2b82d0104f358b479135c069075ddd4f7cd76826c00\\\"\"" pod="openstack/prometheus-metric-storage-0" podUID="80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" Oct 12 05:57:11 crc kubenswrapper[4930]: I1012 05:57:11.082937 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"76348a63-90b8-46b5-8856-da5c983b6d72","Type":"ContainerStarted","Data":"62b4669ce9f08cc1ea9fe18d7fe990716698c820f82a80bb700e4095208aa48b"} Oct 12 05:57:11 crc kubenswrapper[4930]: I1012 05:57:11.083542 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 12 05:57:11 crc kubenswrapper[4930]: I1012 05:57:11.106163 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.601753418 podStartE2EDuration="59.106144195s" podCreationTimestamp="2025-10-12 05:56:12 +0000 UTC" firstStartedPulling="2025-10-12 05:56:13.85519593 +0000 UTC m=+906.397297685" lastFinishedPulling="2025-10-12 05:57:10.359586677 +0000 UTC m=+962.901688462" observedRunningTime="2025-10-12 05:57:11.10315827 +0000 UTC m=+963.645260065" watchObservedRunningTime="2025-10-12 05:57:11.106144195 +0000 UTC m=+963.648245970" Oct 12 05:57:11 crc kubenswrapper[4930]: I1012 05:57:11.522363 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 12 05:57:11 crc kubenswrapper[4930]: I1012 05:57:11.522791 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 12 05:57:11 crc kubenswrapper[4930]: I1012 05:57:11.667987 4930 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6fb75c485f-sk8d6" Oct 12 05:57:11 crc kubenswrapper[4930]: I1012 05:57:11.809938 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" Oct 12 05:57:11 crc kubenswrapper[4930]: I1012 05:57:11.861905 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fb75c485f-sk8d6"] Oct 12 05:57:12 crc kubenswrapper[4930]: I1012 05:57:12.092358 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5","Type":"ContainerStarted","Data":"610aeeaefb6791b87e9080dda8ed5cd1d77fe3b9e05c2a4da5822f1bc2c3cda8"} Oct 12 05:57:12 crc kubenswrapper[4930]: I1012 05:57:12.092484 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6fb75c485f-sk8d6" podUID="f510e0b4-31e9-4050-a739-3c14f2b61603" containerName="dnsmasq-dns" containerID="cri-o://8d2786f7eb8bfaa38e48e68e67442c01381f24e4b10b0244bd990b2877d6213b" gracePeriod=10 Oct 12 05:57:12 crc kubenswrapper[4930]: I1012 05:57:12.564269 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fb75c485f-sk8d6" Oct 12 05:57:12 crc kubenswrapper[4930]: I1012 05:57:12.676629 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f510e0b4-31e9-4050-a739-3c14f2b61603-dns-svc\") pod \"f510e0b4-31e9-4050-a739-3c14f2b61603\" (UID: \"f510e0b4-31e9-4050-a739-3c14f2b61603\") " Oct 12 05:57:12 crc kubenswrapper[4930]: I1012 05:57:12.676689 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f510e0b4-31e9-4050-a739-3c14f2b61603-config\") pod \"f510e0b4-31e9-4050-a739-3c14f2b61603\" (UID: \"f510e0b4-31e9-4050-a739-3c14f2b61603\") " Oct 12 05:57:12 crc kubenswrapper[4930]: I1012 05:57:12.676801 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f510e0b4-31e9-4050-a739-3c14f2b61603-ovsdbserver-nb\") pod \"f510e0b4-31e9-4050-a739-3c14f2b61603\" (UID: \"f510e0b4-31e9-4050-a739-3c14f2b61603\") " Oct 12 05:57:12 crc kubenswrapper[4930]: I1012 05:57:12.676865 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tgjv\" (UniqueName: \"kubernetes.io/projected/f510e0b4-31e9-4050-a739-3c14f2b61603-kube-api-access-6tgjv\") pod \"f510e0b4-31e9-4050-a739-3c14f2b61603\" (UID: \"f510e0b4-31e9-4050-a739-3c14f2b61603\") " Oct 12 05:57:12 crc kubenswrapper[4930]: I1012 05:57:12.686023 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f510e0b4-31e9-4050-a739-3c14f2b61603-kube-api-access-6tgjv" (OuterVolumeSpecName: "kube-api-access-6tgjv") pod "f510e0b4-31e9-4050-a739-3c14f2b61603" (UID: "f510e0b4-31e9-4050-a739-3c14f2b61603"). InnerVolumeSpecName "kube-api-access-6tgjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:57:12 crc kubenswrapper[4930]: I1012 05:57:12.718297 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f510e0b4-31e9-4050-a739-3c14f2b61603-config" (OuterVolumeSpecName: "config") pod "f510e0b4-31e9-4050-a739-3c14f2b61603" (UID: "f510e0b4-31e9-4050-a739-3c14f2b61603"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:57:12 crc kubenswrapper[4930]: I1012 05:57:12.724384 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f510e0b4-31e9-4050-a739-3c14f2b61603-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f510e0b4-31e9-4050-a739-3c14f2b61603" (UID: "f510e0b4-31e9-4050-a739-3c14f2b61603"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:57:12 crc kubenswrapper[4930]: I1012 05:57:12.732703 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f510e0b4-31e9-4050-a739-3c14f2b61603-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f510e0b4-31e9-4050-a739-3c14f2b61603" (UID: "f510e0b4-31e9-4050-a739-3c14f2b61603"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:57:12 crc kubenswrapper[4930]: I1012 05:57:12.779259 4930 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f510e0b4-31e9-4050-a739-3c14f2b61603-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:12 crc kubenswrapper[4930]: I1012 05:57:12.779290 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tgjv\" (UniqueName: \"kubernetes.io/projected/f510e0b4-31e9-4050-a739-3c14f2b61603-kube-api-access-6tgjv\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:12 crc kubenswrapper[4930]: I1012 05:57:12.779305 4930 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f510e0b4-31e9-4050-a739-3c14f2b61603-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:12 crc kubenswrapper[4930]: I1012 05:57:12.779314 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f510e0b4-31e9-4050-a739-3c14f2b61603-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:13 crc kubenswrapper[4930]: I1012 05:57:13.104303 4930 generic.go:334] "Generic (PLEG): container finished" podID="f510e0b4-31e9-4050-a739-3c14f2b61603" containerID="8d2786f7eb8bfaa38e48e68e67442c01381f24e4b10b0244bd990b2877d6213b" exitCode=0 Oct 12 05:57:13 crc kubenswrapper[4930]: I1012 05:57:13.104351 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fb75c485f-sk8d6" event={"ID":"f510e0b4-31e9-4050-a739-3c14f2b61603","Type":"ContainerDied","Data":"8d2786f7eb8bfaa38e48e68e67442c01381f24e4b10b0244bd990b2877d6213b"} Oct 12 05:57:13 crc kubenswrapper[4930]: I1012 05:57:13.104377 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fb75c485f-sk8d6" event={"ID":"f510e0b4-31e9-4050-a739-3c14f2b61603","Type":"ContainerDied","Data":"fd5f21eeb5962b65b4e776dc71aac372a98a2bccafc781c453a07231fbe8cf81"} Oct 12 05:57:13 crc kubenswrapper[4930]: I1012 05:57:13.104393 4930 scope.go:117] "RemoveContainer" containerID="8d2786f7eb8bfaa38e48e68e67442c01381f24e4b10b0244bd990b2877d6213b" Oct 12 05:57:13 crc kubenswrapper[4930]: I1012 05:57:13.104389 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fb75c485f-sk8d6" Oct 12 05:57:13 crc kubenswrapper[4930]: I1012 05:57:13.126036 4930 scope.go:117] "RemoveContainer" containerID="e856cb52bfb532f468433b2ccabb65a1caeeeb1b16747c67e6cb9aafd15b9d50" Oct 12 05:57:13 crc kubenswrapper[4930]: I1012 05:57:13.160772 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fb75c485f-sk8d6"] Oct 12 05:57:13 crc kubenswrapper[4930]: I1012 05:57:13.165730 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fb75c485f-sk8d6"] Oct 12 05:57:13 crc kubenswrapper[4930]: I1012 05:57:13.170772 4930 scope.go:117] "RemoveContainer" containerID="8d2786f7eb8bfaa38e48e68e67442c01381f24e4b10b0244bd990b2877d6213b" Oct 12 05:57:13 crc kubenswrapper[4930]: E1012 05:57:13.171223 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d2786f7eb8bfaa38e48e68e67442c01381f24e4b10b0244bd990b2877d6213b\": container with ID starting with 8d2786f7eb8bfaa38e48e68e67442c01381f24e4b10b0244bd990b2877d6213b not found: ID does not exist" containerID="8d2786f7eb8bfaa38e48e68e67442c01381f24e4b10b0244bd990b2877d6213b" Oct 12 05:57:13 crc kubenswrapper[4930]: I1012 05:57:13.171256 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d2786f7eb8bfaa38e48e68e67442c01381f24e4b10b0244bd990b2877d6213b"} err="failed to get container status \"8d2786f7eb8bfaa38e48e68e67442c01381f24e4b10b0244bd990b2877d6213b\": rpc error: code = NotFound desc = could not find container \"8d2786f7eb8bfaa38e48e68e67442c01381f24e4b10b0244bd990b2877d6213b\": container with ID starting with 8d2786f7eb8bfaa38e48e68e67442c01381f24e4b10b0244bd990b2877d6213b not found: ID does not exist" Oct 12 05:57:13 crc kubenswrapper[4930]: I1012 05:57:13.171279 4930 scope.go:117] "RemoveContainer" containerID="e856cb52bfb532f468433b2ccabb65a1caeeeb1b16747c67e6cb9aafd15b9d50" Oct 12 05:57:13 crc kubenswrapper[4930]: E1012 05:57:13.171541 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e856cb52bfb532f468433b2ccabb65a1caeeeb1b16747c67e6cb9aafd15b9d50\": container with ID starting with e856cb52bfb532f468433b2ccabb65a1caeeeb1b16747c67e6cb9aafd15b9d50 not found: ID does not exist" containerID="e856cb52bfb532f468433b2ccabb65a1caeeeb1b16747c67e6cb9aafd15b9d50" Oct 12 05:57:13 crc kubenswrapper[4930]: I1012 05:57:13.171578 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e856cb52bfb532f468433b2ccabb65a1caeeeb1b16747c67e6cb9aafd15b9d50"} err="failed to get container status \"e856cb52bfb532f468433b2ccabb65a1caeeeb1b16747c67e6cb9aafd15b9d50\": rpc error: code = NotFound desc = could not find container \"e856cb52bfb532f468433b2ccabb65a1caeeeb1b16747c67e6cb9aafd15b9d50\": container with ID starting with e856cb52bfb532f468433b2ccabb65a1caeeeb1b16747c67e6cb9aafd15b9d50 not found: ID does not exist" Oct 12 05:57:14 crc kubenswrapper[4930]: I1012 05:57:14.116809 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hjqfl" event={"ID":"382036ff-8896-494d-9670-ec527019676f","Type":"ContainerStarted","Data":"d168ce03f3f2d28d298179bb813022964f497594964a01fd97466d30d8994ec2"} Oct 12 05:57:14 crc kubenswrapper[4930]: I1012 05:57:14.155440 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f510e0b4-31e9-4050-a739-3c14f2b61603" 
path="/var/lib/kubelet/pods/f510e0b4-31e9-4050-a739-3c14f2b61603/volumes" Oct 12 05:57:15 crc kubenswrapper[4930]: I1012 05:57:15.130041 4930 generic.go:334] "Generic (PLEG): container finished" podID="382036ff-8896-494d-9670-ec527019676f" containerID="d168ce03f3f2d28d298179bb813022964f497594964a01fd97466d30d8994ec2" exitCode=0 Oct 12 05:57:15 crc kubenswrapper[4930]: I1012 05:57:15.130099 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hjqfl" event={"ID":"382036ff-8896-494d-9670-ec527019676f","Type":"ContainerDied","Data":"d168ce03f3f2d28d298179bb813022964f497594964a01fd97466d30d8994ec2"} Oct 12 05:57:15 crc kubenswrapper[4930]: I1012 05:57:15.617164 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 12 05:57:15 crc kubenswrapper[4930]: I1012 05:57:15.720544 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 12 05:57:16 crc kubenswrapper[4930]: I1012 05:57:16.143551 4930 generic.go:334] "Generic (PLEG): container finished" podID="46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5" containerID="610aeeaefb6791b87e9080dda8ed5cd1d77fe3b9e05c2a4da5822f1bc2c3cda8" exitCode=0 Oct 12 05:57:16 crc kubenswrapper[4930]: I1012 05:57:16.145193 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"acd687f2-88b8-4750-9f1a-ba8fa345e290","Type":"ContainerStarted","Data":"f6d4ef812cfb70eb419e91af6fc0e7ca1ca31acbafae5e7d5400c63586d69604"} Oct 12 05:57:16 crc kubenswrapper[4930]: I1012 05:57:16.145247 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5","Type":"ContainerDied","Data":"610aeeaefb6791b87e9080dda8ed5cd1d77fe3b9e05c2a4da5822f1bc2c3cda8"} Oct 12 05:57:16 crc kubenswrapper[4930]: I1012 05:57:16.146023 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hjqfl" event={"ID":"382036ff-8896-494d-9670-ec527019676f","Type":"ContainerStarted","Data":"97ab08dbbb7e91311c4dc83727b9d5391b5e4e6f925d81e4e200213769554afa"} Oct 12 05:57:16 crc kubenswrapper[4930]: I1012 05:57:16.146059 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hjqfl" event={"ID":"382036ff-8896-494d-9670-ec527019676f","Type":"ContainerStarted","Data":"bfba993579ca37f15349bd69cea38d690ac4bf02b382e585dcef96f93929a649"} Oct 12 05:57:16 crc kubenswrapper[4930]: I1012 05:57:16.146273 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-hjqfl" Oct 12 05:57:16 crc kubenswrapper[4930]: I1012 05:57:16.146365 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-hjqfl" Oct 12 05:57:16 crc kubenswrapper[4930]: I1012 05:57:16.147681 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nw5dm" event={"ID":"ddccae59-8916-4bd7-bffa-041cf574e89e","Type":"ContainerStarted","Data":"30e0e05d42475e249ee463f3c4fdc6269238e50a2d090f9e44f4c5a0a6055b45"} Oct 12 05:57:16 crc kubenswrapper[4930]: I1012 05:57:16.148014 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-nw5dm" Oct 12 05:57:16 crc kubenswrapper[4930]: I1012 05:57:16.211249 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-hjqfl" podStartSLOduration=5.071166504 podStartE2EDuration="58.21123264s" 
podCreationTimestamp="2025-10-12 05:56:18 +0000 UTC" firstStartedPulling="2025-10-12 05:56:20.23158151 +0000 UTC m=+912.773683275" lastFinishedPulling="2025-10-12 05:57:13.371647646 +0000 UTC m=+965.913749411" observedRunningTime="2025-10-12 05:57:16.188978433 +0000 UTC m=+968.731080198" watchObservedRunningTime="2025-10-12 05:57:16.21123264 +0000 UTC m=+968.753334405" Oct 12 05:57:16 crc kubenswrapper[4930]: I1012 05:57:16.211669 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-nw5dm" podStartSLOduration=2.881780002 podStartE2EDuration="58.211664681s" podCreationTimestamp="2025-10-12 05:56:18 +0000 UTC" firstStartedPulling="2025-10-12 05:56:20.036079429 +0000 UTC m=+912.578181194" lastFinishedPulling="2025-10-12 05:57:15.365964048 +0000 UTC m=+967.908065873" observedRunningTime="2025-10-12 05:57:16.210603495 +0000 UTC m=+968.752705270" watchObservedRunningTime="2025-10-12 05:57:16.211664681 +0000 UTC m=+968.753766446" Oct 12 05:57:16 crc kubenswrapper[4930]: I1012 05:57:16.238288 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.693853711 podStartE2EDuration="55.238265248s" podCreationTimestamp="2025-10-12 05:56:21 +0000 UTC" firstStartedPulling="2025-10-12 05:56:24.808038283 +0000 UTC m=+917.350140058" lastFinishedPulling="2025-10-12 05:57:15.35244982 +0000 UTC m=+967.894551595" observedRunningTime="2025-10-12 05:57:16.233083598 +0000 UTC m=+968.775185363" watchObservedRunningTime="2025-10-12 05:57:16.238265248 +0000 UTC m=+968.780367013" Oct 12 05:57:16 crc kubenswrapper[4930]: I1012 05:57:16.916293 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 12 05:57:17 crc kubenswrapper[4930]: I1012 05:57:17.161456 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5","Type":"ContainerStarted","Data":"e5ad361163a65f5c249f087c7810648e2ff2f400b7734ecd857234992fb49889"} Oct 12 05:57:17 crc kubenswrapper[4930]: I1012 05:57:17.198184 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371970.65661 podStartE2EDuration="1m6.198165855s" podCreationTimestamp="2025-10-12 05:56:11 +0000 UTC" firstStartedPulling="2025-10-12 05:56:13.673231868 +0000 UTC m=+906.215333633" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:57:17.191145979 +0000 UTC m=+969.733247794" watchObservedRunningTime="2025-10-12 05:57:17.198165855 +0000 UTC m=+969.740267630" Oct 12 05:57:17 crc kubenswrapper[4930]: I1012 05:57:17.916160 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 12 05:57:18 crc kubenswrapper[4930]: I1012 05:57:18.330873 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 12 05:57:19 crc kubenswrapper[4930]: I1012 05:57:19.184508 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"94bfaf3d-7abe-446f-b5ca-a359c65039b9","Type":"ContainerStarted","Data":"990383cee843be9f8b7dcad37714264d5c2933f894e634cf5db495c8553f400a"} Oct 12 05:57:19 crc kubenswrapper[4930]: I1012 05:57:19.348074 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 12 05:57:19 crc kubenswrapper[4930]: I1012 05:57:19.348157 4930 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 12 05:57:19 crc kubenswrapper[4930]: I1012 05:57:19.965539 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 12 05:57:20 crc kubenswrapper[4930]: I1012 05:57:20.000512 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=8.530578358 podStartE2EDuration="1m3.000489596s" podCreationTimestamp="2025-10-12 05:56:17 +0000 UTC" firstStartedPulling="2025-10-12 05:56:23.884047006 +0000 UTC m=+916.426148771" lastFinishedPulling="2025-10-12 05:57:18.353958204 +0000 UTC m=+970.896060009" observedRunningTime="2025-10-12 05:57:19.210649824 +0000 UTC m=+971.752751629" watchObservedRunningTime="2025-10-12 05:57:20.000489596 +0000 UTC m=+972.542591361" Oct 12 05:57:22 crc kubenswrapper[4930]: I1012 05:57:22.221210 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf","Type":"ContainerStarted","Data":"1a5c179b00bc7ad1bf2c025dadbf9a3b667bb42a416c99d5cad006fe189debea"} Oct 12 05:57:22 crc kubenswrapper[4930]: I1012 05:57:22.274334 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=4.616218349 podStartE2EDuration="1m7.274301223s" podCreationTimestamp="2025-10-12 05:56:15 +0000 UTC" firstStartedPulling="2025-10-12 05:56:18.394318836 +0000 UTC m=+910.936420591" lastFinishedPulling="2025-10-12 05:57:21.05240169 +0000 UTC m=+973.594503465" observedRunningTime="2025-10-12 05:57:22.262631341 +0000 UTC m=+974.804733146" watchObservedRunningTime="2025-10-12 05:57:22.274301223 +0000 UTC m=+974.816403018" Oct 12 05:57:22 crc kubenswrapper[4930]: I1012 05:57:22.410636 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 12 05:57:22 crc kubenswrapper[4930]: I1012 05:57:22.975246 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 12 05:57:23 crc kubenswrapper[4930]: I1012 05:57:23.066599 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-s7sp9"] Oct 12 05:57:23 crc kubenswrapper[4930]: E1012 05:57:23.067644 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f510e0b4-31e9-4050-a739-3c14f2b61603" containerName="init" Oct 12 05:57:23 crc kubenswrapper[4930]: I1012 05:57:23.067675 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="f510e0b4-31e9-4050-a739-3c14f2b61603" containerName="init" Oct 12 05:57:23 crc kubenswrapper[4930]: E1012 05:57:23.067689 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f510e0b4-31e9-4050-a739-3c14f2b61603" containerName="dnsmasq-dns" Oct 12 05:57:23 crc kubenswrapper[4930]: I1012 05:57:23.067696 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="f510e0b4-31e9-4050-a739-3c14f2b61603" containerName="dnsmasq-dns" Oct 12 05:57:23 crc kubenswrapper[4930]: I1012 05:57:23.067932 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="f510e0b4-31e9-4050-a739-3c14f2b61603" containerName="dnsmasq-dns" Oct 12 05:57:23 crc kubenswrapper[4930]: I1012 05:57:23.069037 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-s7sp9" Oct 12 05:57:23 crc kubenswrapper[4930]: I1012 05:57:23.072193 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-s7sp9"] Oct 12 05:57:23 crc kubenswrapper[4930]: I1012 05:57:23.183244 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf4bn\" (UniqueName: \"kubernetes.io/projected/a8cd3366-214d-44fe-bef6-99a5522924c0-kube-api-access-nf4bn\") pod \"keystone-db-create-s7sp9\" (UID: \"a8cd3366-214d-44fe-bef6-99a5522924c0\") " pod="openstack/keystone-db-create-s7sp9" Oct 12 05:57:23 crc kubenswrapper[4930]: I1012 05:57:23.187543 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 12 05:57:23 crc kubenswrapper[4930]: I1012 05:57:23.187581 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 12 05:57:23 crc kubenswrapper[4930]: I1012 05:57:23.281070 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-n2tb8"] Oct 12 05:57:23 crc kubenswrapper[4930]: I1012 05:57:23.282072 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-n2tb8" Oct 12 05:57:23 crc kubenswrapper[4930]: I1012 05:57:23.284531 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf4bn\" (UniqueName: \"kubernetes.io/projected/a8cd3366-214d-44fe-bef6-99a5522924c0-kube-api-access-nf4bn\") pod \"keystone-db-create-s7sp9\" (UID: \"a8cd3366-214d-44fe-bef6-99a5522924c0\") " pod="openstack/keystone-db-create-s7sp9" Oct 12 05:57:23 crc kubenswrapper[4930]: I1012 05:57:23.293145 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 12 05:57:23 crc kubenswrapper[4930]: I1012 05:57:23.299181 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-n2tb8"] Oct 12 05:57:23 crc kubenswrapper[4930]: I1012 05:57:23.311404 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf4bn\" (UniqueName: \"kubernetes.io/projected/a8cd3366-214d-44fe-bef6-99a5522924c0-kube-api-access-nf4bn\") pod \"keystone-db-create-s7sp9\" (UID: \"a8cd3366-214d-44fe-bef6-99a5522924c0\") " pod="openstack/keystone-db-create-s7sp9" Oct 12 05:57:23 crc kubenswrapper[4930]: I1012 05:57:23.386648 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c4c5\" (UniqueName: \"kubernetes.io/projected/065099a3-1832-4ac1-9654-080eee86c974-kube-api-access-7c4c5\") pod \"placement-db-create-n2tb8\" (UID: \"065099a3-1832-4ac1-9654-080eee86c974\") " pod="openstack/placement-db-create-n2tb8" Oct 12 05:57:23 crc kubenswrapper[4930]: I1012 05:57:23.395987 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-s7sp9" Oct 12 05:57:23 crc kubenswrapper[4930]: I1012 05:57:23.403232 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 12 05:57:23 crc kubenswrapper[4930]: I1012 05:57:23.489040 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c4c5\" (UniqueName: \"kubernetes.io/projected/065099a3-1832-4ac1-9654-080eee86c974-kube-api-access-7c4c5\") pod \"placement-db-create-n2tb8\" (UID: \"065099a3-1832-4ac1-9654-080eee86c974\") " pod="openstack/placement-db-create-n2tb8" Oct 12 05:57:23 crc kubenswrapper[4930]: I1012 05:57:23.511717 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c4c5\" (UniqueName: \"kubernetes.io/projected/065099a3-1832-4ac1-9654-080eee86c974-kube-api-access-7c4c5\") pod \"placement-db-create-n2tb8\" (UID: \"065099a3-1832-4ac1-9654-080eee86c974\") " pod="openstack/placement-db-create-n2tb8" Oct 12 05:57:23 crc kubenswrapper[4930]: I1012 05:57:23.605588 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-n2tb8" Oct 12 05:57:23 crc kubenswrapper[4930]: I1012 05:57:23.905259 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-s7sp9"] Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.102396 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-n2tb8"] Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.239505 4930 generic.go:334] "Generic (PLEG): container finished" podID="a8cd3366-214d-44fe-bef6-99a5522924c0" containerID="45d0f645d2ed4c0fe05ec4a344139c07459e51cc4421e6696bd2bfb516a7ab36" exitCode=0 Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.239565 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-s7sp9" event={"ID":"a8cd3366-214d-44fe-bef6-99a5522924c0","Type":"ContainerDied","Data":"45d0f645d2ed4c0fe05ec4a344139c07459e51cc4421e6696bd2bfb516a7ab36"} Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.239591 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-s7sp9" event={"ID":"a8cd3366-214d-44fe-bef6-99a5522924c0","Type":"ContainerStarted","Data":"d316516a64c0340417125f2651d6770ec805fce0039bfdc1537b5af8180a7acd"} Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.241060 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-n2tb8" event={"ID":"065099a3-1832-4ac1-9654-080eee86c974","Type":"ContainerStarted","Data":"f44a37c1c6d1e3b9d76f167a2e93c1980a89703851d4138de20c4bd6d9cf22e1"} Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.426931 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.585575 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.588321 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.618260 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-btg5h" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.618506 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.618652 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.618794 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.630948 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.714065 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4580a3d-faab-45c2-a7a6-ef2802549ef9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b4580a3d-faab-45c2-a7a6-ef2802549ef9\") " pod="openstack/ovn-northd-0" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.714269 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4580a3d-faab-45c2-a7a6-ef2802549ef9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b4580a3d-faab-45c2-a7a6-ef2802549ef9\") " pod="openstack/ovn-northd-0" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.714451 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4580a3d-faab-45c2-a7a6-ef2802549ef9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b4580a3d-faab-45c2-a7a6-ef2802549ef9\") " pod="openstack/ovn-northd-0" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.714495 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdjfh\" (UniqueName: \"kubernetes.io/projected/b4580a3d-faab-45c2-a7a6-ef2802549ef9-kube-api-access-tdjfh\") pod \"ovn-northd-0\" (UID: \"b4580a3d-faab-45c2-a7a6-ef2802549ef9\") " pod="openstack/ovn-northd-0" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.714650 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4580a3d-faab-45c2-a7a6-ef2802549ef9-scripts\") pod \"ovn-northd-0\" (UID: \"b4580a3d-faab-45c2-a7a6-ef2802549ef9\") " pod="openstack/ovn-northd-0" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.714708 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b4580a3d-faab-45c2-a7a6-ef2802549ef9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b4580a3d-faab-45c2-a7a6-ef2802549ef9\") " pod="openstack/ovn-northd-0" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.714858 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4580a3d-faab-45c2-a7a6-ef2802549ef9-config\") pod \"ovn-northd-0\" (UID: \"b4580a3d-faab-45c2-a7a6-ef2802549ef9\") " pod="openstack/ovn-northd-0" Oct 12 05:57:24 crc kubenswrapper[4930]: 
I1012 05:57:24.817191 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4580a3d-faab-45c2-a7a6-ef2802549ef9-scripts\") pod \"ovn-northd-0\" (UID: \"b4580a3d-faab-45c2-a7a6-ef2802549ef9\") " pod="openstack/ovn-northd-0" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.817281 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b4580a3d-faab-45c2-a7a6-ef2802549ef9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b4580a3d-faab-45c2-a7a6-ef2802549ef9\") " pod="openstack/ovn-northd-0" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.817367 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4580a3d-faab-45c2-a7a6-ef2802549ef9-config\") pod \"ovn-northd-0\" (UID: \"b4580a3d-faab-45c2-a7a6-ef2802549ef9\") " pod="openstack/ovn-northd-0" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.817461 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4580a3d-faab-45c2-a7a6-ef2802549ef9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b4580a3d-faab-45c2-a7a6-ef2802549ef9\") " pod="openstack/ovn-northd-0" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.817505 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4580a3d-faab-45c2-a7a6-ef2802549ef9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b4580a3d-faab-45c2-a7a6-ef2802549ef9\") " pod="openstack/ovn-northd-0" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.817607 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4580a3d-faab-45c2-a7a6-ef2802549ef9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b4580a3d-faab-45c2-a7a6-ef2802549ef9\") " pod="openstack/ovn-northd-0" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.817640 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdjfh\" (UniqueName: \"kubernetes.io/projected/b4580a3d-faab-45c2-a7a6-ef2802549ef9-kube-api-access-tdjfh\") pod \"ovn-northd-0\" (UID: \"b4580a3d-faab-45c2-a7a6-ef2802549ef9\") " pod="openstack/ovn-northd-0" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.818107 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4580a3d-faab-45c2-a7a6-ef2802549ef9-scripts\") pod \"ovn-northd-0\" (UID: \"b4580a3d-faab-45c2-a7a6-ef2802549ef9\") " pod="openstack/ovn-northd-0" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.818148 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4580a3d-faab-45c2-a7a6-ef2802549ef9-config\") pod \"ovn-northd-0\" (UID: \"b4580a3d-faab-45c2-a7a6-ef2802549ef9\") " pod="openstack/ovn-northd-0" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.818501 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b4580a3d-faab-45c2-a7a6-ef2802549ef9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b4580a3d-faab-45c2-a7a6-ef2802549ef9\") " pod="openstack/ovn-northd-0" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.828463 4930 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4580a3d-faab-45c2-a7a6-ef2802549ef9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b4580a3d-faab-45c2-a7a6-ef2802549ef9\") " pod="openstack/ovn-northd-0" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.833035 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4580a3d-faab-45c2-a7a6-ef2802549ef9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b4580a3d-faab-45c2-a7a6-ef2802549ef9\") " pod="openstack/ovn-northd-0" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.835289 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdjfh\" (UniqueName: \"kubernetes.io/projected/b4580a3d-faab-45c2-a7a6-ef2802549ef9-kube-api-access-tdjfh\") pod \"ovn-northd-0\" (UID: \"b4580a3d-faab-45c2-a7a6-ef2802549ef9\") " pod="openstack/ovn-northd-0" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.835462 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4580a3d-faab-45c2-a7a6-ef2802549ef9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b4580a3d-faab-45c2-a7a6-ef2802549ef9\") " pod="openstack/ovn-northd-0" Oct 12 05:57:24 crc kubenswrapper[4930]: I1012 05:57:24.941426 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.268405 4930 generic.go:334] "Generic (PLEG): container finished" podID="065099a3-1832-4ac1-9654-080eee86c974" containerID="8fa8efc30466226e8621280addac73a3c2576c77641c3e9c6f2db92911dd05a6" exitCode=0 Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.269001 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-n2tb8" event={"ID":"065099a3-1832-4ac1-9654-080eee86c974","Type":"ContainerDied","Data":"8fa8efc30466226e8621280addac73a3c2576c77641c3e9c6f2db92911dd05a6"} Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.301362 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.391805 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-gzvqc"] Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.392991 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-gzvqc" Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.426829 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76f9c4c8bc-6hjqx"] Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.428273 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.438842 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6z9l\" (UniqueName: \"kubernetes.io/projected/51978201-ae64-4c05-9e62-17e1eb5111d1-kube-api-access-q6z9l\") pod \"watcher-db-create-gzvqc\" (UID: \"51978201-ae64-4c05-9e62-17e1eb5111d1\") " pod="openstack/watcher-db-create-gzvqc" Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.445858 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-gzvqc"] Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.462895 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76f9c4c8bc-6hjqx"] Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.544860 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkpp8\" (UniqueName: \"kubernetes.io/projected/c538bd3d-6ead-4b75-a12e-327b70390f9c-kube-api-access-wkpp8\") pod \"dnsmasq-dns-76f9c4c8bc-6hjqx\" (UID: \"c538bd3d-6ead-4b75-a12e-327b70390f9c\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.544899 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c538bd3d-6ead-4b75-a12e-327b70390f9c-dns-svc\") pod \"dnsmasq-dns-76f9c4c8bc-6hjqx\" (UID: \"c538bd3d-6ead-4b75-a12e-327b70390f9c\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.544928 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c538bd3d-6ead-4b75-a12e-327b70390f9c-config\") pod \"dnsmasq-dns-76f9c4c8bc-6hjqx\" (UID: \"c538bd3d-6ead-4b75-a12e-327b70390f9c\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.544972 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6z9l\" (UniqueName: \"kubernetes.io/projected/51978201-ae64-4c05-9e62-17e1eb5111d1-kube-api-access-q6z9l\") pod \"watcher-db-create-gzvqc\" (UID: \"51978201-ae64-4c05-9e62-17e1eb5111d1\") " pod="openstack/watcher-db-create-gzvqc" Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.545054 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c538bd3d-6ead-4b75-a12e-327b70390f9c-ovsdbserver-nb\") pod \"dnsmasq-dns-76f9c4c8bc-6hjqx\" (UID: \"c538bd3d-6ead-4b75-a12e-327b70390f9c\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.545079 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c538bd3d-6ead-4b75-a12e-327b70390f9c-ovsdbserver-sb\") pod \"dnsmasq-dns-76f9c4c8bc-6hjqx\" (UID: \"c538bd3d-6ead-4b75-a12e-327b70390f9c\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.592760 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6z9l\" (UniqueName: \"kubernetes.io/projected/51978201-ae64-4c05-9e62-17e1eb5111d1-kube-api-access-q6z9l\") pod \"watcher-db-create-gzvqc\" (UID: 
\"51978201-ae64-4c05-9e62-17e1eb5111d1\") " pod="openstack/watcher-db-create-gzvqc" Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.646689 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c538bd3d-6ead-4b75-a12e-327b70390f9c-ovsdbserver-sb\") pod \"dnsmasq-dns-76f9c4c8bc-6hjqx\" (UID: \"c538bd3d-6ead-4b75-a12e-327b70390f9c\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.646801 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkpp8\" (UniqueName: \"kubernetes.io/projected/c538bd3d-6ead-4b75-a12e-327b70390f9c-kube-api-access-wkpp8\") pod \"dnsmasq-dns-76f9c4c8bc-6hjqx\" (UID: \"c538bd3d-6ead-4b75-a12e-327b70390f9c\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.646821 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c538bd3d-6ead-4b75-a12e-327b70390f9c-dns-svc\") pod \"dnsmasq-dns-76f9c4c8bc-6hjqx\" (UID: \"c538bd3d-6ead-4b75-a12e-327b70390f9c\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.646850 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c538bd3d-6ead-4b75-a12e-327b70390f9c-config\") pod \"dnsmasq-dns-76f9c4c8bc-6hjqx\" (UID: \"c538bd3d-6ead-4b75-a12e-327b70390f9c\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.646949 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c538bd3d-6ead-4b75-a12e-327b70390f9c-ovsdbserver-nb\") pod \"dnsmasq-dns-76f9c4c8bc-6hjqx\" (UID: \"c538bd3d-6ead-4b75-a12e-327b70390f9c\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.647863 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c538bd3d-6ead-4b75-a12e-327b70390f9c-ovsdbserver-nb\") pod \"dnsmasq-dns-76f9c4c8bc-6hjqx\" (UID: \"c538bd3d-6ead-4b75-a12e-327b70390f9c\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.648380 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c538bd3d-6ead-4b75-a12e-327b70390f9c-ovsdbserver-sb\") pod \"dnsmasq-dns-76f9c4c8bc-6hjqx\" (UID: \"c538bd3d-6ead-4b75-a12e-327b70390f9c\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.649168 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c538bd3d-6ead-4b75-a12e-327b70390f9c-dns-svc\") pod \"dnsmasq-dns-76f9c4c8bc-6hjqx\" (UID: \"c538bd3d-6ead-4b75-a12e-327b70390f9c\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.649653 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c538bd3d-6ead-4b75-a12e-327b70390f9c-config\") pod \"dnsmasq-dns-76f9c4c8bc-6hjqx\" (UID: \"c538bd3d-6ead-4b75-a12e-327b70390f9c\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 
05:57:25.666159 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkpp8\" (UniqueName: \"kubernetes.io/projected/c538bd3d-6ead-4b75-a12e-327b70390f9c-kube-api-access-wkpp8\") pod \"dnsmasq-dns-76f9c4c8bc-6hjqx\" (UID: \"c538bd3d-6ead-4b75-a12e-327b70390f9c\") " pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.770009 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-gzvqc" Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.789968 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.869328 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-s7sp9" Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.952665 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf4bn\" (UniqueName: \"kubernetes.io/projected/a8cd3366-214d-44fe-bef6-99a5522924c0-kube-api-access-nf4bn\") pod \"a8cd3366-214d-44fe-bef6-99a5522924c0\" (UID: \"a8cd3366-214d-44fe-bef6-99a5522924c0\") " Oct 12 05:57:25 crc kubenswrapper[4930]: I1012 05:57:25.959201 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8cd3366-214d-44fe-bef6-99a5522924c0-kube-api-access-nf4bn" (OuterVolumeSpecName: "kube-api-access-nf4bn") pod "a8cd3366-214d-44fe-bef6-99a5522924c0" (UID: "a8cd3366-214d-44fe-bef6-99a5522924c0"). InnerVolumeSpecName "kube-api-access-nf4bn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.054708 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nf4bn\" (UniqueName: \"kubernetes.io/projected/a8cd3366-214d-44fe-bef6-99a5522924c0-kube-api-access-nf4bn\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.261590 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-gzvqc"] Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.276907 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b4580a3d-faab-45c2-a7a6-ef2802549ef9","Type":"ContainerStarted","Data":"47a7995d6e2fe5147ce5ed4064606858c10af28b1143d3c4fc0e6309fbed2680"} Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.283279 4930 util.go:48] "No ready sandbox for pod can be found. 
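[annotation] Once keystone-db-create-s7sp9's only container exits 0 and the pod sandbox is gone, the reconciler reverses the mount: UnmountVolume started, TearDown succeeded, then "Volume detached" for the projected kube-api-access token. A hedged client-go sketch that lists such fully succeeded pods, i.e. the state that triggers this teardown; the kubeconfig path is an assumption:

```go
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Build a client from the default kubeconfig location (assumed).
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	// Pods in phase Succeeded are exactly the ones whose volumes the
	// kubelet unmounts and detaches, as in the log above.
	pods, err := cs.CoreV1().Pods("openstack").List(context.TODO(), metav1.ListOptions{
		FieldSelector: "status.phase=Succeeded",
	})
	if err != nil {
		panic(err)
	}
	for _, p := range pods.Items {
		fmt.Println(p.Name)
	}
}
```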
Need to start a new one" pod="openstack/keystone-db-create-s7sp9" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.283369 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-s7sp9" event={"ID":"a8cd3366-214d-44fe-bef6-99a5522924c0","Type":"ContainerDied","Data":"d316516a64c0340417125f2651d6770ec805fce0039bfdc1537b5af8180a7acd"} Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.283432 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d316516a64c0340417125f2651d6770ec805fce0039bfdc1537b5af8180a7acd" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.359691 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76f9c4c8bc-6hjqx"] Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.523791 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.596890 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 12 05:57:26 crc kubenswrapper[4930]: E1012 05:57:26.600797 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8cd3366-214d-44fe-bef6-99a5522924c0" containerName="mariadb-database-create" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.600829 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8cd3366-214d-44fe-bef6-99a5522924c0" containerName="mariadb-database-create" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.601121 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8cd3366-214d-44fe-bef6-99a5522924c0" containerName="mariadb-database-create" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.606425 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.608788 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.610245 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-btddf" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.610436 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.612038 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.615697 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.664313 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-cache\") pod \"swift-storage-0\" (UID: \"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3\") " pod="openstack/swift-storage-0" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.664435 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3\") " pod="openstack/swift-storage-0" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.664462 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-etc-swift\") pod \"swift-storage-0\" (UID: \"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3\") " pod="openstack/swift-storage-0" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.664569 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-lock\") pod \"swift-storage-0\" (UID: \"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3\") " pod="openstack/swift-storage-0" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.664620 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2fcc\" (UniqueName: \"kubernetes.io/projected/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-kube-api-access-n2fcc\") pod \"swift-storage-0\" (UID: \"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3\") " pod="openstack/swift-storage-0" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.749115 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-n2tb8" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.766944 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4c5\" (UniqueName: \"kubernetes.io/projected/065099a3-1832-4ac1-9654-080eee86c974-kube-api-access-7c4c5\") pod \"065099a3-1832-4ac1-9654-080eee86c974\" (UID: \"065099a3-1832-4ac1-9654-080eee86c974\") " Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.767350 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-lock\") pod \"swift-storage-0\" (UID: \"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3\") " pod="openstack/swift-storage-0" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.767407 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2fcc\" (UniqueName: \"kubernetes.io/projected/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-kube-api-access-n2fcc\") pod \"swift-storage-0\" (UID: \"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3\") " pod="openstack/swift-storage-0" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.767441 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-cache\") pod \"swift-storage-0\" (UID: \"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3\") " pod="openstack/swift-storage-0" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.767484 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3\") " pod="openstack/swift-storage-0" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.767505 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-etc-swift\") pod \"swift-storage-0\" (UID: \"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3\") " pod="openstack/swift-storage-0" Oct 12 05:57:26 crc kubenswrapper[4930]: E1012 05:57:26.768499 4930 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 12 05:57:26 crc kubenswrapper[4930]: E1012 05:57:26.768525 4930 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.768546 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-lock\") pod \"swift-storage-0\" (UID: \"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3\") " pod="openstack/swift-storage-0" Oct 12 05:57:26 crc kubenswrapper[4930]: E1012 05:57:26.768603 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-etc-swift podName:073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3 nodeName:}" failed. No retries permitted until 2025-10-12 05:57:27.268565486 +0000 UTC m=+979.810667251 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-etc-swift") pod "swift-storage-0" (UID: "073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3") : configmap "swift-ring-files" not found Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.768605 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-cache\") pod \"swift-storage-0\" (UID: \"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3\") " pod="openstack/swift-storage-0" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.768781 4930 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/swift-storage-0" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.771966 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/065099a3-1832-4ac1-9654-080eee86c974-kube-api-access-7c4c5" (OuterVolumeSpecName: "kube-api-access-7c4c5") pod "065099a3-1832-4ac1-9654-080eee86c974" (UID: "065099a3-1832-4ac1-9654-080eee86c974"). InnerVolumeSpecName "kube-api-access-7c4c5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.799699 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2fcc\" (UniqueName: \"kubernetes.io/projected/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-kube-api-access-n2fcc\") pod \"swift-storage-0\" (UID: \"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3\") " pod="openstack/swift-storage-0" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.823500 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3\") " pod="openstack/swift-storage-0" Oct 12 05:57:26 crc kubenswrapper[4930]: I1012 05:57:26.869588 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4c5\" (UniqueName: \"kubernetes.io/projected/065099a3-1832-4ac1-9654-080eee86c974-kube-api-access-7c4c5\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:27 crc kubenswrapper[4930]: I1012 05:57:27.274984 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-etc-swift\") pod \"swift-storage-0\" (UID: \"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3\") " pod="openstack/swift-storage-0" Oct 12 05:57:27 crc kubenswrapper[4930]: E1012 05:57:27.275191 4930 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 12 05:57:27 crc kubenswrapper[4930]: E1012 05:57:27.275426 4930 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 12 05:57:27 crc kubenswrapper[4930]: E1012 05:57:27.275515 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-etc-swift podName:073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3 nodeName:}" failed. No retries permitted until 2025-10-12 05:57:28.275481365 +0000 UTC m=+980.817583140 (durationBeforeRetry 1s). 
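[annotation] swift-storage-0's etc-swift is a projected volume sourcing the swift-ring-files ConfigMap, which does not exist yet (it is published later by the swift-ring-rebalance job), so SetUp fails and the pod cannot start. A sketch of such a volume in Go API types; everything beyond the volume and ConfigMap names taken from the log is an assumption:

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	// A projected volume whose only source is a ConfigMap: SetUp must
	// fail with `configmap "swift-ring-files" not found` until that
	// ConfigMap is created, exactly as the projected.go errors show.
	vol := corev1.Volume{
		Name: "etc-swift",
		VolumeSource: corev1.VolumeSource{
			Projected: &corev1.ProjectedVolumeSource{
				Sources: []corev1.VolumeProjection{{
					ConfigMap: &corev1.ConfigMapProjection{
						LocalObjectReference: corev1.LocalObjectReference{Name: "swift-ring-files"},
					},
				}},
			},
		},
	}
	fmt.Println(vol.VolumeSource.Projected.Sources[0].ConfigMap.Name)
}
```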
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-etc-swift") pod "swift-storage-0" (UID: "073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3") : configmap "swift-ring-files" not found Oct 12 05:57:27 crc kubenswrapper[4930]: I1012 05:57:27.296490 4930 generic.go:334] "Generic (PLEG): container finished" podID="51978201-ae64-4c05-9e62-17e1eb5111d1" containerID="762b85a5b5828a13bbc556e082c56c99ad851f798985561a4615c24df133c99a" exitCode=0 Oct 12 05:57:27 crc kubenswrapper[4930]: I1012 05:57:27.296583 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-gzvqc" event={"ID":"51978201-ae64-4c05-9e62-17e1eb5111d1","Type":"ContainerDied","Data":"762b85a5b5828a13bbc556e082c56c99ad851f798985561a4615c24df133c99a"} Oct 12 05:57:27 crc kubenswrapper[4930]: I1012 05:57:27.296615 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-gzvqc" event={"ID":"51978201-ae64-4c05-9e62-17e1eb5111d1","Type":"ContainerStarted","Data":"0eecbbdd27b3a90cfd20f6214610b0b68b4c7f2a07807fa8fdf407812373618e"} Oct 12 05:57:27 crc kubenswrapper[4930]: I1012 05:57:27.300175 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b4580a3d-faab-45c2-a7a6-ef2802549ef9","Type":"ContainerStarted","Data":"d4d16d5c9ddb20e5d507829a3d7652dc79944afde9a6d3697149414028ff90ca"} Oct 12 05:57:27 crc kubenswrapper[4930]: I1012 05:57:27.300224 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b4580a3d-faab-45c2-a7a6-ef2802549ef9","Type":"ContainerStarted","Data":"f63a70b255b6b8dba7e37eb22c111e67da27f52d72c7cfb4a7861af0b759b98b"} Oct 12 05:57:27 crc kubenswrapper[4930]: I1012 05:57:27.302887 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-n2tb8" Oct 12 05:57:27 crc kubenswrapper[4930]: I1012 05:57:27.303094 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-n2tb8" event={"ID":"065099a3-1832-4ac1-9654-080eee86c974","Type":"ContainerDied","Data":"f44a37c1c6d1e3b9d76f167a2e93c1980a89703851d4138de20c4bd6d9cf22e1"} Oct 12 05:57:27 crc kubenswrapper[4930]: I1012 05:57:27.303142 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f44a37c1c6d1e3b9d76f167a2e93c1980a89703851d4138de20c4bd6d9cf22e1" Oct 12 05:57:27 crc kubenswrapper[4930]: I1012 05:57:27.305499 4930 generic.go:334] "Generic (PLEG): container finished" podID="c538bd3d-6ead-4b75-a12e-327b70390f9c" containerID="36b664bc1818cc9c2606d64e8a4ce29fc00199ab10c440540f3682d59a9d0153" exitCode=0 Oct 12 05:57:27 crc kubenswrapper[4930]: I1012 05:57:27.305557 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" event={"ID":"c538bd3d-6ead-4b75-a12e-327b70390f9c","Type":"ContainerDied","Data":"36b664bc1818cc9c2606d64e8a4ce29fc00199ab10c440540f3682d59a9d0153"} Oct 12 05:57:27 crc kubenswrapper[4930]: I1012 05:57:27.305588 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" event={"ID":"c538bd3d-6ead-4b75-a12e-327b70390f9c","Type":"ContainerStarted","Data":"5796279102b5b07dfe44ffd6cc3f9770acebf998bfd8aae791f412e477c72acc"} Oct 12 05:57:27 crc kubenswrapper[4930]: I1012 05:57:27.355535 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.162886393 podStartE2EDuration="3.355503792s" podCreationTimestamp="2025-10-12 05:57:24 +0000 UTC" firstStartedPulling="2025-10-12 05:57:25.373571388 +0000 UTC m=+977.915673153" lastFinishedPulling="2025-10-12 05:57:26.566188777 +0000 UTC m=+979.108290552" observedRunningTime="2025-10-12 05:57:27.345827371 +0000 UTC m=+979.887929156" watchObservedRunningTime="2025-10-12 05:57:27.355503792 +0000 UTC m=+979.897605557" Oct 12 05:57:28 crc kubenswrapper[4930]: I1012 05:57:28.302783 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-etc-swift\") pod \"swift-storage-0\" (UID: \"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3\") " pod="openstack/swift-storage-0" Oct 12 05:57:28 crc kubenswrapper[4930]: E1012 05:57:28.302977 4930 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 12 05:57:28 crc kubenswrapper[4930]: E1012 05:57:28.303088 4930 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 12 05:57:28 crc kubenswrapper[4930]: E1012 05:57:28.303146 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-etc-swift podName:073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3 nodeName:}" failed. No retries permitted until 2025-10-12 05:57:30.303130538 +0000 UTC m=+982.845232303 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-etc-swift") pod "swift-storage-0" (UID: "073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3") : configmap "swift-ring-files" not found Oct 12 05:57:28 crc kubenswrapper[4930]: I1012 05:57:28.350666 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" event={"ID":"c538bd3d-6ead-4b75-a12e-327b70390f9c","Type":"ContainerStarted","Data":"f74502473d6f97afe0249a1d38c4e959958a619305d15a4309737b260c87bd64"} Oct 12 05:57:28 crc kubenswrapper[4930]: I1012 05:57:28.350707 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 12 05:57:28 crc kubenswrapper[4930]: I1012 05:57:28.350838 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" Oct 12 05:57:28 crc kubenswrapper[4930]: I1012 05:57:28.386333 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" podStartSLOduration=3.3863158540000002 podStartE2EDuration="3.386315854s" podCreationTimestamp="2025-10-12 05:57:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:57:28.381437832 +0000 UTC m=+980.923539597" watchObservedRunningTime="2025-10-12 05:57:28.386315854 +0000 UTC m=+980.928417619" Oct 12 05:57:28 crc kubenswrapper[4930]: I1012 05:57:28.728256 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-gzvqc" Oct 12 05:57:28 crc kubenswrapper[4930]: I1012 05:57:28.910838 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6z9l\" (UniqueName: \"kubernetes.io/projected/51978201-ae64-4c05-9e62-17e1eb5111d1-kube-api-access-q6z9l\") pod \"51978201-ae64-4c05-9e62-17e1eb5111d1\" (UID: \"51978201-ae64-4c05-9e62-17e1eb5111d1\") " Oct 12 05:57:28 crc kubenswrapper[4930]: I1012 05:57:28.917857 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51978201-ae64-4c05-9e62-17e1eb5111d1-kube-api-access-q6z9l" (OuterVolumeSpecName: "kube-api-access-q6z9l") pod "51978201-ae64-4c05-9e62-17e1eb5111d1" (UID: "51978201-ae64-4c05-9e62-17e1eb5111d1"). InnerVolumeSpecName "kube-api-access-q6z9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:57:29 crc kubenswrapper[4930]: I1012 05:57:29.013208 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6z9l\" (UniqueName: \"kubernetes.io/projected/51978201-ae64-4c05-9e62-17e1eb5111d1-kube-api-access-q6z9l\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:29 crc kubenswrapper[4930]: I1012 05:57:29.374613 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-gzvqc" event={"ID":"51978201-ae64-4c05-9e62-17e1eb5111d1","Type":"ContainerDied","Data":"0eecbbdd27b3a90cfd20f6214610b0b68b4c7f2a07807fa8fdf407812373618e"} Oct 12 05:57:29 crc kubenswrapper[4930]: I1012 05:57:29.374683 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0eecbbdd27b3a90cfd20f6214610b0b68b4c7f2a07807fa8fdf407812373618e" Oct 12 05:57:29 crc kubenswrapper[4930]: I1012 05:57:29.374930 4930 util.go:48] "No ready sandbox for pod can be found. 
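[annotation] The pod_startup_latency_tracker lines encode two durations: podStartE2EDuration is observed-running time minus pod creation, and podStartSLOduration subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling). That is why dnsmasq-dns-76f9c4c8bc-6hjqx, whose pull timestamps are the zero time (images already local), reports SLO equal to E2E. Redoing the ovn-northd-0 arithmetic, which agrees with the logged 2.162886393s to within tens of nanoseconds:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the ovn-northd-0 latency-tracker entry.
	created := time.Date(2025, 10, 12, 5, 57, 24, 0, time.UTC)
	firstPull := time.Date(2025, 10, 12, 5, 57, 25, 373571388, time.UTC)
	lastPull := time.Date(2025, 10, 12, 5, 57, 26, 566188777, time.UTC)
	observed := time.Date(2025, 10, 12, 5, 57, 27, 355503792, time.UTC)

	e2e := observed.Sub(created)        // full startup time: 3.355503792s
	slo := e2e - lastPull.Sub(firstPull) // excludes image pulling: ~2.1628864s
	fmt.Println(e2e, slo)
}
```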
Need to start a new one" pod="openstack/watcher-db-create-gzvqc" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.340081 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-etc-swift\") pod \"swift-storage-0\" (UID: \"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3\") " pod="openstack/swift-storage-0" Oct 12 05:57:30 crc kubenswrapper[4930]: E1012 05:57:30.340680 4930 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 12 05:57:30 crc kubenswrapper[4930]: E1012 05:57:30.340708 4930 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 12 05:57:30 crc kubenswrapper[4930]: E1012 05:57:30.340802 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-etc-swift podName:073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3 nodeName:}" failed. No retries permitted until 2025-10-12 05:57:34.340778993 +0000 UTC m=+986.882880798 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-etc-swift") pod "swift-storage-0" (UID: "073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3") : configmap "swift-ring-files" not found Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.444482 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-xldxb"] Oct 12 05:57:30 crc kubenswrapper[4930]: E1012 05:57:30.444925 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51978201-ae64-4c05-9e62-17e1eb5111d1" containerName="mariadb-database-create" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.444940 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="51978201-ae64-4c05-9e62-17e1eb5111d1" containerName="mariadb-database-create" Oct 12 05:57:30 crc kubenswrapper[4930]: E1012 05:57:30.444981 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="065099a3-1832-4ac1-9654-080eee86c974" containerName="mariadb-database-create" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.444989 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="065099a3-1832-4ac1-9654-080eee86c974" containerName="mariadb-database-create" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.445203 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="065099a3-1832-4ac1-9654-080eee86c974" containerName="mariadb-database-create" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.445251 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="51978201-ae64-4c05-9e62-17e1eb5111d1" containerName="mariadb-database-create" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.446048 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-xldxb" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.454499 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.454533 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.455459 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.455613 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-xldxb"] Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.646864 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/df839ac4-27ff-436b-b328-55b948887fce-dispersionconf\") pod \"swift-ring-rebalance-xldxb\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " pod="openstack/swift-ring-rebalance-xldxb" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.647499 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/df839ac4-27ff-436b-b328-55b948887fce-ring-data-devices\") pod \"swift-ring-rebalance-xldxb\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " pod="openstack/swift-ring-rebalance-xldxb" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.647593 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df839ac4-27ff-436b-b328-55b948887fce-scripts\") pod \"swift-ring-rebalance-xldxb\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " pod="openstack/swift-ring-rebalance-xldxb" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.647679 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df839ac4-27ff-436b-b328-55b948887fce-combined-ca-bundle\") pod \"swift-ring-rebalance-xldxb\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " pod="openstack/swift-ring-rebalance-xldxb" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.647807 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/df839ac4-27ff-436b-b328-55b948887fce-swiftconf\") pod \"swift-ring-rebalance-xldxb\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " pod="openstack/swift-ring-rebalance-xldxb" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.647848 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/df839ac4-27ff-436b-b328-55b948887fce-etc-swift\") pod \"swift-ring-rebalance-xldxb\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " pod="openstack/swift-ring-rebalance-xldxb" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.647921 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkwcc\" (UniqueName: \"kubernetes.io/projected/df839ac4-27ff-436b-b328-55b948887fce-kube-api-access-nkwcc\") pod \"swift-ring-rebalance-xldxb\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " pod="openstack/swift-ring-rebalance-xldxb" Oct 12 
05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.749411 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/df839ac4-27ff-436b-b328-55b948887fce-swiftconf\") pod \"swift-ring-rebalance-xldxb\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " pod="openstack/swift-ring-rebalance-xldxb" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.749468 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/df839ac4-27ff-436b-b328-55b948887fce-etc-swift\") pod \"swift-ring-rebalance-xldxb\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " pod="openstack/swift-ring-rebalance-xldxb" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.749511 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkwcc\" (UniqueName: \"kubernetes.io/projected/df839ac4-27ff-436b-b328-55b948887fce-kube-api-access-nkwcc\") pod \"swift-ring-rebalance-xldxb\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " pod="openstack/swift-ring-rebalance-xldxb" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.749607 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/df839ac4-27ff-436b-b328-55b948887fce-dispersionconf\") pod \"swift-ring-rebalance-xldxb\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " pod="openstack/swift-ring-rebalance-xldxb" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.749666 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/df839ac4-27ff-436b-b328-55b948887fce-ring-data-devices\") pod \"swift-ring-rebalance-xldxb\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " pod="openstack/swift-ring-rebalance-xldxb" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.749696 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df839ac4-27ff-436b-b328-55b948887fce-scripts\") pod \"swift-ring-rebalance-xldxb\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " pod="openstack/swift-ring-rebalance-xldxb" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.749771 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df839ac4-27ff-436b-b328-55b948887fce-combined-ca-bundle\") pod \"swift-ring-rebalance-xldxb\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " pod="openstack/swift-ring-rebalance-xldxb" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.750785 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/df839ac4-27ff-436b-b328-55b948887fce-etc-swift\") pod \"swift-ring-rebalance-xldxb\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " pod="openstack/swift-ring-rebalance-xldxb" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.751154 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/df839ac4-27ff-436b-b328-55b948887fce-ring-data-devices\") pod \"swift-ring-rebalance-xldxb\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " pod="openstack/swift-ring-rebalance-xldxb" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.751612 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df839ac4-27ff-436b-b328-55b948887fce-scripts\") pod \"swift-ring-rebalance-xldxb\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " pod="openstack/swift-ring-rebalance-xldxb" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.755368 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/df839ac4-27ff-436b-b328-55b948887fce-swiftconf\") pod \"swift-ring-rebalance-xldxb\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " pod="openstack/swift-ring-rebalance-xldxb" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.756575 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/df839ac4-27ff-436b-b328-55b948887fce-dispersionconf\") pod \"swift-ring-rebalance-xldxb\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " pod="openstack/swift-ring-rebalance-xldxb" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.756633 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df839ac4-27ff-436b-b328-55b948887fce-combined-ca-bundle\") pod \"swift-ring-rebalance-xldxb\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " pod="openstack/swift-ring-rebalance-xldxb" Oct 12 05:57:30 crc kubenswrapper[4930]: I1012 05:57:30.782159 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkwcc\" (UniqueName: \"kubernetes.io/projected/df839ac4-27ff-436b-b328-55b948887fce-kube-api-access-nkwcc\") pod \"swift-ring-rebalance-xldxb\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " pod="openstack/swift-ring-rebalance-xldxb" Oct 12 05:57:31 crc kubenswrapper[4930]: I1012 05:57:31.066329 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-xldxb" Oct 12 05:57:31 crc kubenswrapper[4930]: I1012 05:57:31.525423 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 12 05:57:31 crc kubenswrapper[4930]: I1012 05:57:31.529297 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-xldxb"] Oct 12 05:57:31 crc kubenswrapper[4930]: I1012 05:57:31.532161 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 12 05:57:32 crc kubenswrapper[4930]: I1012 05:57:32.433362 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xldxb" event={"ID":"df839ac4-27ff-436b-b328-55b948887fce","Type":"ContainerStarted","Data":"36e54e155194a03e21d2e57acce27019e3980e2b546c616cb951fa702c63f9f6"} Oct 12 05:57:32 crc kubenswrapper[4930]: I1012 05:57:32.435691 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 12 05:57:32 crc kubenswrapper[4930]: I1012 05:57:32.969289 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b403-account-create-lzlnf"] Oct 12 05:57:32 crc kubenswrapper[4930]: I1012 05:57:32.970674 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b403-account-create-lzlnf" Oct 12 05:57:32 crc kubenswrapper[4930]: I1012 05:57:32.973529 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 12 05:57:32 crc kubenswrapper[4930]: I1012 05:57:32.986667 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b403-account-create-lzlnf"] Oct 12 05:57:33 crc kubenswrapper[4930]: I1012 05:57:33.116352 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fnx9\" (UniqueName: \"kubernetes.io/projected/bd2f9a6f-cb9e-4db3-8165-e2cc18f1daff-kube-api-access-9fnx9\") pod \"keystone-b403-account-create-lzlnf\" (UID: \"bd2f9a6f-cb9e-4db3-8165-e2cc18f1daff\") " pod="openstack/keystone-b403-account-create-lzlnf" Oct 12 05:57:33 crc kubenswrapper[4930]: I1012 05:57:33.223298 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fnx9\" (UniqueName: \"kubernetes.io/projected/bd2f9a6f-cb9e-4db3-8165-e2cc18f1daff-kube-api-access-9fnx9\") pod \"keystone-b403-account-create-lzlnf\" (UID: \"bd2f9a6f-cb9e-4db3-8165-e2cc18f1daff\") " pod="openstack/keystone-b403-account-create-lzlnf" Oct 12 05:57:33 crc kubenswrapper[4930]: I1012 05:57:33.251028 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fnx9\" (UniqueName: \"kubernetes.io/projected/bd2f9a6f-cb9e-4db3-8165-e2cc18f1daff-kube-api-access-9fnx9\") pod \"keystone-b403-account-create-lzlnf\" (UID: \"bd2f9a6f-cb9e-4db3-8165-e2cc18f1daff\") " pod="openstack/keystone-b403-account-create-lzlnf" Oct 12 05:57:33 crc kubenswrapper[4930]: I1012 05:57:33.289133 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b403-account-create-lzlnf" Oct 12 05:57:33 crc kubenswrapper[4930]: I1012 05:57:33.352745 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-2133-account-create-h59cp"] Oct 12 05:57:33 crc kubenswrapper[4930]: I1012 05:57:33.354288 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-2133-account-create-h59cp" Oct 12 05:57:33 crc kubenswrapper[4930]: I1012 05:57:33.359317 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 12 05:57:33 crc kubenswrapper[4930]: I1012 05:57:33.360875 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2133-account-create-h59cp"] Oct 12 05:57:33 crc kubenswrapper[4930]: I1012 05:57:33.427563 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9sxx\" (UniqueName: \"kubernetes.io/projected/f603981a-3581-49ba-8726-f21e981d4988-kube-api-access-d9sxx\") pod \"placement-2133-account-create-h59cp\" (UID: \"f603981a-3581-49ba-8726-f21e981d4988\") " pod="openstack/placement-2133-account-create-h59cp" Oct 12 05:57:33 crc kubenswrapper[4930]: I1012 05:57:33.530008 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9sxx\" (UniqueName: \"kubernetes.io/projected/f603981a-3581-49ba-8726-f21e981d4988-kube-api-access-d9sxx\") pod \"placement-2133-account-create-h59cp\" (UID: \"f603981a-3581-49ba-8726-f21e981d4988\") " pod="openstack/placement-2133-account-create-h59cp" Oct 12 05:57:33 crc kubenswrapper[4930]: I1012 05:57:33.547056 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9sxx\" (UniqueName: \"kubernetes.io/projected/f603981a-3581-49ba-8726-f21e981d4988-kube-api-access-d9sxx\") pod \"placement-2133-account-create-h59cp\" (UID: \"f603981a-3581-49ba-8726-f21e981d4988\") " pod="openstack/placement-2133-account-create-h59cp" Oct 12 05:57:33 crc kubenswrapper[4930]: I1012 05:57:33.669688 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 05:57:33 crc kubenswrapper[4930]: I1012 05:57:33.669833 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 05:57:33 crc kubenswrapper[4930]: I1012 05:57:33.669900 4930 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 05:57:33 crc kubenswrapper[4930]: I1012 05:57:33.670860 4930 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"86436bfc7a3b225084b0677c9406a12b80ec7ace76e26bb6e4b679b1e7256578"} pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 05:57:33 crc kubenswrapper[4930]: I1012 05:57:33.670958 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" containerID="cri-o://86436bfc7a3b225084b0677c9406a12b80ec7ace76e26bb6e4b679b1e7256578" gracePeriod=600 Oct 12 05:57:33 crc kubenswrapper[4930]: I1012 05:57:33.692325 4930 util.go:30] "No sandbox for pod 
Oct 12 05:57:33 crc kubenswrapper[4930]: I1012 05:57:33.692325 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2133-account-create-h59cp"
Oct 12 05:57:34 crc kubenswrapper[4930]: I1012 05:57:34.347381 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-etc-swift\") pod \"swift-storage-0\" (UID: \"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3\") " pod="openstack/swift-storage-0"
Oct 12 05:57:34 crc kubenswrapper[4930]: E1012 05:57:34.347711 4930 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 12 05:57:34 crc kubenswrapper[4930]: E1012 05:57:34.347987 4930 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 12 05:57:34 crc kubenswrapper[4930]: E1012 05:57:34.348070 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-etc-swift podName:073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3 nodeName:}" failed. No retries permitted until 2025-10-12 05:57:42.348045066 +0000 UTC m=+994.890146871 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-etc-swift") pod "swift-storage-0" (UID: "073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3") : configmap "swift-ring-files" not found
Oct 12 05:57:34 crc kubenswrapper[4930]: I1012 05:57:34.461780 4930 generic.go:334] "Generic (PLEG): container finished" podID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerID="86436bfc7a3b225084b0677c9406a12b80ec7ace76e26bb6e4b679b1e7256578" exitCode=0
Oct 12 05:57:34 crc kubenswrapper[4930]: I1012 05:57:34.461847 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerDied","Data":"86436bfc7a3b225084b0677c9406a12b80ec7ace76e26bb6e4b679b1e7256578"}
Oct 12 05:57:34 crc kubenswrapper[4930]: I1012 05:57:34.461904 4930 scope.go:117] "RemoveContainer" containerID="8c4b30d4b3900fd24d77d876af5c72fe4eab21e88f5e73b0f78ab790ead4cf26"
Oct 12 05:57:35 crc kubenswrapper[4930]: I1012 05:57:35.433801 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Oct 12 05:57:35 crc kubenswrapper[4930]: I1012 05:57:35.434920 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" containerName="config-reloader" containerID="cri-o://d411ed8b138f62ed0e0dca34c8730b2c8aff7bd99556f08a347d77ea720596dd" gracePeriod=600
Oct 12 05:57:35 crc kubenswrapper[4930]: I1012 05:57:35.435036 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" containerName="prometheus" containerID="cri-o://1a5c179b00bc7ad1bf2c025dadbf9a3b667bb42a416c99d5cad006fe189debea" gracePeriod=600
Oct 12 05:57:35 crc kubenswrapper[4930]: I1012 05:57:35.435077 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" containerName="thanos-sidecar" containerID="cri-o://7211811e0ae74c9852f66da818742cd8b77f9649efedc77809964b20d1e9d3c1" gracePeriod=600
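The etc-swift failure above is a projected volume waiting on a ConfigMap that does not exist yet, and the kubelet retries the mount with exponential backoff: durationBeforeRetry is 8s here and 16s on the next attempt at 05:57:42 (end of this section). A minimal sketch of that doubling policy, with the initial delay and cap chosen here purely for illustration (this is not the kubelet's nestedpendingoperations code):

package main

import (
	"fmt"
	"time"
)

// backoff sketches per-operation exponential backoff: the delay doubles
// after every failed attempt, up to a cap.
type backoff struct {
	delay, max time.Duration
}

func (b *backoff) next() time.Duration {
	d := b.delay
	b.delay *= 2 // 8s, 16s, 32s, ... as the repeated mount failures show
	if b.delay > b.max {
		b.delay = b.max
	}
	return d
}

func main() {
	b := &backoff{delay: 8 * time.Second, max: 2 * time.Minute}
	for i := 0; i < 4; i++ {
		fmt.Printf("no retries permitted for %v\n", b.next())
	}
}

The swift-ring-files ConfigMap is presumably published by the swift-ring-rebalance-xldxb job that starts later in this log, at which point the retry loop clears on its own.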
Oct 12 05:57:35 crc kubenswrapper[4930]: I1012 05:57:35.486460 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-8e21-account-create-645jw"]
Oct 12 05:57:35 crc kubenswrapper[4930]: I1012 05:57:35.492552 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-8e21-account-create-645jw"
Oct 12 05:57:35 crc kubenswrapper[4930]: I1012 05:57:35.495497 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret"
Oct 12 05:57:35 crc kubenswrapper[4930]: I1012 05:57:35.508053 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-8e21-account-create-645jw"]
Oct 12 05:57:35 crc kubenswrapper[4930]: I1012 05:57:35.677681 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlkzx\" (UniqueName: \"kubernetes.io/projected/31682f1f-3cd9-4ad2-a3ed-46a6c686c0f3-kube-api-access-nlkzx\") pod \"watcher-8e21-account-create-645jw\" (UID: \"31682f1f-3cd9-4ad2-a3ed-46a6c686c0f3\") " pod="openstack/watcher-8e21-account-create-645jw"
Oct 12 05:57:35 crc kubenswrapper[4930]: I1012 05:57:35.779199 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlkzx\" (UniqueName: \"kubernetes.io/projected/31682f1f-3cd9-4ad2-a3ed-46a6c686c0f3-kube-api-access-nlkzx\") pod \"watcher-8e21-account-create-645jw\" (UID: \"31682f1f-3cd9-4ad2-a3ed-46a6c686c0f3\") " pod="openstack/watcher-8e21-account-create-645jw"
Oct 12 05:57:35 crc kubenswrapper[4930]: I1012 05:57:35.791630 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx"
Oct 12 05:57:35 crc kubenswrapper[4930]: I1012 05:57:35.803510 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlkzx\" (UniqueName: \"kubernetes.io/projected/31682f1f-3cd9-4ad2-a3ed-46a6c686c0f3-kube-api-access-nlkzx\") pod \"watcher-8e21-account-create-645jw\" (UID: \"31682f1f-3cd9-4ad2-a3ed-46a6c686c0f3\") " pod="openstack/watcher-8e21-account-create-645jw"
Oct 12 05:57:35 crc kubenswrapper[4930]: I1012 05:57:35.852376 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-8e21-account-create-645jw"
Oct 12 05:57:35 crc kubenswrapper[4930]: I1012 05:57:35.876887 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dbf544cc9-5qcnc"]
Oct 12 05:57:35 crc kubenswrapper[4930]: I1012 05:57:35.877166 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" podUID="cb57aef8-53ef-4894-8cfc-1ef708aedd9b" containerName="dnsmasq-dns" containerID="cri-o://3b9bbb5e4f685880b9051a8d13dd6db431ac050309db653872f1d0aa3e037de9" gracePeriod=10
Oct 12 05:57:36 crc kubenswrapper[4930]: I1012 05:57:36.482582 4930 generic.go:334] "Generic (PLEG): container finished" podID="80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" containerID="1a5c179b00bc7ad1bf2c025dadbf9a3b667bb42a416c99d5cad006fe189debea" exitCode=0
Oct 12 05:57:36 crc kubenswrapper[4930]: I1012 05:57:36.482615 4930 generic.go:334] "Generic (PLEG): container finished" podID="80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" containerID="7211811e0ae74c9852f66da818742cd8b77f9649efedc77809964b20d1e9d3c1" exitCode=0
Oct 12 05:57:36 crc kubenswrapper[4930]: I1012 05:57:36.482623 4930 generic.go:334] "Generic (PLEG): container finished" podID="80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" containerID="d411ed8b138f62ed0e0dca34c8730b2c8aff7bd99556f08a347d77ea720596dd" exitCode=0
Oct 12 05:57:36 crc kubenswrapper[4930]: I1012 05:57:36.482663 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf","Type":"ContainerDied","Data":"1a5c179b00bc7ad1bf2c025dadbf9a3b667bb42a416c99d5cad006fe189debea"}
Oct 12 05:57:36 crc kubenswrapper[4930]: I1012 05:57:36.482688 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf","Type":"ContainerDied","Data":"7211811e0ae74c9852f66da818742cd8b77f9649efedc77809964b20d1e9d3c1"}
Oct 12 05:57:36 crc kubenswrapper[4930]: I1012 05:57:36.482697 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf","Type":"ContainerDied","Data":"d411ed8b138f62ed0e0dca34c8730b2c8aff7bd99556f08a347d77ea720596dd"}
Oct 12 05:57:36 crc kubenswrapper[4930]: I1012 05:57:36.484309 4930 generic.go:334] "Generic (PLEG): container finished" podID="fe886795-2501-4474-bfbc-9febcc5113f3" containerID="65c472105bc514ff6ba9f1922a44a2d105ac01e044a675fd9c167f35ca166f27" exitCode=0
Oct 12 05:57:36 crc kubenswrapper[4930]: I1012 05:57:36.484351 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fe886795-2501-4474-bfbc-9febcc5113f3","Type":"ContainerDied","Data":"65c472105bc514ff6ba9f1922a44a2d105ac01e044a675fd9c167f35ca166f27"}
Oct 12 05:57:36 crc kubenswrapper[4930]: I1012 05:57:36.489212 4930 generic.go:334] "Generic (PLEG): container finished" podID="cb57aef8-53ef-4894-8cfc-1ef708aedd9b" containerID="3b9bbb5e4f685880b9051a8d13dd6db431ac050309db653872f1d0aa3e037de9" exitCode=0
Oct 12 05:57:36 crc kubenswrapper[4930]: I1012 05:57:36.489259 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" event={"ID":"cb57aef8-53ef-4894-8cfc-1ef708aedd9b","Type":"ContainerDied","Data":"3b9bbb5e4f685880b9051a8d13dd6db431ac050309db653872f1d0aa3e037de9"}
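Grace periods differ by pod above: the Prometheus containers were killed with gracePeriod=600 while dnsmasq-dns got gracePeriod=10, each typically taken from its pod spec's terminationGracePeriodSeconds; the exitCode=0 records show the containers exiting within their grace windows. The runtime contract is SIGTERM first, SIGKILL on expiry. A minimal sketch for a plain OS process, illustrative only and not CRI-O's implementation:

package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

// killWithGrace sends SIGTERM, waits up to gracePeriod for the process to
// exit, and escalates to SIGKILL if the grace period expires.
func killWithGrace(cmd *exec.Cmd, gracePeriod time.Duration) error {
	if err := cmd.Process.Signal(syscall.SIGTERM); err != nil {
		return err
	}
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()
	select {
	case err := <-done:
		return err // exited within the grace period, as in the log
	case <-time.After(gracePeriod):
		return cmd.Process.Kill() // grace period expired: SIGKILL
	}
}

func main() {
	cmd := exec.Command("sleep", "60")
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	fmt.Println(killWithGrace(cmd, 10*time.Second)) // cf. gracePeriod=10
}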
pod="openstack/prometheus-metric-storage-0" podUID="80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.115:9090/-/ready\": dial tcp 10.217.0.115:9090: connect: connection refused" Oct 12 05:57:36 crc kubenswrapper[4930]: I1012 05:57:36.809182 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" podUID="cb57aef8-53ef-4894-8cfc-1ef708aedd9b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.122:5353: connect: connection refused" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.476849 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.531755 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" event={"ID":"cb57aef8-53ef-4894-8cfc-1ef708aedd9b","Type":"ContainerDied","Data":"397bfe9ba29e11a01d9a2fb68299d0e39ac4c1c46b39490e2e99799f0c74e5aa"} Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.531805 4930 scope.go:117] "RemoveContainer" containerID="3b9bbb5e4f685880b9051a8d13dd6db431ac050309db653872f1d0aa3e037de9" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.531825 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dbf544cc9-5qcnc" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.544683 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerStarted","Data":"c01e0f78e06c76804a67ffb0c83af238f0fea06c1b96b28458581d18668d1cf0"} Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.547001 4930 generic.go:334] "Generic (PLEG): container finished" podID="1594846a-5c2f-49f8-9bea-22661720c5a6" containerID="2c4c847aa56622d93cb41315862b13e7089f36b84899c67e01fbcca66b8a6997" exitCode=0 Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.547042 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"1594846a-5c2f-49f8-9bea-22661720c5a6","Type":"ContainerDied","Data":"2c4c847aa56622d93cb41315862b13e7089f36b84899c67e01fbcca66b8a6997"} Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.548650 4930 generic.go:334] "Generic (PLEG): container finished" podID="bad3587e-d515-4add-9edd-da341fe519b7" containerID="600c87b339b6336ad0bdb7f69f659fd2b7a9dc3ae9650c061969bc4d027d0a09" exitCode=0 Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.548701 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bad3587e-d515-4add-9edd-da341fe519b7","Type":"ContainerDied","Data":"600c87b339b6336ad0bdb7f69f659fd2b7a9dc3ae9650c061969bc4d027d0a09"} Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.573289 4930 scope.go:117] "RemoveContainer" containerID="e0e1140a5cba334180f40ebc8f97714083b477e5eed97b3ed477c0ddf80a4da3" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.630193 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-ovsdbserver-nb\") pod \"cb57aef8-53ef-4894-8cfc-1ef708aedd9b\" (UID: \"cb57aef8-53ef-4894-8cfc-1ef708aedd9b\") " Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.630494 4930 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-config\") pod \"cb57aef8-53ef-4894-8cfc-1ef708aedd9b\" (UID: \"cb57aef8-53ef-4894-8cfc-1ef708aedd9b\") " Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.630548 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g9pb\" (UniqueName: \"kubernetes.io/projected/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-kube-api-access-4g9pb\") pod \"cb57aef8-53ef-4894-8cfc-1ef708aedd9b\" (UID: \"cb57aef8-53ef-4894-8cfc-1ef708aedd9b\") " Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.630616 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-ovsdbserver-sb\") pod \"cb57aef8-53ef-4894-8cfc-1ef708aedd9b\" (UID: \"cb57aef8-53ef-4894-8cfc-1ef708aedd9b\") " Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.630637 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-dns-svc\") pod \"cb57aef8-53ef-4894-8cfc-1ef708aedd9b\" (UID: \"cb57aef8-53ef-4894-8cfc-1ef708aedd9b\") " Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.645842 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-kube-api-access-4g9pb" (OuterVolumeSpecName: "kube-api-access-4g9pb") pod "cb57aef8-53ef-4894-8cfc-1ef708aedd9b" (UID: "cb57aef8-53ef-4894-8cfc-1ef708aedd9b"). InnerVolumeSpecName "kube-api-access-4g9pb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.677779 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cb57aef8-53ef-4894-8cfc-1ef708aedd9b" (UID: "cb57aef8-53ef-4894-8cfc-1ef708aedd9b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.680753 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.682220 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-config" (OuterVolumeSpecName: "config") pod "cb57aef8-53ef-4894-8cfc-1ef708aedd9b" (UID: "cb57aef8-53ef-4894-8cfc-1ef708aedd9b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.693603 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cb57aef8-53ef-4894-8cfc-1ef708aedd9b" (UID: "cb57aef8-53ef-4894-8cfc-1ef708aedd9b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.713018 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cb57aef8-53ef-4894-8cfc-1ef708aedd9b" (UID: "cb57aef8-53ef-4894-8cfc-1ef708aedd9b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.732690 4930 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.732725 4930 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.732747 4930 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.732756 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.732766 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g9pb\" (UniqueName: \"kubernetes.io/projected/cb57aef8-53ef-4894-8cfc-1ef708aedd9b-kube-api-access-4g9pb\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.836351 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-prometheus-metric-storage-rulefiles-0\") pod \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.836468 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-tls-assets\") pod \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.836492 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-config-out\") pod \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.836548 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg2n4\" (UniqueName: \"kubernetes.io/projected/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-kube-api-access-qg2n4\") pod \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.836675 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22a7639f-0965-4497-b4ff-8d976229c443\") pod \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.836721 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-thanos-prometheus-http-client-file\") pod \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.836783 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-web-config\") pod \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.836812 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-config\") pod \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\" (UID: \"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf\") " Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.843598 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-config" (OuterVolumeSpecName: "config") pod "80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" (UID: "80c83b5b-7221-4d68-b4e1-b8c622bfa7cf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.844879 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" (UID: "80c83b5b-7221-4d68-b4e1-b8c622bfa7cf"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.847851 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-kube-api-access-qg2n4" (OuterVolumeSpecName: "kube-api-access-qg2n4") pod "80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" (UID: "80c83b5b-7221-4d68-b4e1-b8c622bfa7cf"). InnerVolumeSpecName "kube-api-access-qg2n4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.852461 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-config-out" (OuterVolumeSpecName: "config-out") pod "80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" (UID: "80c83b5b-7221-4d68-b4e1-b8c622bfa7cf"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.856815 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" (UID: "80c83b5b-7221-4d68-b4e1-b8c622bfa7cf"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.859876 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" (UID: "80c83b5b-7221-4d68-b4e1-b8c622bfa7cf"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.896794 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-web-config" (OuterVolumeSpecName: "web-config") pod "80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" (UID: "80c83b5b-7221-4d68-b4e1-b8c622bfa7cf"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.932806 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-8e21-account-create-645jw"] Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.935864 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22a7639f-0965-4497-b4ff-8d976229c443" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" (UID: "80c83b5b-7221-4d68-b4e1-b8c622bfa7cf"). InnerVolumeSpecName "pvc-22a7639f-0965-4497-b4ff-8d976229c443". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.938186 4930 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-22a7639f-0965-4497-b4ff-8d976229c443\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22a7639f-0965-4497-b4ff-8d976229c443\") on node \"crc\" " Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.938216 4930 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.938226 4930 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-web-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.938235 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.938245 4930 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.938254 4930 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.938261 4930 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-config-out\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.938270 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg2n4\" (UniqueName: \"kubernetes.io/projected/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf-kube-api-access-qg2n4\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.943321 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dbf544cc9-5qcnc"] Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.950916 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dbf544cc9-5qcnc"] Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.969273 4930 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 12 05:57:37 crc kubenswrapper[4930]: I1012 05:57:37.969425 4930 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-22a7639f-0965-4497-b4ff-8d976229c443" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22a7639f-0965-4497-b4ff-8d976229c443") on node "crc" Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.039727 4930 reconciler_common.go:293] "Volume detached for volume \"pvc-22a7639f-0965-4497-b4ff-8d976229c443\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22a7639f-0965-4497-b4ff-8d976229c443\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.096796 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b403-account-create-lzlnf"] Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.133652 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2133-account-create-h59cp"] Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.156007 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb57aef8-53ef-4894-8cfc-1ef708aedd9b" path="/var/lib/kubelet/pods/cb57aef8-53ef-4894-8cfc-1ef708aedd9b/volumes" Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.578668 4930 generic.go:334] "Generic (PLEG): container finished" podID="31682f1f-3cd9-4ad2-a3ed-46a6c686c0f3" containerID="d6460b3debccdb12a4bb1f2ef621d96eb151d8b96d3a917641eedb0887a754da" exitCode=0 Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.578724 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-8e21-account-create-645jw" event={"ID":"31682f1f-3cd9-4ad2-a3ed-46a6c686c0f3","Type":"ContainerDied","Data":"d6460b3debccdb12a4bb1f2ef621d96eb151d8b96d3a917641eedb0887a754da"} Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.578772 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-8e21-account-create-645jw" event={"ID":"31682f1f-3cd9-4ad2-a3ed-46a6c686c0f3","Type":"ContainerStarted","Data":"87c5c41776df4d01c173104930bf83396a62b1f51420897f0d00f9f4a86aa33b"} Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.582319 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bad3587e-d515-4add-9edd-da341fe519b7","Type":"ContainerStarted","Data":"62f020028b6b17a42630094b79d284a4e9e7103dc1dc7e65f9313baef3502d4e"} Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.582879 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.584833 4930 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fe886795-2501-4474-bfbc-9febcc5113f3","Type":"ContainerStarted","Data":"16242491c9076429f59d7ec96b240e6798e421997f2bcf00ecca026f566f6d44"} Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.585035 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.588168 4930 generic.go:334] "Generic (PLEG): container finished" podID="bd2f9a6f-cb9e-4db3-8165-e2cc18f1daff" containerID="600c90f643ed8c7dc8a88e357c9f9c68e185e9a62a849ede53cd466b7b86d776" exitCode=0 Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.588230 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b403-account-create-lzlnf" event={"ID":"bd2f9a6f-cb9e-4db3-8165-e2cc18f1daff","Type":"ContainerDied","Data":"600c90f643ed8c7dc8a88e357c9f9c68e185e9a62a849ede53cd466b7b86d776"} Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.588245 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b403-account-create-lzlnf" event={"ID":"bd2f9a6f-cb9e-4db3-8165-e2cc18f1daff","Type":"ContainerStarted","Data":"b55e16316b1def7216a42ee29e9884253945d8aeef2fff2354c30b29976b1254"} Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.597128 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xldxb" event={"ID":"df839ac4-27ff-436b-b328-55b948887fce","Type":"ContainerStarted","Data":"0ffac965cf81d1a0b2c3ca4aaecb28935c5fafdc28b4816f7e48804a22d68a46"} Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.600271 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"1594846a-5c2f-49f8-9bea-22661720c5a6","Type":"ContainerStarted","Data":"e6a3588d25f330a2f6d85dd1fd6f65b46d2dadea223de1832a5c6cbfeb3eec10"} Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.600474 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.603688 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"80c83b5b-7221-4d68-b4e1-b8c622bfa7cf","Type":"ContainerDied","Data":"9535f058afb1ea279023bc65ddbe2acf5bd26c9bdb7ca2ecbdc376000e22aaa1"} Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.603723 4930 scope.go:117] "RemoveContainer" containerID="1a5c179b00bc7ad1bf2c025dadbf9a3b667bb42a416c99d5cad006fe189debea" Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.603771 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.607040 4930 generic.go:334] "Generic (PLEG): container finished" podID="f603981a-3581-49ba-8726-f21e981d4988" containerID="f6a746baccc0baea788f924a3f35120bccfd6e07f7331bc73e35b86127f35086" exitCode=0 Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.607120 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2133-account-create-h59cp" event={"ID":"f603981a-3581-49ba-8726-f21e981d4988","Type":"ContainerDied","Data":"f6a746baccc0baea788f924a3f35120bccfd6e07f7331bc73e35b86127f35086"} Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.607224 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2133-account-create-h59cp" event={"ID":"f603981a-3581-49ba-8726-f21e981d4988","Type":"ContainerStarted","Data":"bfca76718babc70d19bad9eb976798ead0e2a4a9ea971181fe62f9fc5fdb396c"} Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.624522 4930 scope.go:117] "RemoveContainer" containerID="7211811e0ae74c9852f66da818742cd8b77f9649efedc77809964b20d1e9d3c1" Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.625033 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371946.229761 podStartE2EDuration="1m30.625014698s" podCreationTimestamp="2025-10-12 05:56:08 +0000 UTC" firstStartedPulling="2025-10-12 05:56:10.056522138 +0000 UTC m=+902.598623903" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:57:38.625012538 +0000 UTC m=+991.167114303" watchObservedRunningTime="2025-10-12 05:57:38.625014698 +0000 UTC m=+991.167116463" Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.646096 4930 scope.go:117] "RemoveContainer" containerID="d411ed8b138f62ed0e0dca34c8730b2c8aff7bd99556f08a347d77ea720596dd" Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.675393 4930 scope.go:117] "RemoveContainer" containerID="1a7049180e067d43b47cdadecbf3e6ea6274e9fb255e153a5e74c52eeade26e4" Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.708824 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.907215073 podStartE2EDuration="1m30.708802139s" podCreationTimestamp="2025-10-12 05:56:08 +0000 UTC" firstStartedPulling="2025-10-12 05:56:10.499449004 +0000 UTC m=+903.041550769" lastFinishedPulling="2025-10-12 05:57:02.30103607 +0000 UTC m=+954.843137835" observedRunningTime="2025-10-12 05:57:38.693832166 +0000 UTC m=+991.235933931" watchObservedRunningTime="2025-10-12 05:57:38.708802139 +0000 UTC m=+991.250903904" Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.726700 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-notifications-server-0" podStartSLOduration=38.864774514 podStartE2EDuration="1m31.726683695s" podCreationTimestamp="2025-10-12 05:56:07 +0000 UTC" firstStartedPulling="2025-10-12 05:56:09.699812905 +0000 UTC m=+902.241914660" lastFinishedPulling="2025-10-12 05:57:02.561722076 +0000 UTC m=+955.103823841" observedRunningTime="2025-10-12 05:57:38.717847915 +0000 UTC m=+991.259949680" watchObservedRunningTime="2025-10-12 05:57:38.726683695 +0000 UTC m=+991.268785460" Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.759214 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 12 05:57:38 crc kubenswrapper[4930]: 
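One anomaly in the records above deserves a flag: rabbitmq-server-0 is logged with podStartSLOduration=-9223371946.229761 alongside lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC". The zero pull timestamp drives a time.Time subtraction that Go clamps to the minimum time.Duration, and adding the pod's real end-to-end startup time (1m30.625014698s) to that clamped minimum reproduces the logged figure exactly; the negative number is an overflow artifact, not a real regression, as the healthy ContainerStarted and readiness records above confirm. A small reconstruction from the logged values (not the latency tracker's actual code):

package main

import (
	"fmt"
	"time"
)

func main() {
	// Values taken from the rabbitmq-server-0 record above.
	firstStartedPulling := time.Date(2025, 10, 12, 5, 56, 10, 56522138, time.UTC)
	var lastFinishedPulling time.Time // zero value, as logged: 0001-01-01 00:00:00 +0000 UTC

	// Subtracting a real timestamp from the zero time overflows int64
	// nanoseconds, so time.Time.Sub clamps to the minimum Duration.
	pull := lastFinishedPulling.Sub(firstStartedPulling)
	fmt.Println(pull == -1<<63) // true: math.MinInt64 nanoseconds

	// Adding the pod's real end-to-end startup time to that clamped
	// minimum reproduces the logged figure exactly.
	e2e := 90*time.Second + 625014698*time.Nanosecond
	fmt.Println((pull + e2e).Seconds()) // -9.223371946229761e+09
}

The two healthy trackers that follow (rabbitmq-cell1-server-0 and rabbitmq-notifications-server-0) carry real lastFinishedPulling timestamps and correspondingly sane positive SLO durations.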
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.770093 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.773421 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-xldxb" podStartSLOduration=3.06580767 podStartE2EDuration="8.773404031s" podCreationTimestamp="2025-10-12 05:57:30 +0000 UTC" firstStartedPulling="2025-10-12 05:57:31.535267579 +0000 UTC m=+984.077369354" lastFinishedPulling="2025-10-12 05:57:37.24286395 +0000 UTC m=+989.784965715" observedRunningTime="2025-10-12 05:57:38.76855146 +0000 UTC m=+991.310653235" watchObservedRunningTime="2025-10-12 05:57:38.773404031 +0000 UTC m=+991.315505796"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.782847 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Oct 12 05:57:38 crc kubenswrapper[4930]: E1012 05:57:38.783172 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" containerName="config-reloader"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.783192 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" containerName="config-reloader"
Oct 12 05:57:38 crc kubenswrapper[4930]: E1012 05:57:38.783209 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" containerName="init-config-reloader"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.783217 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" containerName="init-config-reloader"
Oct 12 05:57:38 crc kubenswrapper[4930]: E1012 05:57:38.783227 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb57aef8-53ef-4894-8cfc-1ef708aedd9b" containerName="dnsmasq-dns"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.783233 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb57aef8-53ef-4894-8cfc-1ef708aedd9b" containerName="dnsmasq-dns"
Oct 12 05:57:38 crc kubenswrapper[4930]: E1012 05:57:38.783245 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" containerName="thanos-sidecar"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.783251 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" containerName="thanos-sidecar"
Oct 12 05:57:38 crc kubenswrapper[4930]: E1012 05:57:38.783265 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb57aef8-53ef-4894-8cfc-1ef708aedd9b" containerName="init"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.783271 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb57aef8-53ef-4894-8cfc-1ef708aedd9b" containerName="init"
Oct 12 05:57:38 crc kubenswrapper[4930]: E1012 05:57:38.783285 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" containerName="prometheus"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.783290 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" containerName="prometheus"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.783446 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" containerName="prometheus"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.783459 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" containerName="thanos-sidecar"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.783482 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" containerName="config-reloader"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.783493 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb57aef8-53ef-4894-8cfc-1ef708aedd9b" containerName="dnsmasq-dns"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.784942 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.786634 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.791288 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.791804 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.792064 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.792190 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.792442 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-jt8z4"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.810068 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.813146 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.854107 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.854179 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.854310 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e61b566b-95e6-4b55-801e-5db824bd5814-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.854337 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e61b566b-95e6-4b55-801e-5db824bd5814-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.854358 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e61b566b-95e6-4b55-801e-5db824bd5814-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.854498 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.854555 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95tbx\" (UniqueName: \"kubernetes.io/projected/e61b566b-95e6-4b55-801e-5db824bd5814-kube-api-access-95tbx\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.854597 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-22a7639f-0965-4497-b4ff-8d976229c443\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22a7639f-0965-4497-b4ff-8d976229c443\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.854647 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-config\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.854674 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.854710 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.957306 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.957384 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.957431 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.957486 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.957595 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e61b566b-95e6-4b55-801e-5db824bd5814-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.957618 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e61b566b-95e6-4b55-801e-5db824bd5814-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.957639 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e61b566b-95e6-4b55-801e-5db824bd5814-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.957685 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.957712 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95tbx\" (UniqueName: \"kubernetes.io/projected/e61b566b-95e6-4b55-801e-5db824bd5814-kube-api-access-95tbx\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.957765 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-22a7639f-0965-4497-b4ff-8d976229c443\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22a7639f-0965-4497-b4ff-8d976229c443\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.957797 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-config\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.962355 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e61b566b-95e6-4b55-801e-5db824bd5814-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.967353 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.967364 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.970995 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.971471 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e61b566b-95e6-4b55-801e-5db824bd5814-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.972084 4930 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.972198 4930 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-22a7639f-0965-4497-b4ff-8d976229c443\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22a7639f-0965-4497-b4ff-8d976229c443\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6e15b61105eae1f7086da32ef53b808da7b93145612971cfb218d121d2d8a399/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.973211 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.977800 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.978781 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e61b566b-95e6-4b55-801e-5db824bd5814-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.981218 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-config\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:38 crc kubenswrapper[4930]: I1012 05:57:38.985773 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95tbx\" (UniqueName: \"kubernetes.io/projected/e61b566b-95e6-4b55-801e-5db824bd5814-kube-api-access-95tbx\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:39 crc kubenswrapper[4930]: I1012 05:57:39.022901 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-22a7639f-0965-4497-b4ff-8d976229c443\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22a7639f-0965-4497-b4ff-8d976229c443\") pod \"prometheus-metric-storage-0\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:39 crc kubenswrapper[4930]: I1012 05:57:39.103644 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Oct 12 05:57:39 crc kubenswrapper[4930]: I1012 05:57:39.640314 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Oct 12 05:57:39 crc kubenswrapper[4930]: W1012 05:57:39.647364 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode61b566b_95e6_4b55_801e_5db824bd5814.slice/crio-df157038ed553b6eebc5062e237ab86c75f87a280181a46fadc4869d751f3e7e WatchSource:0}: Error finding container df157038ed553b6eebc5062e237ab86c75f87a280181a46fadc4869d751f3e7e: Status 404 returned error can't find the container with id df157038ed553b6eebc5062e237ab86c75f87a280181a46fadc4869d751f3e7e
Oct 12 05:57:40 crc kubenswrapper[4930]: I1012 05:57:40.055280 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Oct 12 05:57:40 crc kubenswrapper[4930]: I1012 05:57:40.145359 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80c83b5b-7221-4d68-b4e1-b8c622bfa7cf" path="/var/lib/kubelet/pods/80c83b5b-7221-4d68-b4e1-b8c622bfa7cf/volumes"
Oct 12 05:57:40 crc kubenswrapper[4930]: I1012 05:57:40.155210 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2133-account-create-h59cp"
Oct 12 05:57:40 crc kubenswrapper[4930]: I1012 05:57:40.162151 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b403-account-create-lzlnf"
Oct 12 05:57:40 crc kubenswrapper[4930]: I1012 05:57:40.170167 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-8e21-account-create-645jw"
Oct 12 05:57:40 crc kubenswrapper[4930]: I1012 05:57:40.283693 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlkzx\" (UniqueName: \"kubernetes.io/projected/31682f1f-3cd9-4ad2-a3ed-46a6c686c0f3-kube-api-access-nlkzx\") pod \"31682f1f-3cd9-4ad2-a3ed-46a6c686c0f3\" (UID: \"31682f1f-3cd9-4ad2-a3ed-46a6c686c0f3\") "
Oct 12 05:57:40 crc kubenswrapper[4930]: I1012 05:57:40.283760 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fnx9\" (UniqueName: \"kubernetes.io/projected/bd2f9a6f-cb9e-4db3-8165-e2cc18f1daff-kube-api-access-9fnx9\") pod \"bd2f9a6f-cb9e-4db3-8165-e2cc18f1daff\" (UID: \"bd2f9a6f-cb9e-4db3-8165-e2cc18f1daff\") "
Oct 12 05:57:40 crc kubenswrapper[4930]: I1012 05:57:40.283784 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9sxx\" (UniqueName: \"kubernetes.io/projected/f603981a-3581-49ba-8726-f21e981d4988-kube-api-access-d9sxx\") pod \"f603981a-3581-49ba-8726-f21e981d4988\" (UID: \"f603981a-3581-49ba-8726-f21e981d4988\") "
Oct 12 05:57:40 crc kubenswrapper[4930]: I1012 05:57:40.289946 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f603981a-3581-49ba-8726-f21e981d4988-kube-api-access-d9sxx" (OuterVolumeSpecName: "kube-api-access-d9sxx") pod "f603981a-3581-49ba-8726-f21e981d4988" (UID: "f603981a-3581-49ba-8726-f21e981d4988"). InnerVolumeSpecName "kube-api-access-d9sxx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 05:57:40 crc kubenswrapper[4930]: I1012 05:57:40.293564 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd2f9a6f-cb9e-4db3-8165-e2cc18f1daff-kube-api-access-9fnx9" (OuterVolumeSpecName: "kube-api-access-9fnx9") pod "bd2f9a6f-cb9e-4db3-8165-e2cc18f1daff" (UID: "bd2f9a6f-cb9e-4db3-8165-e2cc18f1daff"). InnerVolumeSpecName "kube-api-access-9fnx9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 05:57:40 crc kubenswrapper[4930]: I1012 05:57:40.303072 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31682f1f-3cd9-4ad2-a3ed-46a6c686c0f3-kube-api-access-nlkzx" (OuterVolumeSpecName: "kube-api-access-nlkzx") pod "31682f1f-3cd9-4ad2-a3ed-46a6c686c0f3" (UID: "31682f1f-3cd9-4ad2-a3ed-46a6c686c0f3"). InnerVolumeSpecName "kube-api-access-nlkzx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 05:57:40 crc kubenswrapper[4930]: I1012 05:57:40.385907 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlkzx\" (UniqueName: \"kubernetes.io/projected/31682f1f-3cd9-4ad2-a3ed-46a6c686c0f3-kube-api-access-nlkzx\") on node \"crc\" DevicePath \"\""
Oct 12 05:57:40 crc kubenswrapper[4930]: I1012 05:57:40.385944 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fnx9\" (UniqueName: \"kubernetes.io/projected/bd2f9a6f-cb9e-4db3-8165-e2cc18f1daff-kube-api-access-9fnx9\") on node \"crc\" DevicePath \"\""
Oct 12 05:57:40 crc kubenswrapper[4930]: I1012 05:57:40.385954 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9sxx\" (UniqueName: \"kubernetes.io/projected/f603981a-3581-49ba-8726-f21e981d4988-kube-api-access-d9sxx\") on node \"crc\" DevicePath \"\""
Oct 12 05:57:40 crc kubenswrapper[4930]: I1012 05:57:40.625941 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2133-account-create-h59cp" event={"ID":"f603981a-3581-49ba-8726-f21e981d4988","Type":"ContainerDied","Data":"bfca76718babc70d19bad9eb976798ead0e2a4a9ea971181fe62f9fc5fdb396c"}
Oct 12 05:57:40 crc kubenswrapper[4930]: I1012 05:57:40.625972 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2133-account-create-h59cp"
Oct 12 05:57:40 crc kubenswrapper[4930]: I1012 05:57:40.625990 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfca76718babc70d19bad9eb976798ead0e2a4a9ea971181fe62f9fc5fdb396c"
Oct 12 05:57:40 crc kubenswrapper[4930]: I1012 05:57:40.626847 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e61b566b-95e6-4b55-801e-5db824bd5814","Type":"ContainerStarted","Data":"df157038ed553b6eebc5062e237ab86c75f87a280181a46fadc4869d751f3e7e"}
Oct 12 05:57:40 crc kubenswrapper[4930]: I1012 05:57:40.628311 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b403-account-create-lzlnf"
Oct 12 05:57:40 crc kubenswrapper[4930]: I1012 05:57:40.628320 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b403-account-create-lzlnf" event={"ID":"bd2f9a6f-cb9e-4db3-8165-e2cc18f1daff","Type":"ContainerDied","Data":"b55e16316b1def7216a42ee29e9884253945d8aeef2fff2354c30b29976b1254"}
Oct 12 05:57:40 crc kubenswrapper[4930]: I1012 05:57:40.628356 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b55e16316b1def7216a42ee29e9884253945d8aeef2fff2354c30b29976b1254"
Oct 12 05:57:40 crc kubenswrapper[4930]: I1012 05:57:40.629299 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-8e21-account-create-645jw" event={"ID":"31682f1f-3cd9-4ad2-a3ed-46a6c686c0f3","Type":"ContainerDied","Data":"87c5c41776df4d01c173104930bf83396a62b1f51420897f0d00f9f4a86aa33b"}
Oct 12 05:57:40 crc kubenswrapper[4930]: I1012 05:57:40.629320 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87c5c41776df4d01c173104930bf83396a62b1f51420897f0d00f9f4a86aa33b"
Oct 12 05:57:40 crc kubenswrapper[4930]: I1012 05:57:40.629352 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-8e21-account-create-645jw"
Oct 12 05:57:42 crc kubenswrapper[4930]: I1012 05:57:42.424938 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-etc-swift\") pod \"swift-storage-0\" (UID: \"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3\") " pod="openstack/swift-storage-0"
Oct 12 05:57:42 crc kubenswrapper[4930]: E1012 05:57:42.425129 4930 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 12 05:57:42 crc kubenswrapper[4930]: E1012 05:57:42.425546 4930 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 12 05:57:42 crc kubenswrapper[4930]: E1012 05:57:42.425620 4930 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-etc-swift podName:073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3 nodeName:}" failed. No retries permitted until 2025-10-12 05:57:58.425598624 +0000 UTC m=+1010.967700409 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-etc-swift") pod "swift-storage-0" (UID: "073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3") : configmap "swift-ring-files" not found Oct 12 05:57:43 crc kubenswrapper[4930]: I1012 05:57:43.657116 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e61b566b-95e6-4b55-801e-5db824bd5814","Type":"ContainerStarted","Data":"e5f53a09911ea7eb99e996b3e8fa5996ff29ee3e390ac9dd75dc650aed71ad43"} Oct 12 05:57:45 crc kubenswrapper[4930]: I1012 05:57:45.678767 4930 generic.go:334] "Generic (PLEG): container finished" podID="df839ac4-27ff-436b-b328-55b948887fce" containerID="0ffac965cf81d1a0b2c3ca4aaecb28935c5fafdc28b4816f7e48804a22d68a46" exitCode=0 Oct 12 05:57:45 crc kubenswrapper[4930]: I1012 05:57:45.678865 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xldxb" event={"ID":"df839ac4-27ff-436b-b328-55b948887fce","Type":"ContainerDied","Data":"0ffac965cf81d1a0b2c3ca4aaecb28935c5fafdc28b4816f7e48804a22d68a46"} Oct 12 05:57:47 crc kubenswrapper[4930]: I1012 05:57:47.115362 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-xldxb" Oct 12 05:57:47 crc kubenswrapper[4930]: I1012 05:57:47.216262 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/df839ac4-27ff-436b-b328-55b948887fce-ring-data-devices\") pod \"df839ac4-27ff-436b-b328-55b948887fce\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " Oct 12 05:57:47 crc kubenswrapper[4930]: I1012 05:57:47.216392 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df839ac4-27ff-436b-b328-55b948887fce-scripts\") pod \"df839ac4-27ff-436b-b328-55b948887fce\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " Oct 12 05:57:47 crc kubenswrapper[4930]: I1012 05:57:47.216429 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/df839ac4-27ff-436b-b328-55b948887fce-dispersionconf\") pod \"df839ac4-27ff-436b-b328-55b948887fce\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " Oct 12 05:57:47 crc kubenswrapper[4930]: I1012 05:57:47.216459 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/df839ac4-27ff-436b-b328-55b948887fce-swiftconf\") pod \"df839ac4-27ff-436b-b328-55b948887fce\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " Oct 12 05:57:47 crc kubenswrapper[4930]: I1012 05:57:47.216525 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkwcc\" (UniqueName: \"kubernetes.io/projected/df839ac4-27ff-436b-b328-55b948887fce-kube-api-access-nkwcc\") pod \"df839ac4-27ff-436b-b328-55b948887fce\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " Oct 12 05:57:47 crc kubenswrapper[4930]: I1012 05:57:47.216631 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/df839ac4-27ff-436b-b328-55b948887fce-etc-swift\") pod \"df839ac4-27ff-436b-b328-55b948887fce\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " Oct 12 05:57:47 crc kubenswrapper[4930]: I1012 05:57:47.216704 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df839ac4-27ff-436b-b328-55b948887fce-combined-ca-bundle\") pod \"df839ac4-27ff-436b-b328-55b948887fce\" (UID: \"df839ac4-27ff-436b-b328-55b948887fce\") " Oct 12 05:57:47 crc kubenswrapper[4930]: I1012 05:57:47.217208 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df839ac4-27ff-436b-b328-55b948887fce-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "df839ac4-27ff-436b-b328-55b948887fce" (UID: "df839ac4-27ff-436b-b328-55b948887fce"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:57:47 crc kubenswrapper[4930]: I1012 05:57:47.217578 4930 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/df839ac4-27ff-436b-b328-55b948887fce-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:47 crc kubenswrapper[4930]: I1012 05:57:47.217845 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df839ac4-27ff-436b-b328-55b948887fce-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "df839ac4-27ff-436b-b328-55b948887fce" (UID: "df839ac4-27ff-436b-b328-55b948887fce"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:57:47 crc kubenswrapper[4930]: I1012 05:57:47.223653 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df839ac4-27ff-436b-b328-55b948887fce-kube-api-access-nkwcc" (OuterVolumeSpecName: "kube-api-access-nkwcc") pod "df839ac4-27ff-436b-b328-55b948887fce" (UID: "df839ac4-27ff-436b-b328-55b948887fce"). InnerVolumeSpecName "kube-api-access-nkwcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:57:47 crc kubenswrapper[4930]: I1012 05:57:47.231025 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df839ac4-27ff-436b-b328-55b948887fce-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "df839ac4-27ff-436b-b328-55b948887fce" (UID: "df839ac4-27ff-436b-b328-55b948887fce"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:57:47 crc kubenswrapper[4930]: I1012 05:57:47.244911 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df839ac4-27ff-436b-b328-55b948887fce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df839ac4-27ff-436b-b328-55b948887fce" (UID: "df839ac4-27ff-436b-b328-55b948887fce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:57:47 crc kubenswrapper[4930]: I1012 05:57:47.248510 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df839ac4-27ff-436b-b328-55b948887fce-scripts" (OuterVolumeSpecName: "scripts") pod "df839ac4-27ff-436b-b328-55b948887fce" (UID: "df839ac4-27ff-436b-b328-55b948887fce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:57:47 crc kubenswrapper[4930]: I1012 05:57:47.260469 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df839ac4-27ff-436b-b328-55b948887fce-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "df839ac4-27ff-436b-b328-55b948887fce" (UID: "df839ac4-27ff-436b-b328-55b948887fce"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:57:47 crc kubenswrapper[4930]: I1012 05:57:47.319888 4930 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/df839ac4-27ff-436b-b328-55b948887fce-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:47 crc kubenswrapper[4930]: I1012 05:57:47.319921 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df839ac4-27ff-436b-b328-55b948887fce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:47 crc kubenswrapper[4930]: I1012 05:57:47.319935 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df839ac4-27ff-436b-b328-55b948887fce-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:47 crc kubenswrapper[4930]: I1012 05:57:47.319946 4930 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/df839ac4-27ff-436b-b328-55b948887fce-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:47 crc kubenswrapper[4930]: I1012 05:57:47.319957 4930 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/df839ac4-27ff-436b-b328-55b948887fce-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:47 crc kubenswrapper[4930]: I1012 05:57:47.319967 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkwcc\" (UniqueName: \"kubernetes.io/projected/df839ac4-27ff-436b-b328-55b948887fce-kube-api-access-nkwcc\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:47 crc kubenswrapper[4930]: I1012 05:57:47.699897 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xldxb" event={"ID":"df839ac4-27ff-436b-b328-55b948887fce","Type":"ContainerDied","Data":"36e54e155194a03e21d2e57acce27019e3980e2b546c616cb951fa702c63f9f6"} Oct 12 05:57:47 crc kubenswrapper[4930]: I1012 05:57:47.700203 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36e54e155194a03e21d2e57acce27019e3980e2b546c616cb951fa702c63f9f6" Oct 12 05:57:47 crc kubenswrapper[4930]: I1012 05:57:47.700011 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-xldxb" Oct 12 05:57:48 crc kubenswrapper[4930]: I1012 05:57:48.851380 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nw5dm" podUID="ddccae59-8916-4bd7-bffa-041cf574e89e" containerName="ovn-controller" probeResult="failure" output=< Oct 12 05:57:48 crc kubenswrapper[4930]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 12 05:57:48 crc kubenswrapper[4930]: > Oct 12 05:57:48 crc kubenswrapper[4930]: I1012 05:57:48.879082 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hjqfl" Oct 12 05:57:48 crc kubenswrapper[4930]: I1012 05:57:48.883438 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hjqfl" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.096941 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nw5dm-config-xngzj"] Oct 12 05:57:49 crc kubenswrapper[4930]: E1012 05:57:49.097564 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31682f1f-3cd9-4ad2-a3ed-46a6c686c0f3" containerName="mariadb-account-create" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.097596 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="31682f1f-3cd9-4ad2-a3ed-46a6c686c0f3" containerName="mariadb-account-create" Oct 12 05:57:49 crc kubenswrapper[4930]: E1012 05:57:49.097621 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd2f9a6f-cb9e-4db3-8165-e2cc18f1daff" containerName="mariadb-account-create" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.097634 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd2f9a6f-cb9e-4db3-8165-e2cc18f1daff" containerName="mariadb-account-create" Oct 12 05:57:49 crc kubenswrapper[4930]: E1012 05:57:49.097672 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df839ac4-27ff-436b-b328-55b948887fce" containerName="swift-ring-rebalance" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.097688 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="df839ac4-27ff-436b-b328-55b948887fce" containerName="swift-ring-rebalance" Oct 12 05:57:49 crc kubenswrapper[4930]: E1012 05:57:49.097720 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f603981a-3581-49ba-8726-f21e981d4988" containerName="mariadb-account-create" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.097732 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="f603981a-3581-49ba-8726-f21e981d4988" containerName="mariadb-account-create" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.098050 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd2f9a6f-cb9e-4db3-8165-e2cc18f1daff" containerName="mariadb-account-create" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.098085 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="f603981a-3581-49ba-8726-f21e981d4988" containerName="mariadb-account-create" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.098122 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="df839ac4-27ff-436b-b328-55b948887fce" containerName="swift-ring-rebalance" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.098139 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="31682f1f-3cd9-4ad2-a3ed-46a6c686c0f3" containerName="mariadb-account-create" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 
05:57:49.099074 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nw5dm-config-xngzj" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.103635 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.104868 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nw5dm-config-xngzj"] Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.235001 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-notifications-server-0" podUID="1594846a-5c2f-49f8-9bea-22661720c5a6" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.108:5671: connect: connection refused" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.258395 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-additional-scripts\") pod \"ovn-controller-nw5dm-config-xngzj\" (UID: \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\") " pod="openstack/ovn-controller-nw5dm-config-xngzj" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.258712 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-var-run\") pod \"ovn-controller-nw5dm-config-xngzj\" (UID: \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\") " pod="openstack/ovn-controller-nw5dm-config-xngzj" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.258862 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-var-run-ovn\") pod \"ovn-controller-nw5dm-config-xngzj\" (UID: \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\") " pod="openstack/ovn-controller-nw5dm-config-xngzj" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.258981 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc6m5\" (UniqueName: \"kubernetes.io/projected/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-kube-api-access-cc6m5\") pod \"ovn-controller-nw5dm-config-xngzj\" (UID: \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\") " pod="openstack/ovn-controller-nw5dm-config-xngzj" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.259089 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-var-log-ovn\") pod \"ovn-controller-nw5dm-config-xngzj\" (UID: \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\") " pod="openstack/ovn-controller-nw5dm-config-xngzj" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.259195 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-scripts\") pod \"ovn-controller-nw5dm-config-xngzj\" (UID: \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\") " pod="openstack/ovn-controller-nw5dm-config-xngzj" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.361185 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-additional-scripts\") pod \"ovn-controller-nw5dm-config-xngzj\" (UID: \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\") " pod="openstack/ovn-controller-nw5dm-config-xngzj" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.361267 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-var-run\") pod \"ovn-controller-nw5dm-config-xngzj\" (UID: \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\") " pod="openstack/ovn-controller-nw5dm-config-xngzj" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.361314 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-var-run-ovn\") pod \"ovn-controller-nw5dm-config-xngzj\" (UID: \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\") " pod="openstack/ovn-controller-nw5dm-config-xngzj" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.361361 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc6m5\" (UniqueName: \"kubernetes.io/projected/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-kube-api-access-cc6m5\") pod \"ovn-controller-nw5dm-config-xngzj\" (UID: \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\") " pod="openstack/ovn-controller-nw5dm-config-xngzj" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.361417 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-var-log-ovn\") pod \"ovn-controller-nw5dm-config-xngzj\" (UID: \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\") " pod="openstack/ovn-controller-nw5dm-config-xngzj" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.361470 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-scripts\") pod \"ovn-controller-nw5dm-config-xngzj\" (UID: \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\") " pod="openstack/ovn-controller-nw5dm-config-xngzj" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.361887 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-var-log-ovn\") pod \"ovn-controller-nw5dm-config-xngzj\" (UID: \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\") " pod="openstack/ovn-controller-nw5dm-config-xngzj" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.362002 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-var-run\") pod \"ovn-controller-nw5dm-config-xngzj\" (UID: \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\") " pod="openstack/ovn-controller-nw5dm-config-xngzj" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.362098 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-var-run-ovn\") pod \"ovn-controller-nw5dm-config-xngzj\" (UID: \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\") " pod="openstack/ovn-controller-nw5dm-config-xngzj" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.362589 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-additional-scripts\") pod \"ovn-controller-nw5dm-config-xngzj\" (UID: \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\") " pod="openstack/ovn-controller-nw5dm-config-xngzj" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.365110 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-scripts\") pod \"ovn-controller-nw5dm-config-xngzj\" (UID: \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\") " pod="openstack/ovn-controller-nw5dm-config-xngzj" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.394972 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc6m5\" (UniqueName: \"kubernetes.io/projected/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-kube-api-access-cc6m5\") pod \"ovn-controller-nw5dm-config-xngzj\" (UID: \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\") " pod="openstack/ovn-controller-nw5dm-config-xngzj" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.459989 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nw5dm-config-xngzj" Oct 12 05:57:49 crc kubenswrapper[4930]: I1012 05:57:49.546027 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="bad3587e-d515-4add-9edd-da341fe519b7" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused" Oct 12 05:57:50 crc kubenswrapper[4930]: I1012 05:57:49.937324 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nw5dm-config-xngzj"] Oct 12 05:57:50 crc kubenswrapper[4930]: I1012 05:57:49.938193 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="fe886795-2501-4474-bfbc-9febcc5113f3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.110:5671: connect: connection refused" Oct 12 05:57:50 crc kubenswrapper[4930]: W1012 05:57:49.945383 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ffe8a0c_572d_4d13_b7da_9877408f9fbe.slice/crio-9e787dc27150554758ea6bb5fe4adae50546e62cac593e4f77d471e4b4c99362 WatchSource:0}: Error finding container 9e787dc27150554758ea6bb5fe4adae50546e62cac593e4f77d471e4b4c99362: Status 404 returned error can't find the container with id 9e787dc27150554758ea6bb5fe4adae50546e62cac593e4f77d471e4b4c99362 Oct 12 05:57:50 crc kubenswrapper[4930]: I1012 05:57:50.729753 4930 generic.go:334] "Generic (PLEG): container finished" podID="4ffe8a0c-572d-4d13-b7da-9877408f9fbe" containerID="20d540d2eb93bee693da7417e88176ccc63ce67b45c9423965b7eccef1b42fd9" exitCode=0 Oct 12 05:57:50 crc kubenswrapper[4930]: I1012 05:57:50.729989 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nw5dm-config-xngzj" event={"ID":"4ffe8a0c-572d-4d13-b7da-9877408f9fbe","Type":"ContainerDied","Data":"20d540d2eb93bee693da7417e88176ccc63ce67b45c9423965b7eccef1b42fd9"} Oct 12 05:57:50 crc kubenswrapper[4930]: I1012 05:57:50.730013 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nw5dm-config-xngzj" event={"ID":"4ffe8a0c-572d-4d13-b7da-9877408f9fbe","Type":"ContainerStarted","Data":"9e787dc27150554758ea6bb5fe4adae50546e62cac593e4f77d471e4b4c99362"} Oct 12 05:57:52 crc kubenswrapper[4930]: I1012 05:57:52.289009 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nw5dm-config-xngzj" Oct 12 05:57:52 crc kubenswrapper[4930]: I1012 05:57:52.416304 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc6m5\" (UniqueName: \"kubernetes.io/projected/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-kube-api-access-cc6m5\") pod \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\" (UID: \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\") " Oct 12 05:57:52 crc kubenswrapper[4930]: I1012 05:57:52.417261 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-scripts\") pod \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\" (UID: \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\") " Oct 12 05:57:52 crc kubenswrapper[4930]: I1012 05:57:52.417513 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-var-run-ovn\") pod \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\" (UID: \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\") " Oct 12 05:57:52 crc kubenswrapper[4930]: I1012 05:57:52.417604 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-var-run\") pod \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\" (UID: \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\") " Oct 12 05:57:52 crc kubenswrapper[4930]: I1012 05:57:52.417714 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-additional-scripts\") pod \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\" (UID: \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\") " Oct 12 05:57:52 crc kubenswrapper[4930]: I1012 05:57:52.417879 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-var-log-ovn\") pod \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\" (UID: \"4ffe8a0c-572d-4d13-b7da-9877408f9fbe\") " Oct 12 05:57:52 crc kubenswrapper[4930]: I1012 05:57:52.417531 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "4ffe8a0c-572d-4d13-b7da-9877408f9fbe" (UID: "4ffe8a0c-572d-4d13-b7da-9877408f9fbe"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 05:57:52 crc kubenswrapper[4930]: I1012 05:57:52.417800 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-var-run" (OuterVolumeSpecName: "var-run") pod "4ffe8a0c-572d-4d13-b7da-9877408f9fbe" (UID: "4ffe8a0c-572d-4d13-b7da-9877408f9fbe"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 05:57:52 crc kubenswrapper[4930]: I1012 05:57:52.418423 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "4ffe8a0c-572d-4d13-b7da-9877408f9fbe" (UID: "4ffe8a0c-572d-4d13-b7da-9877408f9fbe"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:57:52 crc kubenswrapper[4930]: I1012 05:57:52.419168 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "4ffe8a0c-572d-4d13-b7da-9877408f9fbe" (UID: "4ffe8a0c-572d-4d13-b7da-9877408f9fbe"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 05:57:52 crc kubenswrapper[4930]: I1012 05:57:52.419350 4930 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:52 crc kubenswrapper[4930]: I1012 05:57:52.419405 4930 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-var-run\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:52 crc kubenswrapper[4930]: I1012 05:57:52.419430 4930 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:52 crc kubenswrapper[4930]: I1012 05:57:52.419459 4930 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:52 crc kubenswrapper[4930]: I1012 05:57:52.421450 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-scripts" (OuterVolumeSpecName: "scripts") pod "4ffe8a0c-572d-4d13-b7da-9877408f9fbe" (UID: "4ffe8a0c-572d-4d13-b7da-9877408f9fbe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:57:52 crc kubenswrapper[4930]: I1012 05:57:52.429011 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-kube-api-access-cc6m5" (OuterVolumeSpecName: "kube-api-access-cc6m5") pod "4ffe8a0c-572d-4d13-b7da-9877408f9fbe" (UID: "4ffe8a0c-572d-4d13-b7da-9877408f9fbe"). InnerVolumeSpecName "kube-api-access-cc6m5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:57:52 crc kubenswrapper[4930]: I1012 05:57:52.523531 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc6m5\" (UniqueName: \"kubernetes.io/projected/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-kube-api-access-cc6m5\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:52 crc kubenswrapper[4930]: I1012 05:57:52.523587 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ffe8a0c-572d-4d13-b7da-9877408f9fbe-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 05:57:52 crc kubenswrapper[4930]: I1012 05:57:52.754449 4930 generic.go:334] "Generic (PLEG): container finished" podID="e61b566b-95e6-4b55-801e-5db824bd5814" containerID="e5f53a09911ea7eb99e996b3e8fa5996ff29ee3e390ac9dd75dc650aed71ad43" exitCode=0 Oct 12 05:57:52 crc kubenswrapper[4930]: I1012 05:57:52.754526 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e61b566b-95e6-4b55-801e-5db824bd5814","Type":"ContainerDied","Data":"e5f53a09911ea7eb99e996b3e8fa5996ff29ee3e390ac9dd75dc650aed71ad43"} Oct 12 05:57:52 crc kubenswrapper[4930]: I1012 05:57:52.757192 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nw5dm-config-xngzj" event={"ID":"4ffe8a0c-572d-4d13-b7da-9877408f9fbe","Type":"ContainerDied","Data":"9e787dc27150554758ea6bb5fe4adae50546e62cac593e4f77d471e4b4c99362"} Oct 12 05:57:52 crc kubenswrapper[4930]: I1012 05:57:52.757229 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e787dc27150554758ea6bb5fe4adae50546e62cac593e4f77d471e4b4c99362" Oct 12 05:57:52 crc kubenswrapper[4930]: I1012 05:57:52.757270 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nw5dm-config-xngzj" Oct 12 05:57:53 crc kubenswrapper[4930]: I1012 05:57:53.389116 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-nw5dm-config-xngzj"] Oct 12 05:57:53 crc kubenswrapper[4930]: I1012 05:57:53.395415 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-nw5dm-config-xngzj"] Oct 12 05:57:53 crc kubenswrapper[4930]: I1012 05:57:53.766899 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e61b566b-95e6-4b55-801e-5db824bd5814","Type":"ContainerStarted","Data":"f82954f05556e368bc6165bff1860b8a1559662c84bf6880efb75d0c9b9212ff"} Oct 12 05:57:53 crc kubenswrapper[4930]: I1012 05:57:53.872752 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-nw5dm" Oct 12 05:57:54 crc kubenswrapper[4930]: I1012 05:57:54.148232 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ffe8a0c-572d-4d13-b7da-9877408f9fbe" path="/var/lib/kubelet/pods/4ffe8a0c-572d-4d13-b7da-9877408f9fbe/volumes" Oct 12 05:57:56 crc kubenswrapper[4930]: I1012 05:57:56.804606 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e61b566b-95e6-4b55-801e-5db824bd5814","Type":"ContainerStarted","Data":"674b9b89d3de6274121c17ead64e7b7d2689ff475296e079738918f34652dc44"} Oct 12 05:57:56 crc kubenswrapper[4930]: I1012 05:57:56.805084 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e61b566b-95e6-4b55-801e-5db824bd5814","Type":"ContainerStarted","Data":"c167b25d47d309eb148d4becbc53110fc7657fa84771075ed535e97196965593"} Oct 12 05:57:56 crc kubenswrapper[4930]: I1012 05:57:56.843314 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=18.843288866 podStartE2EDuration="18.843288866s" podCreationTimestamp="2025-10-12 05:57:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:57:56.835975873 +0000 UTC m=+1009.378077668" watchObservedRunningTime="2025-10-12 05:57:56.843288866 +0000 UTC m=+1009.385390671" Oct 12 05:57:58 crc kubenswrapper[4930]: I1012 05:57:58.428824 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-etc-swift\") pod \"swift-storage-0\" (UID: \"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3\") " pod="openstack/swift-storage-0" Oct 12 05:57:58 crc kubenswrapper[4930]: I1012 05:57:58.441243 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3-etc-swift\") pod \"swift-storage-0\" (UID: \"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3\") " pod="openstack/swift-storage-0" Oct 12 05:57:58 crc kubenswrapper[4930]: I1012 05:57:58.588149 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 12 05:57:59 crc kubenswrapper[4930]: I1012 05:57:59.104003 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 12 05:57:59 crc kubenswrapper[4930]: I1012 05:57:59.236274 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-notifications-server-0" Oct 12 05:57:59 crc kubenswrapper[4930]: I1012 05:57:59.474829 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-sxb5z"] Oct 12 05:57:59 crc kubenswrapper[4930]: E1012 05:57:59.475445 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ffe8a0c-572d-4d13-b7da-9877408f9fbe" containerName="ovn-config" Oct 12 05:57:59 crc kubenswrapper[4930]: I1012 05:57:59.475461 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ffe8a0c-572d-4d13-b7da-9877408f9fbe" containerName="ovn-config" Oct 12 05:57:59 crc kubenswrapper[4930]: I1012 05:57:59.475674 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ffe8a0c-572d-4d13-b7da-9877408f9fbe" containerName="ovn-config" Oct 12 05:57:59 crc kubenswrapper[4930]: I1012 05:57:59.476429 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-sxb5z" Oct 12 05:57:59 crc kubenswrapper[4930]: I1012 05:57:59.489543 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-sxb5z"] Oct 12 05:57:59 crc kubenswrapper[4930]: I1012 05:57:59.544958 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 12 05:57:59 crc kubenswrapper[4930]: I1012 05:57:59.551702 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdjpk\" (UniqueName: \"kubernetes.io/projected/53492810-e3e8-42c4-b6ec-df0913d9c969-kube-api-access-pdjpk\") pod \"glance-db-create-sxb5z\" (UID: \"53492810-e3e8-42c4-b6ec-df0913d9c969\") " pod="openstack/glance-db-create-sxb5z" Oct 12 05:57:59 crc kubenswrapper[4930]: I1012 05:57:59.653664 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdjpk\" (UniqueName: \"kubernetes.io/projected/53492810-e3e8-42c4-b6ec-df0913d9c969-kube-api-access-pdjpk\") pod \"glance-db-create-sxb5z\" (UID: \"53492810-e3e8-42c4-b6ec-df0913d9c969\") " pod="openstack/glance-db-create-sxb5z" Oct 12 05:57:59 crc kubenswrapper[4930]: I1012 05:57:59.673556 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdjpk\" (UniqueName: \"kubernetes.io/projected/53492810-e3e8-42c4-b6ec-df0913d9c969-kube-api-access-pdjpk\") pod \"glance-db-create-sxb5z\" (UID: \"53492810-e3e8-42c4-b6ec-df0913d9c969\") " pod="openstack/glance-db-create-sxb5z" Oct 12 05:57:59 crc kubenswrapper[4930]: I1012 05:57:59.681519 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 12 05:57:59 crc kubenswrapper[4930]: I1012 05:57:59.806558 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-sxb5z" Oct 12 05:57:59 crc kubenswrapper[4930]: I1012 05:57:59.864596 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3","Type":"ContainerStarted","Data":"5d574749a0547b8ba53183239dc24eea32dafe986aae03b332ad07d2044aed95"} Oct 12 05:57:59 crc kubenswrapper[4930]: I1012 05:57:59.937981 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 12 05:58:00 crc kubenswrapper[4930]: I1012 05:58:00.299097 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-sxb5z"] Oct 12 05:58:00 crc kubenswrapper[4930]: I1012 05:58:00.874575 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-x67ml"] Oct 12 05:58:00 crc kubenswrapper[4930]: I1012 05:58:00.875870 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-x67ml" Oct 12 05:58:00 crc kubenswrapper[4930]: I1012 05:58:00.886257 4930 generic.go:334] "Generic (PLEG): container finished" podID="53492810-e3e8-42c4-b6ec-df0913d9c969" containerID="c9309d12f84eac3d78b59c6275628f9c0d95432ed1ede6bafcae3d969bfe6dd9" exitCode=0 Oct 12 05:58:00 crc kubenswrapper[4930]: I1012 05:58:00.886298 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-sxb5z" event={"ID":"53492810-e3e8-42c4-b6ec-df0913d9c969","Type":"ContainerDied","Data":"c9309d12f84eac3d78b59c6275628f9c0d95432ed1ede6bafcae3d969bfe6dd9"} Oct 12 05:58:00 crc kubenswrapper[4930]: I1012 05:58:00.886324 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-sxb5z" event={"ID":"53492810-e3e8-42c4-b6ec-df0913d9c969","Type":"ContainerStarted","Data":"6383d2f1a78059954b3e720c055ce93ab78ab6412f3a109ea1507a18120a1e51"} Oct 12 05:58:00 crc kubenswrapper[4930]: I1012 05:58:00.893671 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-x67ml"] Oct 12 05:58:00 crc kubenswrapper[4930]: I1012 05:58:00.974780 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ckxl\" (UniqueName: \"kubernetes.io/projected/a81240c3-2ae0-494f-af8e-8e9d60aca47b-kube-api-access-8ckxl\") pod \"barbican-db-create-x67ml\" (UID: \"a81240c3-2ae0-494f-af8e-8e9d60aca47b\") " pod="openstack/barbican-db-create-x67ml" Oct 12 05:58:00 crc kubenswrapper[4930]: I1012 05:58:00.987503 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-xqv8n"] Oct 12 05:58:00 crc kubenswrapper[4930]: I1012 05:58:00.989059 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xqv8n" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.000092 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xqv8n"] Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.029717 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-wqxq9"] Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.030763 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-wqxq9" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.033653 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-bcmxb" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.034224 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.076924 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n449c\" (UniqueName: \"kubernetes.io/projected/acfc6300-e2c5-40ba-885c-f3f62a3f6e29-kube-api-access-n449c\") pod \"cinder-db-create-xqv8n\" (UID: \"acfc6300-e2c5-40ba-885c-f3f62a3f6e29\") " pod="openstack/cinder-db-create-xqv8n" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.076991 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ckxl\" (UniqueName: \"kubernetes.io/projected/a81240c3-2ae0-494f-af8e-8e9d60aca47b-kube-api-access-8ckxl\") pod \"barbican-db-create-x67ml\" (UID: \"a81240c3-2ae0-494f-af8e-8e9d60aca47b\") " pod="openstack/barbican-db-create-x67ml" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.086552 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-wqxq9"] Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.123954 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ckxl\" (UniqueName: \"kubernetes.io/projected/a81240c3-2ae0-494f-af8e-8e9d60aca47b-kube-api-access-8ckxl\") pod \"barbican-db-create-x67ml\" (UID: \"a81240c3-2ae0-494f-af8e-8e9d60aca47b\") " pod="openstack/barbican-db-create-x67ml" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.179005 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n449c\" (UniqueName: \"kubernetes.io/projected/acfc6300-e2c5-40ba-885c-f3f62a3f6e29-kube-api-access-n449c\") pod \"cinder-db-create-xqv8n\" (UID: \"acfc6300-e2c5-40ba-885c-f3f62a3f6e29\") " pod="openstack/cinder-db-create-xqv8n" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.179071 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96917cd8-6c91-463b-b8de-9d854e0ee581-combined-ca-bundle\") pod \"watcher-db-sync-wqxq9\" (UID: \"96917cd8-6c91-463b-b8de-9d854e0ee581\") " pod="openstack/watcher-db-sync-wqxq9" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.179100 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d89gg\" (UniqueName: \"kubernetes.io/projected/96917cd8-6c91-463b-b8de-9d854e0ee581-kube-api-access-d89gg\") pod \"watcher-db-sync-wqxq9\" (UID: \"96917cd8-6c91-463b-b8de-9d854e0ee581\") " pod="openstack/watcher-db-sync-wqxq9" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.179136 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96917cd8-6c91-463b-b8de-9d854e0ee581-config-data\") pod \"watcher-db-sync-wqxq9\" (UID: \"96917cd8-6c91-463b-b8de-9d854e0ee581\") " pod="openstack/watcher-db-sync-wqxq9" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.179161 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/96917cd8-6c91-463b-b8de-9d854e0ee581-db-sync-config-data\") pod \"watcher-db-sync-wqxq9\" (UID: \"96917cd8-6c91-463b-b8de-9d854e0ee581\") " pod="openstack/watcher-db-sync-wqxq9" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.179692 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-j8k6q"] Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.180716 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-j8k6q" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.203417 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-j8k6q"] Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.203863 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-x67ml" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.204653 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n449c\" (UniqueName: \"kubernetes.io/projected/acfc6300-e2c5-40ba-885c-f3f62a3f6e29-kube-api-access-n449c\") pod \"cinder-db-create-xqv8n\" (UID: \"acfc6300-e2c5-40ba-885c-f3f62a3f6e29\") " pod="openstack/cinder-db-create-xqv8n" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.281669 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96917cd8-6c91-463b-b8de-9d854e0ee581-combined-ca-bundle\") pod \"watcher-db-sync-wqxq9\" (UID: \"96917cd8-6c91-463b-b8de-9d854e0ee581\") " pod="openstack/watcher-db-sync-wqxq9" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.281751 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d89gg\" (UniqueName: \"kubernetes.io/projected/96917cd8-6c91-463b-b8de-9d854e0ee581-kube-api-access-d89gg\") pod \"watcher-db-sync-wqxq9\" (UID: \"96917cd8-6c91-463b-b8de-9d854e0ee581\") " pod="openstack/watcher-db-sync-wqxq9" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.281814 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkgpw\" (UniqueName: \"kubernetes.io/projected/53923311-a7cd-46ee-b287-26dd3cc96916-kube-api-access-wkgpw\") pod \"neutron-db-create-j8k6q\" (UID: \"53923311-a7cd-46ee-b287-26dd3cc96916\") " pod="openstack/neutron-db-create-j8k6q" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.281862 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96917cd8-6c91-463b-b8de-9d854e0ee581-config-data\") pod \"watcher-db-sync-wqxq9\" (UID: \"96917cd8-6c91-463b-b8de-9d854e0ee581\") " pod="openstack/watcher-db-sync-wqxq9" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.281894 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96917cd8-6c91-463b-b8de-9d854e0ee581-db-sync-config-data\") pod \"watcher-db-sync-wqxq9\" (UID: \"96917cd8-6c91-463b-b8de-9d854e0ee581\") " pod="openstack/watcher-db-sync-wqxq9" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.289399 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96917cd8-6c91-463b-b8de-9d854e0ee581-config-data\") pod \"watcher-db-sync-wqxq9\" (UID: \"96917cd8-6c91-463b-b8de-9d854e0ee581\") " pod="openstack/watcher-db-sync-wqxq9" Oct 12 
05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.290844 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96917cd8-6c91-463b-b8de-9d854e0ee581-db-sync-config-data\") pod \"watcher-db-sync-wqxq9\" (UID: \"96917cd8-6c91-463b-b8de-9d854e0ee581\") " pod="openstack/watcher-db-sync-wqxq9" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.291887 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-wndmm"] Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.293223 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wndmm" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.295389 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96917cd8-6c91-463b-b8de-9d854e0ee581-combined-ca-bundle\") pod \"watcher-db-sync-wqxq9\" (UID: \"96917cd8-6c91-463b-b8de-9d854e0ee581\") " pod="openstack/watcher-db-sync-wqxq9" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.301113 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.301395 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.301576 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.309117 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-hkmzw" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.309458 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xqv8n" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.332420 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d89gg\" (UniqueName: \"kubernetes.io/projected/96917cd8-6c91-463b-b8de-9d854e0ee581-kube-api-access-d89gg\") pod \"watcher-db-sync-wqxq9\" (UID: \"96917cd8-6c91-463b-b8de-9d854e0ee581\") " pod="openstack/watcher-db-sync-wqxq9" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.336448 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wndmm"] Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.365314 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-wqxq9" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.408761 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkgpw\" (UniqueName: \"kubernetes.io/projected/53923311-a7cd-46ee-b287-26dd3cc96916-kube-api-access-wkgpw\") pod \"neutron-db-create-j8k6q\" (UID: \"53923311-a7cd-46ee-b287-26dd3cc96916\") " pod="openstack/neutron-db-create-j8k6q" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.409099 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9bb388c-0196-4f0c-9567-2ef5b30889dd-combined-ca-bundle\") pod \"keystone-db-sync-wndmm\" (UID: \"b9bb388c-0196-4f0c-9567-2ef5b30889dd\") " pod="openstack/keystone-db-sync-wndmm" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.409131 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9bb388c-0196-4f0c-9567-2ef5b30889dd-config-data\") pod \"keystone-db-sync-wndmm\" (UID: \"b9bb388c-0196-4f0c-9567-2ef5b30889dd\") " pod="openstack/keystone-db-sync-wndmm" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.409148 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4clt2\" (UniqueName: \"kubernetes.io/projected/b9bb388c-0196-4f0c-9567-2ef5b30889dd-kube-api-access-4clt2\") pod \"keystone-db-sync-wndmm\" (UID: \"b9bb388c-0196-4f0c-9567-2ef5b30889dd\") " pod="openstack/keystone-db-sync-wndmm" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.448380 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkgpw\" (UniqueName: \"kubernetes.io/projected/53923311-a7cd-46ee-b287-26dd3cc96916-kube-api-access-wkgpw\") pod \"neutron-db-create-j8k6q\" (UID: \"53923311-a7cd-46ee-b287-26dd3cc96916\") " pod="openstack/neutron-db-create-j8k6q" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.519647 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9bb388c-0196-4f0c-9567-2ef5b30889dd-combined-ca-bundle\") pod \"keystone-db-sync-wndmm\" (UID: \"b9bb388c-0196-4f0c-9567-2ef5b30889dd\") " pod="openstack/keystone-db-sync-wndmm" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.519712 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9bb388c-0196-4f0c-9567-2ef5b30889dd-config-data\") pod \"keystone-db-sync-wndmm\" (UID: \"b9bb388c-0196-4f0c-9567-2ef5b30889dd\") " pod="openstack/keystone-db-sync-wndmm" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.519728 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4clt2\" (UniqueName: \"kubernetes.io/projected/b9bb388c-0196-4f0c-9567-2ef5b30889dd-kube-api-access-4clt2\") pod \"keystone-db-sync-wndmm\" (UID: \"b9bb388c-0196-4f0c-9567-2ef5b30889dd\") " pod="openstack/keystone-db-sync-wndmm" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.534507 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9bb388c-0196-4f0c-9567-2ef5b30889dd-combined-ca-bundle\") pod \"keystone-db-sync-wndmm\" (UID: \"b9bb388c-0196-4f0c-9567-2ef5b30889dd\") " pod="openstack/keystone-db-sync-wndmm" 
Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.542446 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9bb388c-0196-4f0c-9567-2ef5b30889dd-config-data\") pod \"keystone-db-sync-wndmm\" (UID: \"b9bb388c-0196-4f0c-9567-2ef5b30889dd\") " pod="openstack/keystone-db-sync-wndmm" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.545318 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4clt2\" (UniqueName: \"kubernetes.io/projected/b9bb388c-0196-4f0c-9567-2ef5b30889dd-kube-api-access-4clt2\") pod \"keystone-db-sync-wndmm\" (UID: \"b9bb388c-0196-4f0c-9567-2ef5b30889dd\") " pod="openstack/keystone-db-sync-wndmm" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.593103 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-j8k6q" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.657304 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wndmm" Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.899046 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3","Type":"ContainerStarted","Data":"b2b185c9ed6f016b1895a0ee15f463bf2713f3ce2e3590bed52e16a2fc14d1d2"} Oct 12 05:58:01 crc kubenswrapper[4930]: I1012 05:58:01.899517 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3","Type":"ContainerStarted","Data":"7c9a63efd72b517dbcb4e1d3db63845964c7f6b7596e4908762e11c861f83ac2"} Oct 12 05:58:02 crc kubenswrapper[4930]: I1012 05:58:02.082129 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-x67ml"] Oct 12 05:58:02 crc kubenswrapper[4930]: I1012 05:58:02.088646 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xqv8n"] Oct 12 05:58:02 crc kubenswrapper[4930]: I1012 05:58:02.301458 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-wqxq9"] Oct 12 05:58:02 crc kubenswrapper[4930]: I1012 05:58:02.403187 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-j8k6q"] Oct 12 05:58:02 crc kubenswrapper[4930]: I1012 05:58:02.422630 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-sxb5z" Oct 12 05:58:02 crc kubenswrapper[4930]: I1012 05:58:02.443118 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wndmm"] Oct 12 05:58:02 crc kubenswrapper[4930]: W1012 05:58:02.459942 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9bb388c_0196_4f0c_9567_2ef5b30889dd.slice/crio-013acc0458f454f4abcfd0e868c4d08fd13fef15d0e56b40389180dc871c6bac WatchSource:0}: Error finding container 013acc0458f454f4abcfd0e868c4d08fd13fef15d0e56b40389180dc871c6bac: Status 404 returned error can't find the container with id 013acc0458f454f4abcfd0e868c4d08fd13fef15d0e56b40389180dc871c6bac Oct 12 05:58:02 crc kubenswrapper[4930]: E1012 05:58:02.502559 4930 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.111:50598->38.102.83.111:46517: write tcp 38.102.83.111:50598->38.102.83.111:46517: write: broken pipe Oct 12 05:58:02 crc kubenswrapper[4930]: I1012 05:58:02.545464 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdjpk\" (UniqueName: \"kubernetes.io/projected/53492810-e3e8-42c4-b6ec-df0913d9c969-kube-api-access-pdjpk\") pod \"53492810-e3e8-42c4-b6ec-df0913d9c969\" (UID: \"53492810-e3e8-42c4-b6ec-df0913d9c969\") " Oct 12 05:58:02 crc kubenswrapper[4930]: I1012 05:58:02.558981 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53492810-e3e8-42c4-b6ec-df0913d9c969-kube-api-access-pdjpk" (OuterVolumeSpecName: "kube-api-access-pdjpk") pod "53492810-e3e8-42c4-b6ec-df0913d9c969" (UID: "53492810-e3e8-42c4-b6ec-df0913d9c969"). InnerVolumeSpecName "kube-api-access-pdjpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:58:02 crc kubenswrapper[4930]: I1012 05:58:02.647423 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdjpk\" (UniqueName: \"kubernetes.io/projected/53492810-e3e8-42c4-b6ec-df0913d9c969-kube-api-access-pdjpk\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:02 crc kubenswrapper[4930]: I1012 05:58:02.924447 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3","Type":"ContainerStarted","Data":"3b83dc06441b00d2233069ac39e718c749584f9899d1e20ddaaade15efdb6909"} Oct 12 05:58:02 crc kubenswrapper[4930]: I1012 05:58:02.924694 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3","Type":"ContainerStarted","Data":"ab1434685ea4563d2704472c59facc2c8353b9e875053853bb44f0c4c4265175"} Oct 12 05:58:02 crc kubenswrapper[4930]: I1012 05:58:02.926572 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-sxb5z" Oct 12 05:58:02 crc kubenswrapper[4930]: I1012 05:58:02.926642 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-sxb5z" event={"ID":"53492810-e3e8-42c4-b6ec-df0913d9c969","Type":"ContainerDied","Data":"6383d2f1a78059954b3e720c055ce93ab78ab6412f3a109ea1507a18120a1e51"} Oct 12 05:58:02 crc kubenswrapper[4930]: I1012 05:58:02.926678 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6383d2f1a78059954b3e720c055ce93ab78ab6412f3a109ea1507a18120a1e51" Oct 12 05:58:02 crc kubenswrapper[4930]: I1012 05:58:02.927870 4930 generic.go:334] "Generic (PLEG): container finished" podID="acfc6300-e2c5-40ba-885c-f3f62a3f6e29" containerID="2ff4ac790498d9d4bc5cbc349cebba80d54288bb0a0ca16deb94f3e6a28c0a8b" exitCode=0 Oct 12 05:58:02 crc kubenswrapper[4930]: I1012 05:58:02.927953 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xqv8n" event={"ID":"acfc6300-e2c5-40ba-885c-f3f62a3f6e29","Type":"ContainerDied","Data":"2ff4ac790498d9d4bc5cbc349cebba80d54288bb0a0ca16deb94f3e6a28c0a8b"} Oct 12 05:58:02 crc kubenswrapper[4930]: I1012 05:58:02.928018 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xqv8n" event={"ID":"acfc6300-e2c5-40ba-885c-f3f62a3f6e29","Type":"ContainerStarted","Data":"dadcfecec4292c6a09e5f46f757343af3d80462d11cd4577dcd8176bd641ecb3"} Oct 12 05:58:02 crc kubenswrapper[4930]: I1012 05:58:02.929493 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-wqxq9" event={"ID":"96917cd8-6c91-463b-b8de-9d854e0ee581","Type":"ContainerStarted","Data":"9c4c94360cafa338e010f1d05de7087d0a6b99f1b7ebe313f90b4fb117d5ce62"} Oct 12 05:58:02 crc kubenswrapper[4930]: I1012 05:58:02.930924 4930 generic.go:334] "Generic (PLEG): container finished" podID="53923311-a7cd-46ee-b287-26dd3cc96916" containerID="4f2c0af4131f43daff6c7e90cceb9c7eae521130630264d618da1cb0f8d63b1d" exitCode=0 Oct 12 05:58:02 crc kubenswrapper[4930]: I1012 05:58:02.930985 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-j8k6q" event={"ID":"53923311-a7cd-46ee-b287-26dd3cc96916","Type":"ContainerDied","Data":"4f2c0af4131f43daff6c7e90cceb9c7eae521130630264d618da1cb0f8d63b1d"} Oct 12 05:58:02 crc kubenswrapper[4930]: I1012 05:58:02.931007 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-j8k6q" event={"ID":"53923311-a7cd-46ee-b287-26dd3cc96916","Type":"ContainerStarted","Data":"97af1e92f8dd670d37c2e9ea87aa1ec4684963dbdcbdfdd27d4fa3b6ae6d6869"} Oct 12 05:58:02 crc kubenswrapper[4930]: I1012 05:58:02.932219 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wndmm" event={"ID":"b9bb388c-0196-4f0c-9567-2ef5b30889dd","Type":"ContainerStarted","Data":"013acc0458f454f4abcfd0e868c4d08fd13fef15d0e56b40389180dc871c6bac"} Oct 12 05:58:02 crc kubenswrapper[4930]: I1012 05:58:02.938855 4930 generic.go:334] "Generic (PLEG): container finished" podID="a81240c3-2ae0-494f-af8e-8e9d60aca47b" containerID="8dcda72439b2b02f5cabb35d03b2d4beff4f959d2cb7a59112130545921e1872" exitCode=0 Oct 12 05:58:02 crc kubenswrapper[4930]: I1012 05:58:02.938954 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-x67ml" event={"ID":"a81240c3-2ae0-494f-af8e-8e9d60aca47b","Type":"ContainerDied","Data":"8dcda72439b2b02f5cabb35d03b2d4beff4f959d2cb7a59112130545921e1872"} Oct 12 05:58:02 crc kubenswrapper[4930]: 
I1012 05:58:02.939016 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-x67ml" event={"ID":"a81240c3-2ae0-494f-af8e-8e9d60aca47b","Type":"ContainerStarted","Data":"8955478edaf5edd1eeeed8c9e589a476da8b5bcc8a4accda21dc5fbaabc323e9"} Oct 12 05:58:04 crc kubenswrapper[4930]: I1012 05:58:04.004972 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3","Type":"ContainerStarted","Data":"14aff2d153cab9644a8dacd92a1086423c07bf5bbb34a224aa3ecdfe996921ef"} Oct 12 05:58:04 crc kubenswrapper[4930]: I1012 05:58:04.496396 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xqv8n" Oct 12 05:58:04 crc kubenswrapper[4930]: I1012 05:58:04.589920 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n449c\" (UniqueName: \"kubernetes.io/projected/acfc6300-e2c5-40ba-885c-f3f62a3f6e29-kube-api-access-n449c\") pod \"acfc6300-e2c5-40ba-885c-f3f62a3f6e29\" (UID: \"acfc6300-e2c5-40ba-885c-f3f62a3f6e29\") " Oct 12 05:58:04 crc kubenswrapper[4930]: I1012 05:58:04.600220 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acfc6300-e2c5-40ba-885c-f3f62a3f6e29-kube-api-access-n449c" (OuterVolumeSpecName: "kube-api-access-n449c") pod "acfc6300-e2c5-40ba-885c-f3f62a3f6e29" (UID: "acfc6300-e2c5-40ba-885c-f3f62a3f6e29"). InnerVolumeSpecName "kube-api-access-n449c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:58:04 crc kubenswrapper[4930]: I1012 05:58:04.683647 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-x67ml" Oct 12 05:58:04 crc kubenswrapper[4930]: I1012 05:58:04.691786 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n449c\" (UniqueName: \"kubernetes.io/projected/acfc6300-e2c5-40ba-885c-f3f62a3f6e29-kube-api-access-n449c\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:04 crc kubenswrapper[4930]: I1012 05:58:04.726200 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-j8k6q" Oct 12 05:58:04 crc kubenswrapper[4930]: I1012 05:58:04.799381 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ckxl\" (UniqueName: \"kubernetes.io/projected/a81240c3-2ae0-494f-af8e-8e9d60aca47b-kube-api-access-8ckxl\") pod \"a81240c3-2ae0-494f-af8e-8e9d60aca47b\" (UID: \"a81240c3-2ae0-494f-af8e-8e9d60aca47b\") " Oct 12 05:58:04 crc kubenswrapper[4930]: I1012 05:58:04.799838 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkgpw\" (UniqueName: \"kubernetes.io/projected/53923311-a7cd-46ee-b287-26dd3cc96916-kube-api-access-wkgpw\") pod \"53923311-a7cd-46ee-b287-26dd3cc96916\" (UID: \"53923311-a7cd-46ee-b287-26dd3cc96916\") " Oct 12 05:58:04 crc kubenswrapper[4930]: I1012 05:58:04.804505 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a81240c3-2ae0-494f-af8e-8e9d60aca47b-kube-api-access-8ckxl" (OuterVolumeSpecName: "kube-api-access-8ckxl") pod "a81240c3-2ae0-494f-af8e-8e9d60aca47b" (UID: "a81240c3-2ae0-494f-af8e-8e9d60aca47b"). InnerVolumeSpecName "kube-api-access-8ckxl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:58:04 crc kubenswrapper[4930]: I1012 05:58:04.805408 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53923311-a7cd-46ee-b287-26dd3cc96916-kube-api-access-wkgpw" (OuterVolumeSpecName: "kube-api-access-wkgpw") pod "53923311-a7cd-46ee-b287-26dd3cc96916" (UID: "53923311-a7cd-46ee-b287-26dd3cc96916"). InnerVolumeSpecName "kube-api-access-wkgpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:58:04 crc kubenswrapper[4930]: I1012 05:58:04.901800 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ckxl\" (UniqueName: \"kubernetes.io/projected/a81240c3-2ae0-494f-af8e-8e9d60aca47b-kube-api-access-8ckxl\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:04 crc kubenswrapper[4930]: I1012 05:58:04.901835 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkgpw\" (UniqueName: \"kubernetes.io/projected/53923311-a7cd-46ee-b287-26dd3cc96916-kube-api-access-wkgpw\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:05 crc kubenswrapper[4930]: I1012 05:58:05.018315 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-x67ml" Oct 12 05:58:05 crc kubenswrapper[4930]: I1012 05:58:05.018327 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-x67ml" event={"ID":"a81240c3-2ae0-494f-af8e-8e9d60aca47b","Type":"ContainerDied","Data":"8955478edaf5edd1eeeed8c9e589a476da8b5bcc8a4accda21dc5fbaabc323e9"} Oct 12 05:58:05 crc kubenswrapper[4930]: I1012 05:58:05.018370 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8955478edaf5edd1eeeed8c9e589a476da8b5bcc8a4accda21dc5fbaabc323e9" Oct 12 05:58:05 crc kubenswrapper[4930]: I1012 05:58:05.022756 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3","Type":"ContainerStarted","Data":"095c9d0e4f143183dce8c371aa9f02a555ea4db0353384c063adcaaec1214542"} Oct 12 05:58:05 crc kubenswrapper[4930]: I1012 05:58:05.022786 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3","Type":"ContainerStarted","Data":"7a75739162472096672e86df415790b65838d956eb8ad4d8d3b6048763332312"} Oct 12 05:58:05 crc kubenswrapper[4930]: I1012 05:58:05.022797 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3","Type":"ContainerStarted","Data":"6aa74e78eabb32705fa2c25e4a5f14f53dcfc8c73d8505da2e629a3a11693c95"} Oct 12 05:58:05 crc kubenswrapper[4930]: I1012 05:58:05.024291 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xqv8n" event={"ID":"acfc6300-e2c5-40ba-885c-f3f62a3f6e29","Type":"ContainerDied","Data":"dadcfecec4292c6a09e5f46f757343af3d80462d11cd4577dcd8176bd641ecb3"} Oct 12 05:58:05 crc kubenswrapper[4930]: I1012 05:58:05.024313 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dadcfecec4292c6a09e5f46f757343af3d80462d11cd4577dcd8176bd641ecb3" Oct 12 05:58:05 crc kubenswrapper[4930]: I1012 05:58:05.024361 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-xqv8n" Oct 12 05:58:05 crc kubenswrapper[4930]: I1012 05:58:05.029922 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-j8k6q" event={"ID":"53923311-a7cd-46ee-b287-26dd3cc96916","Type":"ContainerDied","Data":"97af1e92f8dd670d37c2e9ea87aa1ec4684963dbdcbdfdd27d4fa3b6ae6d6869"} Oct 12 05:58:05 crc kubenswrapper[4930]: I1012 05:58:05.029962 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97af1e92f8dd670d37c2e9ea87aa1ec4684963dbdcbdfdd27d4fa3b6ae6d6869" Oct 12 05:58:05 crc kubenswrapper[4930]: I1012 05:58:05.029971 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-j8k6q" Oct 12 05:58:06 crc kubenswrapper[4930]: I1012 05:58:06.046612 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3","Type":"ContainerStarted","Data":"0f51cfab588b04c7e66eac4139cf3ffa8504954d5d41d85659c4696a3154d223"} Oct 12 05:58:09 crc kubenswrapper[4930]: I1012 05:58:09.104551 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 12 05:58:09 crc kubenswrapper[4930]: I1012 05:58:09.111146 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 12 05:58:09 crc kubenswrapper[4930]: I1012 05:58:09.625128 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b9d1-account-create-np4q6"] Oct 12 05:58:09 crc kubenswrapper[4930]: E1012 05:58:09.625863 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53492810-e3e8-42c4-b6ec-df0913d9c969" containerName="mariadb-database-create" Oct 12 05:58:09 crc kubenswrapper[4930]: I1012 05:58:09.625884 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="53492810-e3e8-42c4-b6ec-df0913d9c969" containerName="mariadb-database-create" Oct 12 05:58:09 crc kubenswrapper[4930]: E1012 05:58:09.625905 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acfc6300-e2c5-40ba-885c-f3f62a3f6e29" containerName="mariadb-database-create" Oct 12 05:58:09 crc kubenswrapper[4930]: I1012 05:58:09.625914 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="acfc6300-e2c5-40ba-885c-f3f62a3f6e29" containerName="mariadb-database-create" Oct 12 05:58:09 crc kubenswrapper[4930]: E1012 05:58:09.625959 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a81240c3-2ae0-494f-af8e-8e9d60aca47b" containerName="mariadb-database-create" Oct 12 05:58:09 crc kubenswrapper[4930]: I1012 05:58:09.625971 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="a81240c3-2ae0-494f-af8e-8e9d60aca47b" containerName="mariadb-database-create" Oct 12 05:58:09 crc kubenswrapper[4930]: E1012 05:58:09.625984 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53923311-a7cd-46ee-b287-26dd3cc96916" containerName="mariadb-database-create" Oct 12 05:58:09 crc kubenswrapper[4930]: I1012 05:58:09.625993 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="53923311-a7cd-46ee-b287-26dd3cc96916" containerName="mariadb-database-create" Oct 12 05:58:09 crc kubenswrapper[4930]: I1012 05:58:09.626217 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="a81240c3-2ae0-494f-af8e-8e9d60aca47b" containerName="mariadb-database-create" Oct 12 05:58:09 crc kubenswrapper[4930]: I1012 05:58:09.626237 4930 
memory_manager.go:354] "RemoveStaleState removing state" podUID="acfc6300-e2c5-40ba-885c-f3f62a3f6e29" containerName="mariadb-database-create" Oct 12 05:58:09 crc kubenswrapper[4930]: I1012 05:58:09.626251 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="53492810-e3e8-42c4-b6ec-df0913d9c969" containerName="mariadb-database-create" Oct 12 05:58:09 crc kubenswrapper[4930]: I1012 05:58:09.626279 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="53923311-a7cd-46ee-b287-26dd3cc96916" containerName="mariadb-database-create" Oct 12 05:58:09 crc kubenswrapper[4930]: I1012 05:58:09.627018 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b9d1-account-create-np4q6" Oct 12 05:58:09 crc kubenswrapper[4930]: I1012 05:58:09.629963 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 12 05:58:09 crc kubenswrapper[4930]: I1012 05:58:09.635452 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b9d1-account-create-np4q6"] Oct 12 05:58:09 crc kubenswrapper[4930]: I1012 05:58:09.724564 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvnnr\" (UniqueName: \"kubernetes.io/projected/38bf3863-9d2b-4158-afb6-e1feb4963f15-kube-api-access-qvnnr\") pod \"glance-b9d1-account-create-np4q6\" (UID: \"38bf3863-9d2b-4158-afb6-e1feb4963f15\") " pod="openstack/glance-b9d1-account-create-np4q6" Oct 12 05:58:09 crc kubenswrapper[4930]: I1012 05:58:09.827003 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvnnr\" (UniqueName: \"kubernetes.io/projected/38bf3863-9d2b-4158-afb6-e1feb4963f15-kube-api-access-qvnnr\") pod \"glance-b9d1-account-create-np4q6\" (UID: \"38bf3863-9d2b-4158-afb6-e1feb4963f15\") " pod="openstack/glance-b9d1-account-create-np4q6" Oct 12 05:58:09 crc kubenswrapper[4930]: I1012 05:58:09.856418 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvnnr\" (UniqueName: \"kubernetes.io/projected/38bf3863-9d2b-4158-afb6-e1feb4963f15-kube-api-access-qvnnr\") pod \"glance-b9d1-account-create-np4q6\" (UID: \"38bf3863-9d2b-4158-afb6-e1feb4963f15\") " pod="openstack/glance-b9d1-account-create-np4q6" Oct 12 05:58:09 crc kubenswrapper[4930]: I1012 05:58:09.955583 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b9d1-account-create-np4q6" Oct 12 05:58:10 crc kubenswrapper[4930]: I1012 05:58:10.091262 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 12 05:58:11 crc kubenswrapper[4930]: I1012 05:58:11.033433 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8362-account-create-r4fxc"] Oct 12 05:58:11 crc kubenswrapper[4930]: I1012 05:58:11.034779 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8362-account-create-r4fxc" Oct 12 05:58:11 crc kubenswrapper[4930]: I1012 05:58:11.037441 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 12 05:58:11 crc kubenswrapper[4930]: I1012 05:58:11.043039 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8362-account-create-r4fxc"] Oct 12 05:58:11 crc kubenswrapper[4930]: I1012 05:58:11.084646 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkntf\" (UniqueName: \"kubernetes.io/projected/f5a8fa49-dfc5-4c99-b1b1-fc2795013a1c-kube-api-access-wkntf\") pod \"cinder-8362-account-create-r4fxc\" (UID: \"f5a8fa49-dfc5-4c99-b1b1-fc2795013a1c\") " pod="openstack/cinder-8362-account-create-r4fxc" Oct 12 05:58:11 crc kubenswrapper[4930]: I1012 05:58:11.186848 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkntf\" (UniqueName: \"kubernetes.io/projected/f5a8fa49-dfc5-4c99-b1b1-fc2795013a1c-kube-api-access-wkntf\") pod \"cinder-8362-account-create-r4fxc\" (UID: \"f5a8fa49-dfc5-4c99-b1b1-fc2795013a1c\") " pod="openstack/cinder-8362-account-create-r4fxc" Oct 12 05:58:11 crc kubenswrapper[4930]: I1012 05:58:11.204660 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkntf\" (UniqueName: \"kubernetes.io/projected/f5a8fa49-dfc5-4c99-b1b1-fc2795013a1c-kube-api-access-wkntf\") pod \"cinder-8362-account-create-r4fxc\" (UID: \"f5a8fa49-dfc5-4c99-b1b1-fc2795013a1c\") " pod="openstack/cinder-8362-account-create-r4fxc" Oct 12 05:58:11 crc kubenswrapper[4930]: I1012 05:58:11.250533 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-5ea2-account-create-5srb4"] Oct 12 05:58:11 crc kubenswrapper[4930]: I1012 05:58:11.253717 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5ea2-account-create-5srb4" Oct 12 05:58:11 crc kubenswrapper[4930]: I1012 05:58:11.264809 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 12 05:58:11 crc kubenswrapper[4930]: I1012 05:58:11.284099 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-5ea2-account-create-5srb4"] Oct 12 05:58:11 crc kubenswrapper[4930]: I1012 05:58:11.391937 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59hbj\" (UniqueName: \"kubernetes.io/projected/678827f4-28a3-4cb5-9886-96e11e89c172-kube-api-access-59hbj\") pod \"barbican-5ea2-account-create-5srb4\" (UID: \"678827f4-28a3-4cb5-9886-96e11e89c172\") " pod="openstack/barbican-5ea2-account-create-5srb4" Oct 12 05:58:11 crc kubenswrapper[4930]: I1012 05:58:11.394247 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8362-account-create-r4fxc" Oct 12 05:58:11 crc kubenswrapper[4930]: I1012 05:58:11.439828 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-a944-account-create-nrsz7"] Oct 12 05:58:11 crc kubenswrapper[4930]: I1012 05:58:11.441175 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-a944-account-create-nrsz7" Oct 12 05:58:11 crc kubenswrapper[4930]: I1012 05:58:11.443344 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 12 05:58:11 crc kubenswrapper[4930]: I1012 05:58:11.447430 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-a944-account-create-nrsz7"] Oct 12 05:58:11 crc kubenswrapper[4930]: I1012 05:58:11.493626 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59hbj\" (UniqueName: \"kubernetes.io/projected/678827f4-28a3-4cb5-9886-96e11e89c172-kube-api-access-59hbj\") pod \"barbican-5ea2-account-create-5srb4\" (UID: \"678827f4-28a3-4cb5-9886-96e11e89c172\") " pod="openstack/barbican-5ea2-account-create-5srb4" Oct 12 05:58:11 crc kubenswrapper[4930]: I1012 05:58:11.516139 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59hbj\" (UniqueName: \"kubernetes.io/projected/678827f4-28a3-4cb5-9886-96e11e89c172-kube-api-access-59hbj\") pod \"barbican-5ea2-account-create-5srb4\" (UID: \"678827f4-28a3-4cb5-9886-96e11e89c172\") " pod="openstack/barbican-5ea2-account-create-5srb4" Oct 12 05:58:11 crc kubenswrapper[4930]: I1012 05:58:11.592973 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5ea2-account-create-5srb4" Oct 12 05:58:11 crc kubenswrapper[4930]: I1012 05:58:11.594307 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqmg9\" (UniqueName: \"kubernetes.io/projected/f9377945-1809-42da-b3c2-f38d3d91a1a6-kube-api-access-cqmg9\") pod \"neutron-a944-account-create-nrsz7\" (UID: \"f9377945-1809-42da-b3c2-f38d3d91a1a6\") " pod="openstack/neutron-a944-account-create-nrsz7" Oct 12 05:58:11 crc kubenswrapper[4930]: I1012 05:58:11.696687 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqmg9\" (UniqueName: \"kubernetes.io/projected/f9377945-1809-42da-b3c2-f38d3d91a1a6-kube-api-access-cqmg9\") pod \"neutron-a944-account-create-nrsz7\" (UID: \"f9377945-1809-42da-b3c2-f38d3d91a1a6\") " pod="openstack/neutron-a944-account-create-nrsz7" Oct 12 05:58:11 crc kubenswrapper[4930]: I1012 05:58:11.714910 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqmg9\" (UniqueName: \"kubernetes.io/projected/f9377945-1809-42da-b3c2-f38d3d91a1a6-kube-api-access-cqmg9\") pod \"neutron-a944-account-create-nrsz7\" (UID: \"f9377945-1809-42da-b3c2-f38d3d91a1a6\") " pod="openstack/neutron-a944-account-create-nrsz7" Oct 12 05:58:11 crc kubenswrapper[4930]: I1012 05:58:11.769781 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-a944-account-create-nrsz7" Oct 12 05:58:14 crc kubenswrapper[4930]: I1012 05:58:14.458659 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-5ea2-account-create-5srb4"] Oct 12 05:58:14 crc kubenswrapper[4930]: I1012 05:58:14.484274 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-a944-account-create-nrsz7"] Oct 12 05:58:14 crc kubenswrapper[4930]: I1012 05:58:14.590761 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8362-account-create-r4fxc"] Oct 12 05:58:14 crc kubenswrapper[4930]: I1012 05:58:14.771228 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b9d1-account-create-np4q6"] Oct 12 05:58:15 crc kubenswrapper[4930]: I1012 05:58:15.175396 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3","Type":"ContainerStarted","Data":"d6d3a15ab6dacbce10359b9fe38614ec5e9a32cc3baf79e70003c0e8e6d01702"} Oct 12 05:58:15 crc kubenswrapper[4930]: I1012 05:58:15.175445 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3","Type":"ContainerStarted","Data":"9da119e5a27f21ffd94de8e0a61c928221053c02c3cd189846c5f53a928723fd"} Oct 12 05:58:15 crc kubenswrapper[4930]: I1012 05:58:15.175456 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3","Type":"ContainerStarted","Data":"97eb17d1c6de157eb774c97ac31b137f37c7bbad1dd39eae5d7006e6a22153fc"} Oct 12 05:58:15 crc kubenswrapper[4930]: I1012 05:58:15.175468 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3","Type":"ContainerStarted","Data":"cc90f005de2bb492ef7a35d064316a16e737cf167bc19f6726f52008867fbd84"} Oct 12 05:58:15 crc kubenswrapper[4930]: I1012 05:58:15.177355 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9d1-account-create-np4q6" event={"ID":"38bf3863-9d2b-4158-afb6-e1feb4963f15","Type":"ContainerStarted","Data":"5edc2ca24f7555be5ae806d6a71af41b96a8ecc0bc56d104bbb7b63241138cd8"} Oct 12 05:58:15 crc kubenswrapper[4930]: I1012 05:58:15.179674 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-wqxq9" event={"ID":"96917cd8-6c91-463b-b8de-9d854e0ee581","Type":"ContainerStarted","Data":"3052de8a3e627f0cfd9983c59ff403b54f3820761a61c595ef8caee6a1314cef"} Oct 12 05:58:15 crc kubenswrapper[4930]: I1012 05:58:15.180675 4930 generic.go:334] "Generic (PLEG): container finished" podID="678827f4-28a3-4cb5-9886-96e11e89c172" containerID="4bdf5fa1f489f1ddcb7a4da154479d9a13a53608a029a1324e4ba6df881293bd" exitCode=0 Oct 12 05:58:15 crc kubenswrapper[4930]: I1012 05:58:15.180749 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5ea2-account-create-5srb4" event={"ID":"678827f4-28a3-4cb5-9886-96e11e89c172","Type":"ContainerDied","Data":"4bdf5fa1f489f1ddcb7a4da154479d9a13a53608a029a1324e4ba6df881293bd"} Oct 12 05:58:15 crc kubenswrapper[4930]: I1012 05:58:15.180812 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5ea2-account-create-5srb4" event={"ID":"678827f4-28a3-4cb5-9886-96e11e89c172","Type":"ContainerStarted","Data":"5e4c9b826d64d669ad077316f58b190ed122fbe403be092defa0228dd6d880eb"} Oct 12 05:58:15 crc kubenswrapper[4930]: I1012 05:58:15.181979 4930 
generic.go:334] "Generic (PLEG): container finished" podID="f5a8fa49-dfc5-4c99-b1b1-fc2795013a1c" containerID="650dcdcd858d5b7beb90c3731640c5ccf71b4ef197dde82c17b6cc833219a1c8" exitCode=0 Oct 12 05:58:15 crc kubenswrapper[4930]: I1012 05:58:15.182075 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8362-account-create-r4fxc" event={"ID":"f5a8fa49-dfc5-4c99-b1b1-fc2795013a1c","Type":"ContainerDied","Data":"650dcdcd858d5b7beb90c3731640c5ccf71b4ef197dde82c17b6cc833219a1c8"} Oct 12 05:58:15 crc kubenswrapper[4930]: I1012 05:58:15.182153 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8362-account-create-r4fxc" event={"ID":"f5a8fa49-dfc5-4c99-b1b1-fc2795013a1c","Type":"ContainerStarted","Data":"14b5d5eaa1737e1d8c54e6398a58bc55784866581fcbac6f7229a6b62fb9767f"} Oct 12 05:58:15 crc kubenswrapper[4930]: I1012 05:58:15.184591 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wndmm" event={"ID":"b9bb388c-0196-4f0c-9567-2ef5b30889dd","Type":"ContainerStarted","Data":"08ce7ca7283a64b89f5cd5b7d08d7e47145a7c445f078df713a61e6404c9cb09"} Oct 12 05:58:15 crc kubenswrapper[4930]: I1012 05:58:15.188516 4930 generic.go:334] "Generic (PLEG): container finished" podID="f9377945-1809-42da-b3c2-f38d3d91a1a6" containerID="8e9b3463ba8a90c44ce9a067f41e90b9110e616c4f1008d907e7b4af331a38b1" exitCode=0 Oct 12 05:58:15 crc kubenswrapper[4930]: I1012 05:58:15.188558 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a944-account-create-nrsz7" event={"ID":"f9377945-1809-42da-b3c2-f38d3d91a1a6","Type":"ContainerDied","Data":"8e9b3463ba8a90c44ce9a067f41e90b9110e616c4f1008d907e7b4af331a38b1"} Oct 12 05:58:15 crc kubenswrapper[4930]: I1012 05:58:15.188579 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a944-account-create-nrsz7" event={"ID":"f9377945-1809-42da-b3c2-f38d3d91a1a6","Type":"ContainerStarted","Data":"272d4a2921404cf21f7d8fd2ca6c05a82e3aac02dc711ad0b97d7c107fc14097"} Oct 12 05:58:15 crc kubenswrapper[4930]: I1012 05:58:15.205330 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-wqxq9" podStartSLOduration=2.55891573 podStartE2EDuration="14.205307741s" podCreationTimestamp="2025-10-12 05:58:01 +0000 UTC" firstStartedPulling="2025-10-12 05:58:02.316198491 +0000 UTC m=+1014.858300256" lastFinishedPulling="2025-10-12 05:58:13.962590502 +0000 UTC m=+1026.504692267" observedRunningTime="2025-10-12 05:58:15.19844846 +0000 UTC m=+1027.740550225" watchObservedRunningTime="2025-10-12 05:58:15.205307741 +0000 UTC m=+1027.747409526" Oct 12 05:58:15 crc kubenswrapper[4930]: I1012 05:58:15.221463 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-wndmm" podStartSLOduration=2.7551915989999998 podStartE2EDuration="14.221440304s" podCreationTimestamp="2025-10-12 05:58:01 +0000 UTC" firstStartedPulling="2025-10-12 05:58:02.462534653 +0000 UTC m=+1015.004636418" lastFinishedPulling="2025-10-12 05:58:13.928783368 +0000 UTC m=+1026.470885123" observedRunningTime="2025-10-12 05:58:15.210406248 +0000 UTC m=+1027.752508013" watchObservedRunningTime="2025-10-12 05:58:15.221440304 +0000 UTC m=+1027.763542089" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.197749 4930 generic.go:334] "Generic (PLEG): container finished" podID="38bf3863-9d2b-4158-afb6-e1feb4963f15" containerID="2db4e0d07355e79407e76c1c971d84326154f48ef1080a39a3c069a8606f40b5" exitCode=0 Oct 12 05:58:16 crc 
kubenswrapper[4930]: I1012 05:58:16.197873 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9d1-account-create-np4q6" event={"ID":"38bf3863-9d2b-4158-afb6-e1feb4963f15","Type":"ContainerDied","Data":"2db4e0d07355e79407e76c1c971d84326154f48ef1080a39a3c069a8606f40b5"} Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.206874 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3","Type":"ContainerStarted","Data":"4cd62e117f83cbb27da963d06a1aee015f891772c3fa94ffef971d3401b78807"} Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.206971 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3","Type":"ContainerStarted","Data":"aeb379f8b0f23af92d6c1f2e52aed3b3b3e7985946a184ef3fe3eafd78b7e3b7"} Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.257654 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=45.377261357 podStartE2EDuration="51.25761401s" podCreationTimestamp="2025-10-12 05:57:25 +0000 UTC" firstStartedPulling="2025-10-12 05:57:59.691070786 +0000 UTC m=+1012.233172551" lastFinishedPulling="2025-10-12 05:58:05.571423439 +0000 UTC m=+1018.113525204" observedRunningTime="2025-10-12 05:58:16.255448565 +0000 UTC m=+1028.797550340" watchObservedRunningTime="2025-10-12 05:58:16.25761401 +0000 UTC m=+1028.799715765" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.578160 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55b99bf79c-xbwtf"] Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.588265 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.607584 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.610159 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55b99bf79c-xbwtf"] Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.740693 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-config\") pod \"dnsmasq-dns-55b99bf79c-xbwtf\" (UID: \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\") " pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.740751 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-dns-svc\") pod \"dnsmasq-dns-55b99bf79c-xbwtf\" (UID: \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\") " pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.740779 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-ovsdbserver-nb\") pod \"dnsmasq-dns-55b99bf79c-xbwtf\" (UID: \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\") " pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.740804 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-dns-swift-storage-0\") pod \"dnsmasq-dns-55b99bf79c-xbwtf\" (UID: \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\") " pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.740833 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thc66\" (UniqueName: \"kubernetes.io/projected/a43020f9-02f4-4d52-b00f-74fd6da4e46b-kube-api-access-thc66\") pod \"dnsmasq-dns-55b99bf79c-xbwtf\" (UID: \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\") " pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.740884 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-ovsdbserver-sb\") pod \"dnsmasq-dns-55b99bf79c-xbwtf\" (UID: \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\") " pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.780156 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5ea2-account-create-5srb4" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.786483 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8362-account-create-r4fxc" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.793152 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-a944-account-create-nrsz7" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.842997 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-config\") pod \"dnsmasq-dns-55b99bf79c-xbwtf\" (UID: \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\") " pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.843041 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-dns-svc\") pod \"dnsmasq-dns-55b99bf79c-xbwtf\" (UID: \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\") " pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.843062 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-ovsdbserver-nb\") pod \"dnsmasq-dns-55b99bf79c-xbwtf\" (UID: \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\") " pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.843090 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-dns-swift-storage-0\") pod \"dnsmasq-dns-55b99bf79c-xbwtf\" (UID: \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\") " pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.843117 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thc66\" (UniqueName: \"kubernetes.io/projected/a43020f9-02f4-4d52-b00f-74fd6da4e46b-kube-api-access-thc66\") pod \"dnsmasq-dns-55b99bf79c-xbwtf\" (UID: \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\") " pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.843174 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-ovsdbserver-sb\") pod \"dnsmasq-dns-55b99bf79c-xbwtf\" (UID: \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\") " pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.844048 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-ovsdbserver-sb\") pod \"dnsmasq-dns-55b99bf79c-xbwtf\" (UID: \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\") " pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.844555 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-config\") pod \"dnsmasq-dns-55b99bf79c-xbwtf\" (UID: \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\") " pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.845066 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-dns-svc\") pod \"dnsmasq-dns-55b99bf79c-xbwtf\" (UID: \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\") " pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" Oct 12 05:58:16 crc 
kubenswrapper[4930]: I1012 05:58:16.845611 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-ovsdbserver-nb\") pod \"dnsmasq-dns-55b99bf79c-xbwtf\" (UID: \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\") " pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.846186 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-dns-swift-storage-0\") pod \"dnsmasq-dns-55b99bf79c-xbwtf\" (UID: \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\") " pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.872618 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thc66\" (UniqueName: \"kubernetes.io/projected/a43020f9-02f4-4d52-b00f-74fd6da4e46b-kube-api-access-thc66\") pod \"dnsmasq-dns-55b99bf79c-xbwtf\" (UID: \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\") " pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.921781 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.943967 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59hbj\" (UniqueName: \"kubernetes.io/projected/678827f4-28a3-4cb5-9886-96e11e89c172-kube-api-access-59hbj\") pod \"678827f4-28a3-4cb5-9886-96e11e89c172\" (UID: \"678827f4-28a3-4cb5-9886-96e11e89c172\") " Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.944076 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqmg9\" (UniqueName: \"kubernetes.io/projected/f9377945-1809-42da-b3c2-f38d3d91a1a6-kube-api-access-cqmg9\") pod \"f9377945-1809-42da-b3c2-f38d3d91a1a6\" (UID: \"f9377945-1809-42da-b3c2-f38d3d91a1a6\") " Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.944209 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkntf\" (UniqueName: \"kubernetes.io/projected/f5a8fa49-dfc5-4c99-b1b1-fc2795013a1c-kube-api-access-wkntf\") pod \"f5a8fa49-dfc5-4c99-b1b1-fc2795013a1c\" (UID: \"f5a8fa49-dfc5-4c99-b1b1-fc2795013a1c\") " Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.948006 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9377945-1809-42da-b3c2-f38d3d91a1a6-kube-api-access-cqmg9" (OuterVolumeSpecName: "kube-api-access-cqmg9") pod "f9377945-1809-42da-b3c2-f38d3d91a1a6" (UID: "f9377945-1809-42da-b3c2-f38d3d91a1a6"). InnerVolumeSpecName "kube-api-access-cqmg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.948048 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/678827f4-28a3-4cb5-9886-96e11e89c172-kube-api-access-59hbj" (OuterVolumeSpecName: "kube-api-access-59hbj") pod "678827f4-28a3-4cb5-9886-96e11e89c172" (UID: "678827f4-28a3-4cb5-9886-96e11e89c172"). InnerVolumeSpecName "kube-api-access-59hbj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:58:16 crc kubenswrapper[4930]: I1012 05:58:16.948775 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5a8fa49-dfc5-4c99-b1b1-fc2795013a1c-kube-api-access-wkntf" (OuterVolumeSpecName: "kube-api-access-wkntf") pod "f5a8fa49-dfc5-4c99-b1b1-fc2795013a1c" (UID: "f5a8fa49-dfc5-4c99-b1b1-fc2795013a1c"). InnerVolumeSpecName "kube-api-access-wkntf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:58:17 crc kubenswrapper[4930]: I1012 05:58:17.045971 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59hbj\" (UniqueName: \"kubernetes.io/projected/678827f4-28a3-4cb5-9886-96e11e89c172-kube-api-access-59hbj\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:17 crc kubenswrapper[4930]: I1012 05:58:17.046360 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqmg9\" (UniqueName: \"kubernetes.io/projected/f9377945-1809-42da-b3c2-f38d3d91a1a6-kube-api-access-cqmg9\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:17 crc kubenswrapper[4930]: I1012 05:58:17.046374 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkntf\" (UniqueName: \"kubernetes.io/projected/f5a8fa49-dfc5-4c99-b1b1-fc2795013a1c-kube-api-access-wkntf\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:17 crc kubenswrapper[4930]: I1012 05:58:17.217143 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a944-account-create-nrsz7" event={"ID":"f9377945-1809-42da-b3c2-f38d3d91a1a6","Type":"ContainerDied","Data":"272d4a2921404cf21f7d8fd2ca6c05a82e3aac02dc711ad0b97d7c107fc14097"} Oct 12 05:58:17 crc kubenswrapper[4930]: I1012 05:58:17.217866 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="272d4a2921404cf21f7d8fd2ca6c05a82e3aac02dc711ad0b97d7c107fc14097" Oct 12 05:58:17 crc kubenswrapper[4930]: I1012 05:58:17.217394 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a944-account-create-nrsz7" Oct 12 05:58:17 crc kubenswrapper[4930]: I1012 05:58:17.218870 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5ea2-account-create-5srb4" event={"ID":"678827f4-28a3-4cb5-9886-96e11e89c172","Type":"ContainerDied","Data":"5e4c9b826d64d669ad077316f58b190ed122fbe403be092defa0228dd6d880eb"} Oct 12 05:58:17 crc kubenswrapper[4930]: I1012 05:58:17.218914 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e4c9b826d64d669ad077316f58b190ed122fbe403be092defa0228dd6d880eb" Oct 12 05:58:17 crc kubenswrapper[4930]: I1012 05:58:17.218991 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5ea2-account-create-5srb4" Oct 12 05:58:17 crc kubenswrapper[4930]: I1012 05:58:17.220486 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8362-account-create-r4fxc" event={"ID":"f5a8fa49-dfc5-4c99-b1b1-fc2795013a1c","Type":"ContainerDied","Data":"14b5d5eaa1737e1d8c54e6398a58bc55784866581fcbac6f7229a6b62fb9767f"} Oct 12 05:58:17 crc kubenswrapper[4930]: I1012 05:58:17.220525 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14b5d5eaa1737e1d8c54e6398a58bc55784866581fcbac6f7229a6b62fb9767f" Oct 12 05:58:17 crc kubenswrapper[4930]: I1012 05:58:17.220948 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8362-account-create-r4fxc" Oct 12 05:58:17 crc kubenswrapper[4930]: I1012 05:58:17.336207 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55b99bf79c-xbwtf"] Oct 12 05:58:17 crc kubenswrapper[4930]: W1012 05:58:17.347791 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda43020f9_02f4_4d52_b00f_74fd6da4e46b.slice/crio-282c4e6ca435fa75d2920a7b597dbd7819bb076f08f5fcf008acb02b9c845af9 WatchSource:0}: Error finding container 282c4e6ca435fa75d2920a7b597dbd7819bb076f08f5fcf008acb02b9c845af9: Status 404 returned error can't find the container with id 282c4e6ca435fa75d2920a7b597dbd7819bb076f08f5fcf008acb02b9c845af9 Oct 12 05:58:17 crc kubenswrapper[4930]: I1012 05:58:17.490034 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b9d1-account-create-np4q6" Oct 12 05:58:17 crc kubenswrapper[4930]: I1012 05:58:17.661909 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvnnr\" (UniqueName: \"kubernetes.io/projected/38bf3863-9d2b-4158-afb6-e1feb4963f15-kube-api-access-qvnnr\") pod \"38bf3863-9d2b-4158-afb6-e1feb4963f15\" (UID: \"38bf3863-9d2b-4158-afb6-e1feb4963f15\") " Oct 12 05:58:17 crc kubenswrapper[4930]: I1012 05:58:17.670754 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38bf3863-9d2b-4158-afb6-e1feb4963f15-kube-api-access-qvnnr" (OuterVolumeSpecName: "kube-api-access-qvnnr") pod "38bf3863-9d2b-4158-afb6-e1feb4963f15" (UID: "38bf3863-9d2b-4158-afb6-e1feb4963f15"). InnerVolumeSpecName "kube-api-access-qvnnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:58:17 crc kubenswrapper[4930]: I1012 05:58:17.765386 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvnnr\" (UniqueName: \"kubernetes.io/projected/38bf3863-9d2b-4158-afb6-e1feb4963f15-kube-api-access-qvnnr\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:18 crc kubenswrapper[4930]: I1012 05:58:18.233840 4930 generic.go:334] "Generic (PLEG): container finished" podID="96917cd8-6c91-463b-b8de-9d854e0ee581" containerID="3052de8a3e627f0cfd9983c59ff403b54f3820761a61c595ef8caee6a1314cef" exitCode=0 Oct 12 05:58:18 crc kubenswrapper[4930]: I1012 05:58:18.233940 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-wqxq9" event={"ID":"96917cd8-6c91-463b-b8de-9d854e0ee581","Type":"ContainerDied","Data":"3052de8a3e627f0cfd9983c59ff403b54f3820761a61c595ef8caee6a1314cef"} Oct 12 05:58:18 crc kubenswrapper[4930]: I1012 05:58:18.237520 4930 generic.go:334] "Generic (PLEG): container finished" podID="a43020f9-02f4-4d52-b00f-74fd6da4e46b" containerID="d01e5938047a270c554579961f7a5bf6b9042ef286e598a049e951cd5dfc957c" exitCode=0 Oct 12 05:58:18 crc kubenswrapper[4930]: I1012 05:58:18.237651 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" event={"ID":"a43020f9-02f4-4d52-b00f-74fd6da4e46b","Type":"ContainerDied","Data":"d01e5938047a270c554579961f7a5bf6b9042ef286e598a049e951cd5dfc957c"} Oct 12 05:58:18 crc kubenswrapper[4930]: I1012 05:58:18.237714 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" event={"ID":"a43020f9-02f4-4d52-b00f-74fd6da4e46b","Type":"ContainerStarted","Data":"282c4e6ca435fa75d2920a7b597dbd7819bb076f08f5fcf008acb02b9c845af9"} Oct 12 05:58:18 crc 
kubenswrapper[4930]: I1012 05:58:18.240192 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9d1-account-create-np4q6" event={"ID":"38bf3863-9d2b-4158-afb6-e1feb4963f15","Type":"ContainerDied","Data":"5edc2ca24f7555be5ae806d6a71af41b96a8ecc0bc56d104bbb7b63241138cd8"} Oct 12 05:58:18 crc kubenswrapper[4930]: I1012 05:58:18.240251 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5edc2ca24f7555be5ae806d6a71af41b96a8ecc0bc56d104bbb7b63241138cd8" Oct 12 05:58:18 crc kubenswrapper[4930]: I1012 05:58:18.240333 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b9d1-account-create-np4q6" Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.256827 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" event={"ID":"a43020f9-02f4-4d52-b00f-74fd6da4e46b","Type":"ContainerStarted","Data":"e44647d19ab751b3e2adbe990f4423e7c2122dd92fb056ee1ca8360d8a58997d"} Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.257392 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.259684 4930 generic.go:334] "Generic (PLEG): container finished" podID="b9bb388c-0196-4f0c-9567-2ef5b30889dd" containerID="08ce7ca7283a64b89f5cd5b7d08d7e47145a7c445f078df713a61e6404c9cb09" exitCode=0 Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.259790 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wndmm" event={"ID":"b9bb388c-0196-4f0c-9567-2ef5b30889dd","Type":"ContainerDied","Data":"08ce7ca7283a64b89f5cd5b7d08d7e47145a7c445f078df713a61e6404c9cb09"} Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.299566 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" podStartSLOduration=3.299544214 podStartE2EDuration="3.299544214s" podCreationTimestamp="2025-10-12 05:58:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:58:19.294140409 +0000 UTC m=+1031.836242244" watchObservedRunningTime="2025-10-12 05:58:19.299544214 +0000 UTC m=+1031.841646019" Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.820384 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-wqxq9" Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.896007 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-jj2m5"] Oct 12 05:58:19 crc kubenswrapper[4930]: E1012 05:58:19.896996 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5a8fa49-dfc5-4c99-b1b1-fc2795013a1c" containerName="mariadb-account-create" Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.897024 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5a8fa49-dfc5-4c99-b1b1-fc2795013a1c" containerName="mariadb-account-create" Oct 12 05:58:19 crc kubenswrapper[4930]: E1012 05:58:19.897085 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="678827f4-28a3-4cb5-9886-96e11e89c172" containerName="mariadb-account-create" Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.897096 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="678827f4-28a3-4cb5-9886-96e11e89c172" containerName="mariadb-account-create" Oct 12 05:58:19 crc kubenswrapper[4930]: E1012 05:58:19.897127 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9377945-1809-42da-b3c2-f38d3d91a1a6" containerName="mariadb-account-create" Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.897137 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9377945-1809-42da-b3c2-f38d3d91a1a6" containerName="mariadb-account-create" Oct 12 05:58:19 crc kubenswrapper[4930]: E1012 05:58:19.897159 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96917cd8-6c91-463b-b8de-9d854e0ee581" containerName="watcher-db-sync" Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.897168 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="96917cd8-6c91-463b-b8de-9d854e0ee581" containerName="watcher-db-sync" Oct 12 05:58:19 crc kubenswrapper[4930]: E1012 05:58:19.897182 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38bf3863-9d2b-4158-afb6-e1feb4963f15" containerName="mariadb-account-create" Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.897190 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="38bf3863-9d2b-4158-afb6-e1feb4963f15" containerName="mariadb-account-create" Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.897594 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="678827f4-28a3-4cb5-9886-96e11e89c172" containerName="mariadb-account-create" Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.897623 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="38bf3863-9d2b-4158-afb6-e1feb4963f15" containerName="mariadb-account-create" Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.897645 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="96917cd8-6c91-463b-b8de-9d854e0ee581" containerName="watcher-db-sync" Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.897660 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9377945-1809-42da-b3c2-f38d3d91a1a6" containerName="mariadb-account-create" Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.897691 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5a8fa49-dfc5-4c99-b1b1-fc2795013a1c" containerName="mariadb-account-create" Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.898718 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-jj2m5" Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.903037 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6jg95" Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.903449 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.916682 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96917cd8-6c91-463b-b8de-9d854e0ee581-combined-ca-bundle\") pod \"96917cd8-6c91-463b-b8de-9d854e0ee581\" (UID: \"96917cd8-6c91-463b-b8de-9d854e0ee581\") " Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.916820 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d89gg\" (UniqueName: \"kubernetes.io/projected/96917cd8-6c91-463b-b8de-9d854e0ee581-kube-api-access-d89gg\") pod \"96917cd8-6c91-463b-b8de-9d854e0ee581\" (UID: \"96917cd8-6c91-463b-b8de-9d854e0ee581\") " Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.916956 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96917cd8-6c91-463b-b8de-9d854e0ee581-db-sync-config-data\") pod \"96917cd8-6c91-463b-b8de-9d854e0ee581\" (UID: \"96917cd8-6c91-463b-b8de-9d854e0ee581\") " Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.916980 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96917cd8-6c91-463b-b8de-9d854e0ee581-config-data\") pod \"96917cd8-6c91-463b-b8de-9d854e0ee581\" (UID: \"96917cd8-6c91-463b-b8de-9d854e0ee581\") " Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.917856 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/448db83a-c0af-4680-890f-24b8d8da1088-config-data\") pod \"glance-db-sync-jj2m5\" (UID: \"448db83a-c0af-4680-890f-24b8d8da1088\") " pod="openstack/glance-db-sync-jj2m5" Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.917886 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr9gg\" (UniqueName: \"kubernetes.io/projected/448db83a-c0af-4680-890f-24b8d8da1088-kube-api-access-lr9gg\") pod \"glance-db-sync-jj2m5\" (UID: \"448db83a-c0af-4680-890f-24b8d8da1088\") " pod="openstack/glance-db-sync-jj2m5" Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.918351 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/448db83a-c0af-4680-890f-24b8d8da1088-db-sync-config-data\") pod \"glance-db-sync-jj2m5\" (UID: \"448db83a-c0af-4680-890f-24b8d8da1088\") " pod="openstack/glance-db-sync-jj2m5" Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.918550 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/448db83a-c0af-4680-890f-24b8d8da1088-combined-ca-bundle\") pod \"glance-db-sync-jj2m5\" (UID: \"448db83a-c0af-4680-890f-24b8d8da1088\") " pod="openstack/glance-db-sync-jj2m5" Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.922005 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-db-sync-jj2m5"] Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.934504 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96917cd8-6c91-463b-b8de-9d854e0ee581-kube-api-access-d89gg" (OuterVolumeSpecName: "kube-api-access-d89gg") pod "96917cd8-6c91-463b-b8de-9d854e0ee581" (UID: "96917cd8-6c91-463b-b8de-9d854e0ee581"). InnerVolumeSpecName "kube-api-access-d89gg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.936931 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96917cd8-6c91-463b-b8de-9d854e0ee581-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "96917cd8-6c91-463b-b8de-9d854e0ee581" (UID: "96917cd8-6c91-463b-b8de-9d854e0ee581"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.958568 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96917cd8-6c91-463b-b8de-9d854e0ee581-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96917cd8-6c91-463b-b8de-9d854e0ee581" (UID: "96917cd8-6c91-463b-b8de-9d854e0ee581"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:58:19 crc kubenswrapper[4930]: I1012 05:58:19.996298 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96917cd8-6c91-463b-b8de-9d854e0ee581-config-data" (OuterVolumeSpecName: "config-data") pod "96917cd8-6c91-463b-b8de-9d854e0ee581" (UID: "96917cd8-6c91-463b-b8de-9d854e0ee581"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:58:20 crc kubenswrapper[4930]: I1012 05:58:20.020571 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/448db83a-c0af-4680-890f-24b8d8da1088-db-sync-config-data\") pod \"glance-db-sync-jj2m5\" (UID: \"448db83a-c0af-4680-890f-24b8d8da1088\") " pod="openstack/glance-db-sync-jj2m5" Oct 12 05:58:20 crc kubenswrapper[4930]: I1012 05:58:20.020612 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/448db83a-c0af-4680-890f-24b8d8da1088-config-data\") pod \"glance-db-sync-jj2m5\" (UID: \"448db83a-c0af-4680-890f-24b8d8da1088\") " pod="openstack/glance-db-sync-jj2m5" Oct 12 05:58:20 crc kubenswrapper[4930]: I1012 05:58:20.020630 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr9gg\" (UniqueName: \"kubernetes.io/projected/448db83a-c0af-4680-890f-24b8d8da1088-kube-api-access-lr9gg\") pod \"glance-db-sync-jj2m5\" (UID: \"448db83a-c0af-4680-890f-24b8d8da1088\") " pod="openstack/glance-db-sync-jj2m5" Oct 12 05:58:20 crc kubenswrapper[4930]: I1012 05:58:20.020689 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/448db83a-c0af-4680-890f-24b8d8da1088-combined-ca-bundle\") pod \"glance-db-sync-jj2m5\" (UID: \"448db83a-c0af-4680-890f-24b8d8da1088\") " pod="openstack/glance-db-sync-jj2m5" Oct 12 05:58:20 crc kubenswrapper[4930]: I1012 05:58:20.020835 4930 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/96917cd8-6c91-463b-b8de-9d854e0ee581-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:20 crc kubenswrapper[4930]: I1012 05:58:20.020847 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96917cd8-6c91-463b-b8de-9d854e0ee581-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:20 crc kubenswrapper[4930]: I1012 05:58:20.020856 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96917cd8-6c91-463b-b8de-9d854e0ee581-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:20 crc kubenswrapper[4930]: I1012 05:58:20.020865 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d89gg\" (UniqueName: \"kubernetes.io/projected/96917cd8-6c91-463b-b8de-9d854e0ee581-kube-api-access-d89gg\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:20 crc kubenswrapper[4930]: I1012 05:58:20.024654 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/448db83a-c0af-4680-890f-24b8d8da1088-combined-ca-bundle\") pod \"glance-db-sync-jj2m5\" (UID: \"448db83a-c0af-4680-890f-24b8d8da1088\") " pod="openstack/glance-db-sync-jj2m5" Oct 12 05:58:20 crc kubenswrapper[4930]: I1012 05:58:20.024985 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/448db83a-c0af-4680-890f-24b8d8da1088-db-sync-config-data\") pod \"glance-db-sync-jj2m5\" (UID: \"448db83a-c0af-4680-890f-24b8d8da1088\") " pod="openstack/glance-db-sync-jj2m5" Oct 12 05:58:20 crc kubenswrapper[4930]: I1012 05:58:20.027713 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/448db83a-c0af-4680-890f-24b8d8da1088-config-data\") pod \"glance-db-sync-jj2m5\" (UID: \"448db83a-c0af-4680-890f-24b8d8da1088\") " pod="openstack/glance-db-sync-jj2m5" Oct 12 05:58:20 crc kubenswrapper[4930]: I1012 05:58:20.042597 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr9gg\" (UniqueName: \"kubernetes.io/projected/448db83a-c0af-4680-890f-24b8d8da1088-kube-api-access-lr9gg\") pod \"glance-db-sync-jj2m5\" (UID: \"448db83a-c0af-4680-890f-24b8d8da1088\") " pod="openstack/glance-db-sync-jj2m5" Oct 12 05:58:20 crc kubenswrapper[4930]: I1012 05:58:20.227520 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-jj2m5" Oct 12 05:58:20 crc kubenswrapper[4930]: I1012 05:58:20.297533 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-wqxq9" Oct 12 05:58:20 crc kubenswrapper[4930]: I1012 05:58:20.298160 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-wqxq9" event={"ID":"96917cd8-6c91-463b-b8de-9d854e0ee581","Type":"ContainerDied","Data":"9c4c94360cafa338e010f1d05de7087d0a6b99f1b7ebe313f90b4fb117d5ce62"} Oct 12 05:58:20 crc kubenswrapper[4930]: I1012 05:58:20.298181 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c4c94360cafa338e010f1d05de7087d0a6b99f1b7ebe313f90b4fb117d5ce62" Oct 12 05:58:20 crc kubenswrapper[4930]: I1012 05:58:20.770206 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-wndmm" Oct 12 05:58:20 crc kubenswrapper[4930]: I1012 05:58:20.836261 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9bb388c-0196-4f0c-9567-2ef5b30889dd-config-data\") pod \"b9bb388c-0196-4f0c-9567-2ef5b30889dd\" (UID: \"b9bb388c-0196-4f0c-9567-2ef5b30889dd\") " Oct 12 05:58:20 crc kubenswrapper[4930]: I1012 05:58:20.836581 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4clt2\" (UniqueName: \"kubernetes.io/projected/b9bb388c-0196-4f0c-9567-2ef5b30889dd-kube-api-access-4clt2\") pod \"b9bb388c-0196-4f0c-9567-2ef5b30889dd\" (UID: \"b9bb388c-0196-4f0c-9567-2ef5b30889dd\") " Oct 12 05:58:20 crc kubenswrapper[4930]: I1012 05:58:20.836822 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9bb388c-0196-4f0c-9567-2ef5b30889dd-combined-ca-bundle\") pod \"b9bb388c-0196-4f0c-9567-2ef5b30889dd\" (UID: \"b9bb388c-0196-4f0c-9567-2ef5b30889dd\") " Oct 12 05:58:20 crc kubenswrapper[4930]: I1012 05:58:20.845567 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9bb388c-0196-4f0c-9567-2ef5b30889dd-kube-api-access-4clt2" (OuterVolumeSpecName: "kube-api-access-4clt2") pod "b9bb388c-0196-4f0c-9567-2ef5b30889dd" (UID: "b9bb388c-0196-4f0c-9567-2ef5b30889dd"). InnerVolumeSpecName "kube-api-access-4clt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:58:20 crc kubenswrapper[4930]: I1012 05:58:20.916036 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9bb388c-0196-4f0c-9567-2ef5b30889dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9bb388c-0196-4f0c-9567-2ef5b30889dd" (UID: "b9bb388c-0196-4f0c-9567-2ef5b30889dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:58:20 crc kubenswrapper[4930]: I1012 05:58:20.945730 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4clt2\" (UniqueName: \"kubernetes.io/projected/b9bb388c-0196-4f0c-9567-2ef5b30889dd-kube-api-access-4clt2\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:20 crc kubenswrapper[4930]: I1012 05:58:20.945775 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9bb388c-0196-4f0c-9567-2ef5b30889dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:20 crc kubenswrapper[4930]: I1012 05:58:20.988429 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9bb388c-0196-4f0c-9567-2ef5b30889dd-config-data" (OuterVolumeSpecName: "config-data") pod "b9bb388c-0196-4f0c-9567-2ef5b30889dd" (UID: "b9bb388c-0196-4f0c-9567-2ef5b30889dd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.026249 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-jj2m5"] Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.047453 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9bb388c-0196-4f0c-9567-2ef5b30889dd-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.307150 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jj2m5" event={"ID":"448db83a-c0af-4680-890f-24b8d8da1088","Type":"ContainerStarted","Data":"9a267634c47516a8610d7b803a05e7cf1334339d9b2bd497dc7a1bd3ba6c4736"} Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.310412 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wndmm" event={"ID":"b9bb388c-0196-4f0c-9567-2ef5b30889dd","Type":"ContainerDied","Data":"013acc0458f454f4abcfd0e868c4d08fd13fef15d0e56b40389180dc871c6bac"} Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.310457 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="013acc0458f454f4abcfd0e868c4d08fd13fef15d0e56b40389180dc871c6bac" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.310508 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wndmm" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.561786 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55b99bf79c-xbwtf"] Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.562383 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" podUID="a43020f9-02f4-4d52-b00f-74fd6da4e46b" containerName="dnsmasq-dns" containerID="cri-o://e44647d19ab751b3e2adbe990f4423e7c2122dd92fb056ee1ca8360d8a58997d" gracePeriod=10 Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.579351 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-qfmkt"] Oct 12 05:58:21 crc kubenswrapper[4930]: E1012 05:58:21.579861 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9bb388c-0196-4f0c-9567-2ef5b30889dd" containerName="keystone-db-sync" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.579879 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9bb388c-0196-4f0c-9567-2ef5b30889dd" containerName="keystone-db-sync" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.580091 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9bb388c-0196-4f0c-9567-2ef5b30889dd" containerName="keystone-db-sync" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.580714 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qfmkt" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.587719 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.588233 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.588330 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.588756 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-hkmzw" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.589997 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qfmkt"] Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.640872 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58bbf48b7f-dlk7f"] Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.642682 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58bbf48b7f-dlk7f" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.691143 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58bbf48b7f-dlk7f"] Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.715092 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.716172 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.723202 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-bcmxb" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.723376 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.731355 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.762539 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-fernet-keys\") pod \"keystone-bootstrap-qfmkt\" (UID: \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\") " pod="openstack/keystone-bootstrap-qfmkt" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.762602 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-credential-keys\") pod \"keystone-bootstrap-qfmkt\" (UID: \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\") " pod="openstack/keystone-bootstrap-qfmkt" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.762637 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxwkg\" (UniqueName: \"kubernetes.io/projected/ce5f0598-e80c-49d3-bd51-3bbd167395db-kube-api-access-lxwkg\") pod \"dnsmasq-dns-58bbf48b7f-dlk7f\" (UID: \"ce5f0598-e80c-49d3-bd51-3bbd167395db\") " pod="openstack/dnsmasq-dns-58bbf48b7f-dlk7f" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.762661 4930 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-ovsdbserver-nb\") pod \"dnsmasq-dns-58bbf48b7f-dlk7f\" (UID: \"ce5f0598-e80c-49d3-bd51-3bbd167395db\") " pod="openstack/dnsmasq-dns-58bbf48b7f-dlk7f" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.762684 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-config-data\") pod \"keystone-bootstrap-qfmkt\" (UID: \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\") " pod="openstack/keystone-bootstrap-qfmkt" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.762714 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmchd\" (UniqueName: \"kubernetes.io/projected/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-kube-api-access-lmchd\") pod \"keystone-bootstrap-qfmkt\" (UID: \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\") " pod="openstack/keystone-bootstrap-qfmkt" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.762747 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-dns-svc\") pod \"dnsmasq-dns-58bbf48b7f-dlk7f\" (UID: \"ce5f0598-e80c-49d3-bd51-3bbd167395db\") " pod="openstack/dnsmasq-dns-58bbf48b7f-dlk7f" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.762803 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-config\") pod \"dnsmasq-dns-58bbf48b7f-dlk7f\" (UID: \"ce5f0598-e80c-49d3-bd51-3bbd167395db\") " pod="openstack/dnsmasq-dns-58bbf48b7f-dlk7f" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.762825 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-ovsdbserver-sb\") pod \"dnsmasq-dns-58bbf48b7f-dlk7f\" (UID: \"ce5f0598-e80c-49d3-bd51-3bbd167395db\") " pod="openstack/dnsmasq-dns-58bbf48b7f-dlk7f" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.762844 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-combined-ca-bundle\") pod \"keystone-bootstrap-qfmkt\" (UID: \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\") " pod="openstack/keystone-bootstrap-qfmkt" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.762873 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-scripts\") pod \"keystone-bootstrap-qfmkt\" (UID: \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\") " pod="openstack/keystone-bootstrap-qfmkt" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.762903 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-dns-swift-storage-0\") pod \"dnsmasq-dns-58bbf48b7f-dlk7f\" (UID: \"ce5f0598-e80c-49d3-bd51-3bbd167395db\") " pod="openstack/dnsmasq-dns-58bbf48b7f-dlk7f" Oct 12 05:58:21 crc 
kubenswrapper[4930]: I1012 05:58:21.765079 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.766228 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.769164 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.773879 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.790829 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-84fd96956f-5sqc8"] Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.792291 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84fd96956f-5sqc8" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.796921 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.796987 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.797120 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.797153 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-6dffg" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.813524 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.814960 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.818892 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.833052 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.852136 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84fd96956f-5sqc8"] Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.897871 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvck7\" (UniqueName: \"kubernetes.io/projected/dda9904e-b764-43a5-83d2-5c993023f740-kube-api-access-mvck7\") pod \"watcher-decision-engine-0\" (UID: \"dda9904e-b764-43a5-83d2-5c993023f740\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.898968 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27cst\" (UniqueName: \"kubernetes.io/projected/934c43b8-93c5-442d-ba52-2e3bfc2d8f35-kube-api-access-27cst\") pod \"watcher-applier-0\" (UID: \"934c43b8-93c5-442d-ba52-2e3bfc2d8f35\") " pod="openstack/watcher-applier-0" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.899131 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-config\") pod \"dnsmasq-dns-58bbf48b7f-dlk7f\" (UID: \"ce5f0598-e80c-49d3-bd51-3bbd167395db\") " pod="openstack/dnsmasq-dns-58bbf48b7f-dlk7f" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.899213 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dda9904e-b764-43a5-83d2-5c993023f740-logs\") pod \"watcher-decision-engine-0\" (UID: \"dda9904e-b764-43a5-83d2-5c993023f740\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.900050 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-ovsdbserver-sb\") pod \"dnsmasq-dns-58bbf48b7f-dlk7f\" (UID: \"ce5f0598-e80c-49d3-bd51-3bbd167395db\") " pod="openstack/dnsmasq-dns-58bbf48b7f-dlk7f" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.900155 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-combined-ca-bundle\") pod \"keystone-bootstrap-qfmkt\" (UID: \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\") " pod="openstack/keystone-bootstrap-qfmkt" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.900228 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/934c43b8-93c5-442d-ba52-2e3bfc2d8f35-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"934c43b8-93c5-442d-ba52-2e3bfc2d8f35\") " pod="openstack/watcher-applier-0" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.900328 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-scripts\") pod \"keystone-bootstrap-qfmkt\" (UID: 
\"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\") " pod="openstack/keystone-bootstrap-qfmkt" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.900458 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dda9904e-b764-43a5-83d2-5c993023f740-config-data\") pod \"watcher-decision-engine-0\" (UID: \"dda9904e-b764-43a5-83d2-5c993023f740\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.900526 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-dns-swift-storage-0\") pod \"dnsmasq-dns-58bbf48b7f-dlk7f\" (UID: \"ce5f0598-e80c-49d3-bd51-3bbd167395db\") " pod="openstack/dnsmasq-dns-58bbf48b7f-dlk7f" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.900632 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dda9904e-b764-43a5-83d2-5c993023f740-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"dda9904e-b764-43a5-83d2-5c993023f740\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.902846 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-fernet-keys\") pod \"keystone-bootstrap-qfmkt\" (UID: \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\") " pod="openstack/keystone-bootstrap-qfmkt" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.903043 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-credential-keys\") pod \"keystone-bootstrap-qfmkt\" (UID: \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\") " pod="openstack/keystone-bootstrap-qfmkt" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.903129 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxwkg\" (UniqueName: \"kubernetes.io/projected/ce5f0598-e80c-49d3-bd51-3bbd167395db-kube-api-access-lxwkg\") pod \"dnsmasq-dns-58bbf48b7f-dlk7f\" (UID: \"ce5f0598-e80c-49d3-bd51-3bbd167395db\") " pod="openstack/dnsmasq-dns-58bbf48b7f-dlk7f" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.903212 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-ovsdbserver-nb\") pod \"dnsmasq-dns-58bbf48b7f-dlk7f\" (UID: \"ce5f0598-e80c-49d3-bd51-3bbd167395db\") " pod="openstack/dnsmasq-dns-58bbf48b7f-dlk7f" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.903343 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-config-data\") pod \"keystone-bootstrap-qfmkt\" (UID: \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\") " pod="openstack/keystone-bootstrap-qfmkt" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.903482 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/934c43b8-93c5-442d-ba52-2e3bfc2d8f35-logs\") pod \"watcher-applier-0\" (UID: \"934c43b8-93c5-442d-ba52-2e3bfc2d8f35\") " 
pod="openstack/watcher-applier-0" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.903587 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmchd\" (UniqueName: \"kubernetes.io/projected/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-kube-api-access-lmchd\") pod \"keystone-bootstrap-qfmkt\" (UID: \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\") " pod="openstack/keystone-bootstrap-qfmkt" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.903730 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-dns-svc\") pod \"dnsmasq-dns-58bbf48b7f-dlk7f\" (UID: \"ce5f0598-e80c-49d3-bd51-3bbd167395db\") " pod="openstack/dnsmasq-dns-58bbf48b7f-dlk7f" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.903913 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/934c43b8-93c5-442d-ba52-2e3bfc2d8f35-config-data\") pod \"watcher-applier-0\" (UID: \"934c43b8-93c5-442d-ba52-2e3bfc2d8f35\") " pod="openstack/watcher-applier-0" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.904055 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/dda9904e-b764-43a5-83d2-5c993023f740-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"dda9904e-b764-43a5-83d2-5c993023f740\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.908120 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-ovsdbserver-nb\") pod \"dnsmasq-dns-58bbf48b7f-dlk7f\" (UID: \"ce5f0598-e80c-49d3-bd51-3bbd167395db\") " pod="openstack/dnsmasq-dns-58bbf48b7f-dlk7f" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.909313 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-dns-svc\") pod \"dnsmasq-dns-58bbf48b7f-dlk7f\" (UID: \"ce5f0598-e80c-49d3-bd51-3bbd167395db\") " pod="openstack/dnsmasq-dns-58bbf48b7f-dlk7f" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.911766 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-dns-swift-storage-0\") pod \"dnsmasq-dns-58bbf48b7f-dlk7f\" (UID: \"ce5f0598-e80c-49d3-bd51-3bbd167395db\") " pod="openstack/dnsmasq-dns-58bbf48b7f-dlk7f" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.902582 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-config\") pod \"dnsmasq-dns-58bbf48b7f-dlk7f\" (UID: \"ce5f0598-e80c-49d3-bd51-3bbd167395db\") " pod="openstack/dnsmasq-dns-58bbf48b7f-dlk7f" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.901995 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-ovsdbserver-sb\") pod \"dnsmasq-dns-58bbf48b7f-dlk7f\" (UID: \"ce5f0598-e80c-49d3-bd51-3bbd167395db\") " pod="openstack/dnsmasq-dns-58bbf48b7f-dlk7f" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.952297 4930 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmchd\" (UniqueName: \"kubernetes.io/projected/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-kube-api-access-lmchd\") pod \"keystone-bootstrap-qfmkt\" (UID: \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\") " pod="openstack/keystone-bootstrap-qfmkt" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.955859 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-scripts\") pod \"keystone-bootstrap-qfmkt\" (UID: \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\") " pod="openstack/keystone-bootstrap-qfmkt" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.966390 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-config-data\") pod \"keystone-bootstrap-qfmkt\" (UID: \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\") " pod="openstack/keystone-bootstrap-qfmkt" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.976136 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-credential-keys\") pod \"keystone-bootstrap-qfmkt\" (UID: \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\") " pod="openstack/keystone-bootstrap-qfmkt" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.976867 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-fernet-keys\") pod \"keystone-bootstrap-qfmkt\" (UID: \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\") " pod="openstack/keystone-bootstrap-qfmkt" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.992567 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-combined-ca-bundle\") pod \"keystone-bootstrap-qfmkt\" (UID: \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\") " pod="openstack/keystone-bootstrap-qfmkt" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.993757 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-bg9r7"] Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.995209 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-bg9r7" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.998671 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fcjrp" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.999125 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 12 05:58:21 crc kubenswrapper[4930]: I1012 05:58:21.999389 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.009606 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/934c43b8-93c5-442d-ba52-2e3bfc2d8f35-logs\") pod \"watcher-applier-0\" (UID: \"934c43b8-93c5-442d-ba52-2e3bfc2d8f35\") " pod="openstack/watcher-applier-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.010561 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk6hg\" (UniqueName: \"kubernetes.io/projected/b69d2329-3396-4e67-9880-940dacef7e56-kube-api-access-tk6hg\") pod \"horizon-84fd96956f-5sqc8\" (UID: \"b69d2329-3396-4e67-9880-940dacef7e56\") " pod="openstack/horizon-84fd96956f-5sqc8" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.010672 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-config-data\") pod \"watcher-api-0\" (UID: \"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e\") " pod="openstack/watcher-api-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.010801 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/934c43b8-93c5-442d-ba52-2e3bfc2d8f35-config-data\") pod \"watcher-applier-0\" (UID: \"934c43b8-93c5-442d-ba52-2e3bfc2d8f35\") " pod="openstack/watcher-applier-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.011228 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b69d2329-3396-4e67-9880-940dacef7e56-horizon-secret-key\") pod \"horizon-84fd96956f-5sqc8\" (UID: \"b69d2329-3396-4e67-9880-940dacef7e56\") " pod="openstack/horizon-84fd96956f-5sqc8" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.009635 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxwkg\" (UniqueName: \"kubernetes.io/projected/ce5f0598-e80c-49d3-bd51-3bbd167395db-kube-api-access-lxwkg\") pod \"dnsmasq-dns-58bbf48b7f-dlk7f\" (UID: \"ce5f0598-e80c-49d3-bd51-3bbd167395db\") " pod="openstack/dnsmasq-dns-58bbf48b7f-dlk7f" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.010288 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/934c43b8-93c5-442d-ba52-2e3bfc2d8f35-logs\") pod \"watcher-applier-0\" (UID: \"934c43b8-93c5-442d-ba52-2e3bfc2d8f35\") " pod="openstack/watcher-applier-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.011527 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/dda9904e-b764-43a5-83d2-5c993023f740-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"dda9904e-b764-43a5-83d2-5c993023f740\") " 
pod="openstack/watcher-decision-engine-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.011560 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e\") " pod="openstack/watcher-api-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.011673 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvck7\" (UniqueName: \"kubernetes.io/projected/dda9904e-b764-43a5-83d2-5c993023f740-kube-api-access-mvck7\") pod \"watcher-decision-engine-0\" (UID: \"dda9904e-b764-43a5-83d2-5c993023f740\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.011699 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27cst\" (UniqueName: \"kubernetes.io/projected/934c43b8-93c5-442d-ba52-2e3bfc2d8f35-kube-api-access-27cst\") pod \"watcher-applier-0\" (UID: \"934c43b8-93c5-442d-ba52-2e3bfc2d8f35\") " pod="openstack/watcher-applier-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.011759 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b69d2329-3396-4e67-9880-940dacef7e56-logs\") pod \"horizon-84fd96956f-5sqc8\" (UID: \"b69d2329-3396-4e67-9880-940dacef7e56\") " pod="openstack/horizon-84fd96956f-5sqc8" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.011785 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dda9904e-b764-43a5-83d2-5c993023f740-logs\") pod \"watcher-decision-engine-0\" (UID: \"dda9904e-b764-43a5-83d2-5c993023f740\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.011816 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e\") " pod="openstack/watcher-api-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.011847 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/934c43b8-93c5-442d-ba52-2e3bfc2d8f35-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"934c43b8-93c5-442d-ba52-2e3bfc2d8f35\") " pod="openstack/watcher-applier-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.015926 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-logs\") pod \"watcher-api-0\" (UID: \"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e\") " pod="openstack/watcher-api-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.015976 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b69d2329-3396-4e67-9880-940dacef7e56-config-data\") pod \"horizon-84fd96956f-5sqc8\" (UID: \"b69d2329-3396-4e67-9880-940dacef7e56\") " pod="openstack/horizon-84fd96956f-5sqc8" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.016002 4930 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dda9904e-b764-43a5-83d2-5c993023f740-config-data\") pod \"watcher-decision-engine-0\" (UID: \"dda9904e-b764-43a5-83d2-5c993023f740\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.016037 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dda9904e-b764-43a5-83d2-5c993023f740-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"dda9904e-b764-43a5-83d2-5c993023f740\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.016077 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b69d2329-3396-4e67-9880-940dacef7e56-scripts\") pod \"horizon-84fd96956f-5sqc8\" (UID: \"b69d2329-3396-4e67-9880-940dacef7e56\") " pod="openstack/horizon-84fd96956f-5sqc8" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.016158 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmbhd\" (UniqueName: \"kubernetes.io/projected/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-kube-api-access-mmbhd\") pod \"watcher-api-0\" (UID: \"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e\") " pod="openstack/watcher-api-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.012503 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dda9904e-b764-43a5-83d2-5c993023f740-logs\") pod \"watcher-decision-engine-0\" (UID: \"dda9904e-b764-43a5-83d2-5c993023f740\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.021236 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-bg9r7"] Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.024442 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dda9904e-b764-43a5-83d2-5c993023f740-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"dda9904e-b764-43a5-83d2-5c993023f740\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.038492 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/934c43b8-93c5-442d-ba52-2e3bfc2d8f35-config-data\") pod \"watcher-applier-0\" (UID: \"934c43b8-93c5-442d-ba52-2e3bfc2d8f35\") " pod="openstack/watcher-applier-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.048066 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/934c43b8-93c5-442d-ba52-2e3bfc2d8f35-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"934c43b8-93c5-442d-ba52-2e3bfc2d8f35\") " pod="openstack/watcher-applier-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.050513 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dda9904e-b764-43a5-83d2-5c993023f740-config-data\") pod \"watcher-decision-engine-0\" (UID: \"dda9904e-b764-43a5-83d2-5c993023f740\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.056896 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/dda9904e-b764-43a5-83d2-5c993023f740-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"dda9904e-b764-43a5-83d2-5c993023f740\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.068995 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.073322 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.085698 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.086158 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.086722 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvck7\" (UniqueName: \"kubernetes.io/projected/dda9904e-b764-43a5-83d2-5c993023f740-kube-api-access-mvck7\") pod \"watcher-decision-engine-0\" (UID: \"dda9904e-b764-43a5-83d2-5c993023f740\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.094425 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27cst\" (UniqueName: \"kubernetes.io/projected/934c43b8-93c5-442d-ba52-2e3bfc2d8f35-kube-api-access-27cst\") pod \"watcher-applier-0\" (UID: \"934c43b8-93c5-442d-ba52-2e3bfc2d8f35\") " pod="openstack/watcher-applier-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.105919 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.129528 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b69d2329-3396-4e67-9880-940dacef7e56-logs\") pod \"horizon-84fd96956f-5sqc8\" (UID: \"b69d2329-3396-4e67-9880-940dacef7e56\") " pod="openstack/horizon-84fd96956f-5sqc8" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.129580 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66mtr\" (UniqueName: \"kubernetes.io/projected/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-kube-api-access-66mtr\") pod \"cinder-db-sync-bg9r7\" (UID: \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\") " pod="openstack/cinder-db-sync-bg9r7" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.129602 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e\") " pod="openstack/watcher-api-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.129629 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-config-data\") pod \"cinder-db-sync-bg9r7\" (UID: \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\") " pod="openstack/cinder-db-sync-bg9r7" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.129644 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-db-sync-config-data\") pod \"cinder-db-sync-bg9r7\" (UID: \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\") " pod="openstack/cinder-db-sync-bg9r7" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.129661 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") " pod="openstack/ceilometer-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.129680 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-logs\") pod \"watcher-api-0\" (UID: \"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e\") " pod="openstack/watcher-api-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.129702 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b69d2329-3396-4e67-9880-940dacef7e56-config-data\") pod \"horizon-84fd96956f-5sqc8\" (UID: \"b69d2329-3396-4e67-9880-940dacef7e56\") " pod="openstack/horizon-84fd96956f-5sqc8" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.129721 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-etc-machine-id\") pod \"cinder-db-sync-bg9r7\" (UID: \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\") " pod="openstack/cinder-db-sync-bg9r7" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.129800 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-run-httpd\") pod \"ceilometer-0\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") " pod="openstack/ceilometer-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.129829 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b69d2329-3396-4e67-9880-940dacef7e56-scripts\") pod \"horizon-84fd96956f-5sqc8\" (UID: \"b69d2329-3396-4e67-9880-940dacef7e56\") " pod="openstack/horizon-84fd96956f-5sqc8" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.129850 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-scripts\") pod \"ceilometer-0\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") " pod="openstack/ceilometer-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.129874 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmbhd\" (UniqueName: \"kubernetes.io/projected/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-kube-api-access-mmbhd\") pod \"watcher-api-0\" (UID: \"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e\") " pod="openstack/watcher-api-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.129916 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk6hg\" (UniqueName: \"kubernetes.io/projected/b69d2329-3396-4e67-9880-940dacef7e56-kube-api-access-tk6hg\") pod \"horizon-84fd96956f-5sqc8\" (UID: \"b69d2329-3396-4e67-9880-940dacef7e56\") " pod="openstack/horizon-84fd96956f-5sqc8" Oct 12 
05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.129932 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-config-data\") pod \"watcher-api-0\" (UID: \"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e\") " pod="openstack/watcher-api-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.129952 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-config-data\") pod \"ceilometer-0\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") " pod="openstack/ceilometer-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.129966 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b69d2329-3396-4e67-9880-940dacef7e56-horizon-secret-key\") pod \"horizon-84fd96956f-5sqc8\" (UID: \"b69d2329-3396-4e67-9880-940dacef7e56\") " pod="openstack/horizon-84fd96956f-5sqc8" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.129987 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") " pod="openstack/ceilometer-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.130002 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsmjv\" (UniqueName: \"kubernetes.io/projected/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-kube-api-access-fsmjv\") pod \"ceilometer-0\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") " pod="openstack/ceilometer-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.130017 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e\") " pod="openstack/watcher-api-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.130038 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-log-httpd\") pod \"ceilometer-0\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") " pod="openstack/ceilometer-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.130054 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-scripts\") pod \"cinder-db-sync-bg9r7\" (UID: \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\") " pod="openstack/cinder-db-sync-bg9r7" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.130072 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-combined-ca-bundle\") pod \"cinder-db-sync-bg9r7\" (UID: \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\") " pod="openstack/cinder-db-sync-bg9r7" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.130066 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b69d2329-3396-4e67-9880-940dacef7e56-logs\") pod \"horizon-84fd96956f-5sqc8\" (UID: \"b69d2329-3396-4e67-9880-940dacef7e56\") " pod="openstack/horizon-84fd96956f-5sqc8" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.131093 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b69d2329-3396-4e67-9880-940dacef7e56-scripts\") pod \"horizon-84fd96956f-5sqc8\" (UID: \"b69d2329-3396-4e67-9880-940dacef7e56\") " pod="openstack/horizon-84fd96956f-5sqc8" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.131343 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-logs\") pod \"watcher-api-0\" (UID: \"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e\") " pod="openstack/watcher-api-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.138042 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.138536 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e\") " pod="openstack/watcher-api-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.141146 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e\") " pod="openstack/watcher-api-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.141435 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b69d2329-3396-4e67-9880-940dacef7e56-horizon-secret-key\") pod \"horizon-84fd96956f-5sqc8\" (UID: \"b69d2329-3396-4e67-9880-940dacef7e56\") " pod="openstack/horizon-84fd96956f-5sqc8" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.143163 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b69d2329-3396-4e67-9880-940dacef7e56-config-data\") pod \"horizon-84fd96956f-5sqc8\" (UID: \"b69d2329-3396-4e67-9880-940dacef7e56\") " pod="openstack/horizon-84fd96956f-5sqc8" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.146605 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-config-data\") pod \"watcher-api-0\" (UID: \"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e\") " pod="openstack/watcher-api-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.157243 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk6hg\" (UniqueName: \"kubernetes.io/projected/b69d2329-3396-4e67-9880-940dacef7e56-kube-api-access-tk6hg\") pod \"horizon-84fd96956f-5sqc8\" (UID: \"b69d2329-3396-4e67-9880-940dacef7e56\") " pod="openstack/horizon-84fd96956f-5sqc8" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.158222 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmbhd\" (UniqueName: \"kubernetes.io/projected/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-kube-api-access-mmbhd\") pod \"watcher-api-0\" (UID: 
\"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e\") " pod="openstack/watcher-api-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.162044 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.162198 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-dmtxt"] Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.163699 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-dmtxt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.165347 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5rw7t" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.166935 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.174069 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84fd96956f-5sqc8" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.179966 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-dmtxt"] Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.188419 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-96nsw"] Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.189555 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-96nsw" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.194085 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xtxgk" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.194451 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.195092 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.195354 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.195647 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58bbf48b7f-dlk7f"] Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.196299 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58bbf48b7f-dlk7f" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.204718 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-96nsw"] Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.221364 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6df4dd4f95-cwhwt"] Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.222954 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6df4dd4f95-cwhwt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.229634 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6df4dd4f95-cwhwt"] Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.231819 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-log-httpd\") pod \"ceilometer-0\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") " pod="openstack/ceilometer-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.231850 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ce22649f-fe81-4c02-a26a-15e45e306b82-db-sync-config-data\") pod \"barbican-db-sync-dmtxt\" (UID: \"ce22649f-fe81-4c02-a26a-15e45e306b82\") " pod="openstack/barbican-db-sync-dmtxt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.231872 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-scripts\") pod \"cinder-db-sync-bg9r7\" (UID: \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\") " pod="openstack/cinder-db-sync-bg9r7" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.231893 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-combined-ca-bundle\") pod \"cinder-db-sync-bg9r7\" (UID: \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\") " pod="openstack/cinder-db-sync-bg9r7" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.231917 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/995795c9-befc-4ce9-8a38-8791ba628061-config\") pod \"neutron-db-sync-96nsw\" (UID: \"995795c9-befc-4ce9-8a38-8791ba628061\") " pod="openstack/neutron-db-sync-96nsw" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.231950 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce22649f-fe81-4c02-a26a-15e45e306b82-combined-ca-bundle\") pod \"barbican-db-sync-dmtxt\" (UID: \"ce22649f-fe81-4c02-a26a-15e45e306b82\") " pod="openstack/barbican-db-sync-dmtxt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.231974 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66mtr\" (UniqueName: \"kubernetes.io/projected/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-kube-api-access-66mtr\") pod \"cinder-db-sync-bg9r7\" (UID: \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\") " pod="openstack/cinder-db-sync-bg9r7" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.232002 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-config-data\") pod \"cinder-db-sync-bg9r7\" (UID: \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\") " pod="openstack/cinder-db-sync-bg9r7" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.232016 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-db-sync-config-data\") pod \"cinder-db-sync-bg9r7\" (UID: 
\"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\") " pod="openstack/cinder-db-sync-bg9r7" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.232037 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") " pod="openstack/ceilometer-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.232060 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg46m\" (UniqueName: \"kubernetes.io/projected/995795c9-befc-4ce9-8a38-8791ba628061-kube-api-access-jg46m\") pod \"neutron-db-sync-96nsw\" (UID: \"995795c9-befc-4ce9-8a38-8791ba628061\") " pod="openstack/neutron-db-sync-96nsw" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.232078 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-etc-machine-id\") pod \"cinder-db-sync-bg9r7\" (UID: \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\") " pod="openstack/cinder-db-sync-bg9r7" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.232095 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-run-httpd\") pod \"ceilometer-0\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") " pod="openstack/ceilometer-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.232119 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/995795c9-befc-4ce9-8a38-8791ba628061-combined-ca-bundle\") pod \"neutron-db-sync-96nsw\" (UID: \"995795c9-befc-4ce9-8a38-8791ba628061\") " pod="openstack/neutron-db-sync-96nsw" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.232144 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-scripts\") pod \"ceilometer-0\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") " pod="openstack/ceilometer-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.232198 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g882n\" (UniqueName: \"kubernetes.io/projected/ce22649f-fe81-4c02-a26a-15e45e306b82-kube-api-access-g882n\") pod \"barbican-db-sync-dmtxt\" (UID: \"ce22649f-fe81-4c02-a26a-15e45e306b82\") " pod="openstack/barbican-db-sync-dmtxt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.232217 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-config-data\") pod \"ceilometer-0\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") " pod="openstack/ceilometer-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.232238 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") " pod="openstack/ceilometer-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.232254 4930 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-fsmjv\" (UniqueName: \"kubernetes.io/projected/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-kube-api-access-fsmjv\") pod \"ceilometer-0\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") " pod="openstack/ceilometer-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.232304 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-log-httpd\") pod \"ceilometer-0\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") " pod="openstack/ceilometer-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.232497 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-etc-machine-id\") pod \"cinder-db-sync-bg9r7\" (UID: \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\") " pod="openstack/cinder-db-sync-bg9r7" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.232680 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-run-httpd\") pod \"ceilometer-0\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") " pod="openstack/ceilometer-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.236002 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") " pod="openstack/ceilometer-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.240068 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-scripts\") pod \"cinder-db-sync-bg9r7\" (UID: \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\") " pod="openstack/cinder-db-sync-bg9r7" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.241567 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-scripts\") pod \"ceilometer-0\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") " pod="openstack/ceilometer-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.241609 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-578598f949-m7h9x"] Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.245608 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-db-sync-config-data\") pod \"cinder-db-sync-bg9r7\" (UID: \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\") " pod="openstack/cinder-db-sync-bg9r7" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.247010 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-config-data\") pod \"cinder-db-sync-bg9r7\" (UID: \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\") " pod="openstack/cinder-db-sync-bg9r7" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.247510 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-578598f949-m7h9x" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.254979 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-combined-ca-bundle\") pod \"cinder-db-sync-bg9r7\" (UID: \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\") " pod="openstack/cinder-db-sync-bg9r7" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.256729 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") " pod="openstack/ceilometer-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.256950 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-config-data\") pod \"ceilometer-0\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") " pod="openstack/ceilometer-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.257150 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-sdmls"] Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.259573 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-sdmls" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.265223 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-578598f949-m7h9x"] Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.266600 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsmjv\" (UniqueName: \"kubernetes.io/projected/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-kube-api-access-fsmjv\") pod \"ceilometer-0\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") " pod="openstack/ceilometer-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.266642 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.266904 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66mtr\" (UniqueName: \"kubernetes.io/projected/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-kube-api-access-66mtr\") pod \"cinder-db-sync-bg9r7\" (UID: \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\") " pod="openstack/cinder-db-sync-bg9r7" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.267321 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qfmkt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.271341 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.271699 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7gdgz" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.274841 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-sdmls"] Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.333844 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg46m\" (UniqueName: \"kubernetes.io/projected/995795c9-befc-4ce9-8a38-8791ba628061-kube-api-access-jg46m\") pod \"neutron-db-sync-96nsw\" (UID: \"995795c9-befc-4ce9-8a38-8791ba628061\") " pod="openstack/neutron-db-sync-96nsw" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.334112 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/995795c9-befc-4ce9-8a38-8791ba628061-combined-ca-bundle\") pod \"neutron-db-sync-96nsw\" (UID: \"995795c9-befc-4ce9-8a38-8791ba628061\") " pod="openstack/neutron-db-sync-96nsw" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.334143 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-config\") pod \"dnsmasq-dns-578598f949-m7h9x\" (UID: \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\") " pod="openstack/dnsmasq-dns-578598f949-m7h9x" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.334171 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea4a96dd-8928-4248-b294-1b0b6413abef-scripts\") pod \"placement-db-sync-sdmls\" (UID: \"ea4a96dd-8928-4248-b294-1b0b6413abef\") " pod="openstack/placement-db-sync-sdmls" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.334190 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea4a96dd-8928-4248-b294-1b0b6413abef-combined-ca-bundle\") pod \"placement-db-sync-sdmls\" (UID: \"ea4a96dd-8928-4248-b294-1b0b6413abef\") " pod="openstack/placement-db-sync-sdmls" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.334217 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbhtq\" (UniqueName: \"kubernetes.io/projected/ea4a96dd-8928-4248-b294-1b0b6413abef-kube-api-access-mbhtq\") pod \"placement-db-sync-sdmls\" (UID: \"ea4a96dd-8928-4248-b294-1b0b6413abef\") " pod="openstack/placement-db-sync-sdmls" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.334236 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f549aa26-b902-4497-838b-6b80e635897c-scripts\") pod \"horizon-6df4dd4f95-cwhwt\" (UID: \"f549aa26-b902-4497-838b-6b80e635897c\") " pod="openstack/horizon-6df4dd4f95-cwhwt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.334253 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f549aa26-b902-4497-838b-6b80e635897c-logs\") pod \"horizon-6df4dd4f95-cwhwt\" (UID: \"f549aa26-b902-4497-838b-6b80e635897c\") " pod="openstack/horizon-6df4dd4f95-cwhwt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.334271 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-dns-swift-storage-0\") pod \"dnsmasq-dns-578598f949-m7h9x\" (UID: \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\") " pod="openstack/dnsmasq-dns-578598f949-m7h9x" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.334287 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f549aa26-b902-4497-838b-6b80e635897c-config-data\") pod \"horizon-6df4dd4f95-cwhwt\" (UID: \"f549aa26-b902-4497-838b-6b80e635897c\") " pod="openstack/horizon-6df4dd4f95-cwhwt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.334302 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f549aa26-b902-4497-838b-6b80e635897c-horizon-secret-key\") pod \"horizon-6df4dd4f95-cwhwt\" (UID: \"f549aa26-b902-4497-838b-6b80e635897c\") " pod="openstack/horizon-6df4dd4f95-cwhwt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.334320 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea4a96dd-8928-4248-b294-1b0b6413abef-config-data\") pod \"placement-db-sync-sdmls\" (UID: \"ea4a96dd-8928-4248-b294-1b0b6413abef\") " pod="openstack/placement-db-sync-sdmls" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.334339 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-ovsdbserver-nb\") pod \"dnsmasq-dns-578598f949-m7h9x\" (UID: \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\") " pod="openstack/dnsmasq-dns-578598f949-m7h9x" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.334371 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g882n\" (UniqueName: \"kubernetes.io/projected/ce22649f-fe81-4c02-a26a-15e45e306b82-kube-api-access-g882n\") pod \"barbican-db-sync-dmtxt\" (UID: \"ce22649f-fe81-4c02-a26a-15e45e306b82\") " pod="openstack/barbican-db-sync-dmtxt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.334387 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hwnm\" (UniqueName: \"kubernetes.io/projected/48b62183-7a23-4f98-aef5-4b6b6f89bd53-kube-api-access-8hwnm\") pod \"dnsmasq-dns-578598f949-m7h9x\" (UID: \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\") " pod="openstack/dnsmasq-dns-578598f949-m7h9x" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.334408 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-ovsdbserver-sb\") pod \"dnsmasq-dns-578598f949-m7h9x\" (UID: \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\") " pod="openstack/dnsmasq-dns-578598f949-m7h9x" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.334433 4930 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6sgc\" (UniqueName: \"kubernetes.io/projected/f549aa26-b902-4497-838b-6b80e635897c-kube-api-access-z6sgc\") pod \"horizon-6df4dd4f95-cwhwt\" (UID: \"f549aa26-b902-4497-838b-6b80e635897c\") " pod="openstack/horizon-6df4dd4f95-cwhwt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.345847 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/995795c9-befc-4ce9-8a38-8791ba628061-combined-ca-bundle\") pod \"neutron-db-sync-96nsw\" (UID: \"995795c9-befc-4ce9-8a38-8791ba628061\") " pod="openstack/neutron-db-sync-96nsw" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.346181 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ce22649f-fe81-4c02-a26a-15e45e306b82-db-sync-config-data\") pod \"barbican-db-sync-dmtxt\" (UID: \"ce22649f-fe81-4c02-a26a-15e45e306b82\") " pod="openstack/barbican-db-sync-dmtxt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.334456 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ce22649f-fe81-4c02-a26a-15e45e306b82-db-sync-config-data\") pod \"barbican-db-sync-dmtxt\" (UID: \"ce22649f-fe81-4c02-a26a-15e45e306b82\") " pod="openstack/barbican-db-sync-dmtxt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.346764 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/995795c9-befc-4ce9-8a38-8791ba628061-config\") pod \"neutron-db-sync-96nsw\" (UID: \"995795c9-befc-4ce9-8a38-8791ba628061\") " pod="openstack/neutron-db-sync-96nsw" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.346804 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-dns-svc\") pod \"dnsmasq-dns-578598f949-m7h9x\" (UID: \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\") " pod="openstack/dnsmasq-dns-578598f949-m7h9x" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.346835 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea4a96dd-8928-4248-b294-1b0b6413abef-logs\") pod \"placement-db-sync-sdmls\" (UID: \"ea4a96dd-8928-4248-b294-1b0b6413abef\") " pod="openstack/placement-db-sync-sdmls" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.346897 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce22649f-fe81-4c02-a26a-15e45e306b82-combined-ca-bundle\") pod \"barbican-db-sync-dmtxt\" (UID: \"ce22649f-fe81-4c02-a26a-15e45e306b82\") " pod="openstack/barbican-db-sync-dmtxt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.351497 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/995795c9-befc-4ce9-8a38-8791ba628061-config\") pod \"neutron-db-sync-96nsw\" (UID: \"995795c9-befc-4ce9-8a38-8791ba628061\") " pod="openstack/neutron-db-sync-96nsw" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.355961 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g882n\" (UniqueName: 
\"kubernetes.io/projected/ce22649f-fe81-4c02-a26a-15e45e306b82-kube-api-access-g882n\") pod \"barbican-db-sync-dmtxt\" (UID: \"ce22649f-fe81-4c02-a26a-15e45e306b82\") " pod="openstack/barbican-db-sync-dmtxt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.357898 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg46m\" (UniqueName: \"kubernetes.io/projected/995795c9-befc-4ce9-8a38-8791ba628061-kube-api-access-jg46m\") pod \"neutron-db-sync-96nsw\" (UID: \"995795c9-befc-4ce9-8a38-8791ba628061\") " pod="openstack/neutron-db-sync-96nsw" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.361040 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce22649f-fe81-4c02-a26a-15e45e306b82-combined-ca-bundle\") pod \"barbican-db-sync-dmtxt\" (UID: \"ce22649f-fe81-4c02-a26a-15e45e306b82\") " pod="openstack/barbican-db-sync-dmtxt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.389011 4930 generic.go:334] "Generic (PLEG): container finished" podID="a43020f9-02f4-4d52-b00f-74fd6da4e46b" containerID="e44647d19ab751b3e2adbe990f4423e7c2122dd92fb056ee1ca8360d8a58997d" exitCode=0 Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.389064 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" event={"ID":"a43020f9-02f4-4d52-b00f-74fd6da4e46b","Type":"ContainerDied","Data":"e44647d19ab751b3e2adbe990f4423e7c2122dd92fb056ee1ca8360d8a58997d"} Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.438209 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.448571 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hwnm\" (UniqueName: \"kubernetes.io/projected/48b62183-7a23-4f98-aef5-4b6b6f89bd53-kube-api-access-8hwnm\") pod \"dnsmasq-dns-578598f949-m7h9x\" (UID: \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\") " pod="openstack/dnsmasq-dns-578598f949-m7h9x" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.448615 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-ovsdbserver-sb\") pod \"dnsmasq-dns-578598f949-m7h9x\" (UID: \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\") " pod="openstack/dnsmasq-dns-578598f949-m7h9x" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.448641 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6sgc\" (UniqueName: \"kubernetes.io/projected/f549aa26-b902-4497-838b-6b80e635897c-kube-api-access-z6sgc\") pod \"horizon-6df4dd4f95-cwhwt\" (UID: \"f549aa26-b902-4497-838b-6b80e635897c\") " pod="openstack/horizon-6df4dd4f95-cwhwt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.448674 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-dns-svc\") pod \"dnsmasq-dns-578598f949-m7h9x\" (UID: \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\") " pod="openstack/dnsmasq-dns-578598f949-m7h9x" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.448694 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea4a96dd-8928-4248-b294-1b0b6413abef-logs\") pod \"placement-db-sync-sdmls\" 
(UID: \"ea4a96dd-8928-4248-b294-1b0b6413abef\") " pod="openstack/placement-db-sync-sdmls" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.448821 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-config\") pod \"dnsmasq-dns-578598f949-m7h9x\" (UID: \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\") " pod="openstack/dnsmasq-dns-578598f949-m7h9x" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.448846 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea4a96dd-8928-4248-b294-1b0b6413abef-scripts\") pod \"placement-db-sync-sdmls\" (UID: \"ea4a96dd-8928-4248-b294-1b0b6413abef\") " pod="openstack/placement-db-sync-sdmls" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.448863 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea4a96dd-8928-4248-b294-1b0b6413abef-combined-ca-bundle\") pod \"placement-db-sync-sdmls\" (UID: \"ea4a96dd-8928-4248-b294-1b0b6413abef\") " pod="openstack/placement-db-sync-sdmls" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.448899 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f549aa26-b902-4497-838b-6b80e635897c-scripts\") pod \"horizon-6df4dd4f95-cwhwt\" (UID: \"f549aa26-b902-4497-838b-6b80e635897c\") " pod="openstack/horizon-6df4dd4f95-cwhwt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.448912 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f549aa26-b902-4497-838b-6b80e635897c-logs\") pod \"horizon-6df4dd4f95-cwhwt\" (UID: \"f549aa26-b902-4497-838b-6b80e635897c\") " pod="openstack/horizon-6df4dd4f95-cwhwt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.448927 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbhtq\" (UniqueName: \"kubernetes.io/projected/ea4a96dd-8928-4248-b294-1b0b6413abef-kube-api-access-mbhtq\") pod \"placement-db-sync-sdmls\" (UID: \"ea4a96dd-8928-4248-b294-1b0b6413abef\") " pod="openstack/placement-db-sync-sdmls" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.448944 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-dns-swift-storage-0\") pod \"dnsmasq-dns-578598f949-m7h9x\" (UID: \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\") " pod="openstack/dnsmasq-dns-578598f949-m7h9x" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.448957 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f549aa26-b902-4497-838b-6b80e635897c-config-data\") pod \"horizon-6df4dd4f95-cwhwt\" (UID: \"f549aa26-b902-4497-838b-6b80e635897c\") " pod="openstack/horizon-6df4dd4f95-cwhwt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.448973 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f549aa26-b902-4497-838b-6b80e635897c-horizon-secret-key\") pod \"horizon-6df4dd4f95-cwhwt\" (UID: \"f549aa26-b902-4497-838b-6b80e635897c\") " pod="openstack/horizon-6df4dd4f95-cwhwt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.448988 
4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea4a96dd-8928-4248-b294-1b0b6413abef-config-data\") pod \"placement-db-sync-sdmls\" (UID: \"ea4a96dd-8928-4248-b294-1b0b6413abef\") " pod="openstack/placement-db-sync-sdmls" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.449008 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-ovsdbserver-nb\") pod \"dnsmasq-dns-578598f949-m7h9x\" (UID: \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\") " pod="openstack/dnsmasq-dns-578598f949-m7h9x" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.450021 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-ovsdbserver-nb\") pod \"dnsmasq-dns-578598f949-m7h9x\" (UID: \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\") " pod="openstack/dnsmasq-dns-578598f949-m7h9x" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.450766 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-ovsdbserver-sb\") pod \"dnsmasq-dns-578598f949-m7h9x\" (UID: \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\") " pod="openstack/dnsmasq-dns-578598f949-m7h9x" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.451028 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f549aa26-b902-4497-838b-6b80e635897c-scripts\") pod \"horizon-6df4dd4f95-cwhwt\" (UID: \"f549aa26-b902-4497-838b-6b80e635897c\") " pod="openstack/horizon-6df4dd4f95-cwhwt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.451539 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-dns-swift-storage-0\") pod \"dnsmasq-dns-578598f949-m7h9x\" (UID: \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\") " pod="openstack/dnsmasq-dns-578598f949-m7h9x" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.451807 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f549aa26-b902-4497-838b-6b80e635897c-logs\") pod \"horizon-6df4dd4f95-cwhwt\" (UID: \"f549aa26-b902-4497-838b-6b80e635897c\") " pod="openstack/horizon-6df4dd4f95-cwhwt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.451828 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-dns-svc\") pod \"dnsmasq-dns-578598f949-m7h9x\" (UID: \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\") " pod="openstack/dnsmasq-dns-578598f949-m7h9x" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.452073 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-config\") pod \"dnsmasq-dns-578598f949-m7h9x\" (UID: \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\") " pod="openstack/dnsmasq-dns-578598f949-m7h9x" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.452350 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea4a96dd-8928-4248-b294-1b0b6413abef-logs\") pod \"placement-db-sync-sdmls\" 
(UID: \"ea4a96dd-8928-4248-b294-1b0b6413abef\") " pod="openstack/placement-db-sync-sdmls" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.452516 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f549aa26-b902-4497-838b-6b80e635897c-config-data\") pod \"horizon-6df4dd4f95-cwhwt\" (UID: \"f549aa26-b902-4497-838b-6b80e635897c\") " pod="openstack/horizon-6df4dd4f95-cwhwt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.455868 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-bg9r7" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.461439 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f549aa26-b902-4497-838b-6b80e635897c-horizon-secret-key\") pod \"horizon-6df4dd4f95-cwhwt\" (UID: \"f549aa26-b902-4497-838b-6b80e635897c\") " pod="openstack/horizon-6df4dd4f95-cwhwt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.470915 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea4a96dd-8928-4248-b294-1b0b6413abef-scripts\") pod \"placement-db-sync-sdmls\" (UID: \"ea4a96dd-8928-4248-b294-1b0b6413abef\") " pod="openstack/placement-db-sync-sdmls" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.472528 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea4a96dd-8928-4248-b294-1b0b6413abef-combined-ca-bundle\") pod \"placement-db-sync-sdmls\" (UID: \"ea4a96dd-8928-4248-b294-1b0b6413abef\") " pod="openstack/placement-db-sync-sdmls" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.472971 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.473814 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea4a96dd-8928-4248-b294-1b0b6413abef-config-data\") pod \"placement-db-sync-sdmls\" (UID: \"ea4a96dd-8928-4248-b294-1b0b6413abef\") " pod="openstack/placement-db-sync-sdmls" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.476945 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbhtq\" (UniqueName: \"kubernetes.io/projected/ea4a96dd-8928-4248-b294-1b0b6413abef-kube-api-access-mbhtq\") pod \"placement-db-sync-sdmls\" (UID: \"ea4a96dd-8928-4248-b294-1b0b6413abef\") " pod="openstack/placement-db-sync-sdmls" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.484604 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6sgc\" (UniqueName: \"kubernetes.io/projected/f549aa26-b902-4497-838b-6b80e635897c-kube-api-access-z6sgc\") pod \"horizon-6df4dd4f95-cwhwt\" (UID: \"f549aa26-b902-4497-838b-6b80e635897c\") " pod="openstack/horizon-6df4dd4f95-cwhwt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.488302 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hwnm\" (UniqueName: \"kubernetes.io/projected/48b62183-7a23-4f98-aef5-4b6b6f89bd53-kube-api-access-8hwnm\") pod \"dnsmasq-dns-578598f949-m7h9x\" (UID: \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\") " pod="openstack/dnsmasq-dns-578598f949-m7h9x" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.491198 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-dmtxt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.507045 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-96nsw" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.564844 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-dns-svc\") pod \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\" (UID: \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\") " Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.564895 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-config\") pod \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\" (UID: \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\") " Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.564957 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thc66\" (UniqueName: \"kubernetes.io/projected/a43020f9-02f4-4d52-b00f-74fd6da4e46b-kube-api-access-thc66\") pod \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\" (UID: \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\") " Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.565021 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-ovsdbserver-nb\") pod \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\" (UID: \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\") " Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.565093 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-dns-swift-storage-0\") pod \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\" (UID: \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\") " Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.565179 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-ovsdbserver-sb\") pod \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\" (UID: \"a43020f9-02f4-4d52-b00f-74fd6da4e46b\") " Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.574675 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6df4dd4f95-cwhwt" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.575613 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a43020f9-02f4-4d52-b00f-74fd6da4e46b-kube-api-access-thc66" (OuterVolumeSpecName: "kube-api-access-thc66") pod "a43020f9-02f4-4d52-b00f-74fd6da4e46b" (UID: "a43020f9-02f4-4d52-b00f-74fd6da4e46b"). InnerVolumeSpecName "kube-api-access-thc66". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.610869 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-578598f949-m7h9x" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.631270 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-sdmls" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.668304 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a43020f9-02f4-4d52-b00f-74fd6da4e46b" (UID: "a43020f9-02f4-4d52-b00f-74fd6da4e46b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.668624 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thc66\" (UniqueName: \"kubernetes.io/projected/a43020f9-02f4-4d52-b00f-74fd6da4e46b-kube-api-access-thc66\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.678764 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a43020f9-02f4-4d52-b00f-74fd6da4e46b" (UID: "a43020f9-02f4-4d52-b00f-74fd6da4e46b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.682198 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a43020f9-02f4-4d52-b00f-74fd6da4e46b" (UID: "a43020f9-02f4-4d52-b00f-74fd6da4e46b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.691850 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-config" (OuterVolumeSpecName: "config") pod "a43020f9-02f4-4d52-b00f-74fd6da4e46b" (UID: "a43020f9-02f4-4d52-b00f-74fd6da4e46b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.742276 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a43020f9-02f4-4d52-b00f-74fd6da4e46b" (UID: "a43020f9-02f4-4d52-b00f-74fd6da4e46b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.770059 4930 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.770094 4930 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.770106 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.770117 4930 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:22 crc kubenswrapper[4930]: I1012 05:58:22.770126 4930 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a43020f9-02f4-4d52-b00f-74fd6da4e46b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:23 crc kubenswrapper[4930]: I1012 05:58:23.029704 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 12 05:58:23 crc kubenswrapper[4930]: W1012 05:58:23.037053 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod934c43b8_93c5_442d_ba52_2e3bfc2d8f35.slice/crio-d4cd79f0aac633d971ece62da40822881b6244052665f3617ae36f8870cea846 WatchSource:0}: Error finding container d4cd79f0aac633d971ece62da40822881b6244052665f3617ae36f8870cea846: Status 404 returned error can't find the container with id d4cd79f0aac633d971ece62da40822881b6244052665f3617ae36f8870cea846 Oct 12 05:58:23 crc kubenswrapper[4930]: I1012 05:58:23.044924 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 12 05:58:23 crc kubenswrapper[4930]: W1012 05:58:23.058857 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddda9904e_b764_43a5_83d2_5c993023f740.slice/crio-a303ddb5384b147b5928088f3723f1c39f9fc1dfd698abe27c3ed1e825a82c69 WatchSource:0}: Error finding container a303ddb5384b147b5928088f3723f1c39f9fc1dfd698abe27c3ed1e825a82c69: Status 404 returned error can't find the container with id a303ddb5384b147b5928088f3723f1c39f9fc1dfd698abe27c3ed1e825a82c69 Oct 12 05:58:23 crc kubenswrapper[4930]: I1012 05:58:23.368930 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84fd96956f-5sqc8"] Oct 12 05:58:23 crc kubenswrapper[4930]: I1012 05:58:23.384869 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58bbf48b7f-dlk7f"] Oct 12 05:58:23 crc kubenswrapper[4930]: I1012 05:58:23.410440 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 12 05:58:23 crc kubenswrapper[4930]: I1012 05:58:23.416441 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" 
event={"ID":"a43020f9-02f4-4d52-b00f-74fd6da4e46b","Type":"ContainerDied","Data":"282c4e6ca435fa75d2920a7b597dbd7819bb076f08f5fcf008acb02b9c845af9"} Oct 12 05:58:23 crc kubenswrapper[4930]: I1012 05:58:23.416491 4930 scope.go:117] "RemoveContainer" containerID="e44647d19ab751b3e2adbe990f4423e7c2122dd92fb056ee1ca8360d8a58997d" Oct 12 05:58:23 crc kubenswrapper[4930]: I1012 05:58:23.416603 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55b99bf79c-xbwtf" Oct 12 05:58:23 crc kubenswrapper[4930]: I1012 05:58:23.427253 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84fd96956f-5sqc8" event={"ID":"b69d2329-3396-4e67-9880-940dacef7e56","Type":"ContainerStarted","Data":"ef3edbe0788a2bbd3ac45ef204a0fa7190a8bcd0575a57e60101bc87b93f2ebc"} Oct 12 05:58:23 crc kubenswrapper[4930]: I1012 05:58:23.429635 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"934c43b8-93c5-442d-ba52-2e3bfc2d8f35","Type":"ContainerStarted","Data":"d4cd79f0aac633d971ece62da40822881b6244052665f3617ae36f8870cea846"} Oct 12 05:58:23 crc kubenswrapper[4930]: I1012 05:58:23.431206 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"dda9904e-b764-43a5-83d2-5c993023f740","Type":"ContainerStarted","Data":"a303ddb5384b147b5928088f3723f1c39f9fc1dfd698abe27c3ed1e825a82c69"} Oct 12 05:58:23 crc kubenswrapper[4930]: I1012 05:58:23.470582 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55b99bf79c-xbwtf"] Oct 12 05:58:23 crc kubenswrapper[4930]: I1012 05:58:23.482032 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55b99bf79c-xbwtf"] Oct 12 05:58:23 crc kubenswrapper[4930]: I1012 05:58:23.494820 4930 scope.go:117] "RemoveContainer" containerID="d01e5938047a270c554579961f7a5bf6b9042ef286e598a049e951cd5dfc957c" Oct 12 05:58:23 crc kubenswrapper[4930]: I1012 05:58:23.692952 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qfmkt"] Oct 12 05:58:23 crc kubenswrapper[4930]: I1012 05:58:23.710703 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6df4dd4f95-cwhwt"] Oct 12 05:58:23 crc kubenswrapper[4930]: W1012 05:58:23.717769 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b25d9dd_3bce_4f8d_9219_1d6ce75878ed.slice/crio-98f743e3fd40ba0726cdaec6f7dbaf40edbce7e61986a664ab469afdd890d255 WatchSource:0}: Error finding container 98f743e3fd40ba0726cdaec6f7dbaf40edbce7e61986a664ab469afdd890d255: Status 404 returned error can't find the container with id 98f743e3fd40ba0726cdaec6f7dbaf40edbce7e61986a664ab469afdd890d255 Oct 12 05:58:23 crc kubenswrapper[4930]: I1012 05:58:23.724198 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 05:58:23 crc kubenswrapper[4930]: W1012 05:58:23.731898 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf549aa26_b902_4497_838b_6b80e635897c.slice/crio-f8af6f942697e7e8f483d761214ec737acd6905fd522d134004bde7c580d7459 WatchSource:0}: Error finding container f8af6f942697e7e8f483d761214ec737acd6905fd522d134004bde7c580d7459: Status 404 returned error can't find the container with id f8af6f942697e7e8f483d761214ec737acd6905fd522d134004bde7c580d7459 Oct 12 05:58:23 crc kubenswrapper[4930]: I1012 
Oct 12 05:58:23 crc kubenswrapper[4930]: W1012 05:58:23.763217 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8e573a1_8105_4b4a_9ba8_dac6aef25de2.slice/crio-f7ca2e2062aa8bf990b0d5c491edea8210645d04d426d3c6aab7824167ba43f5 WatchSource:0}: Error finding container f7ca2e2062aa8bf990b0d5c491edea8210645d04d426d3c6aab7824167ba43f5: Status 404 returned error can't find the container with id f7ca2e2062aa8bf990b0d5c491edea8210645d04d426d3c6aab7824167ba43f5
Oct 12 05:58:23 crc kubenswrapper[4930]: I1012 05:58:23.770629 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-bg9r7"]
Oct 12 05:58:23 crc kubenswrapper[4930]: I1012 05:58:23.806792 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-dmtxt"]
Oct 12 05:58:23 crc kubenswrapper[4930]: I1012 05:58:23.973860 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-578598f949-m7h9x"]
Oct 12 05:58:24 crc kubenswrapper[4930]: I1012 05:58:24.011858 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-sdmls"]
Oct 12 05:58:24 crc kubenswrapper[4930]: I1012 05:58:24.185820 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a43020f9-02f4-4d52-b00f-74fd6da4e46b" path="/var/lib/kubelet/pods/a43020f9-02f4-4d52-b00f-74fd6da4e46b/volumes"
Oct 12 05:58:24 crc kubenswrapper[4930]: I1012 05:58:24.464133 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed","Type":"ContainerStarted","Data":"98f743e3fd40ba0726cdaec6f7dbaf40edbce7e61986a664ab469afdd890d255"}
Oct 12 05:58:24 crc kubenswrapper[4930]: I1012 05:58:24.476133 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dmtxt" event={"ID":"ce22649f-fe81-4c02-a26a-15e45e306b82","Type":"ContainerStarted","Data":"d2027d3fa80b2349acd02c8da6dbcdcc634ce84d2c5fb887c0545866e1345a28"}
Oct 12 05:58:24 crc kubenswrapper[4930]: I1012 05:58:24.477027 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-96nsw" event={"ID":"995795c9-befc-4ce9-8a38-8791ba628061","Type":"ContainerStarted","Data":"7ac18e15dea8e2a93d519b9426cd780512681ad422d0dfa569d49cb212ef7d13"}
Oct 12 05:58:24 crc kubenswrapper[4930]: I1012 05:58:24.485249 4930 generic.go:334] "Generic (PLEG): container finished" podID="ce5f0598-e80c-49d3-bd51-3bbd167395db" containerID="9cf604c1d27f31ce1d596e2e97147d60ed3882ae389270e2626ca879bb2ccb8c" exitCode=0
Oct 12 05:58:24 crc kubenswrapper[4930]: I1012 05:58:24.485309 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bbf48b7f-dlk7f" event={"ID":"ce5f0598-e80c-49d3-bd51-3bbd167395db","Type":"ContainerDied","Data":"9cf604c1d27f31ce1d596e2e97147d60ed3882ae389270e2626ca879bb2ccb8c"}
Oct 12 05:58:24 crc kubenswrapper[4930]: I1012 05:58:24.485335 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bbf48b7f-dlk7f" event={"ID":"ce5f0598-e80c-49d3-bd51-3bbd167395db","Type":"ContainerStarted","Data":"753134b915841de7d174b99f9344952a47f4fb84a851ac000c9680df343f6849"}
Oct 12 05:58:24 crc kubenswrapper[4930]: I1012 05:58:24.512296 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qfmkt" event={"ID":"f8e573a1-8105-4b4a-9ba8-dac6aef25de2","Type":"ContainerStarted","Data":"452b6aabed2e3099d896db9eed8281d48b7281c884e3b2385020e0b8bf256332"}
Oct 12 05:58:24 crc kubenswrapper[4930]: I1012 05:58:24.512385 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qfmkt" event={"ID":"f8e573a1-8105-4b4a-9ba8-dac6aef25de2","Type":"ContainerStarted","Data":"f7ca2e2062aa8bf990b0d5c491edea8210645d04d426d3c6aab7824167ba43f5"}
Oct 12 05:58:24 crc kubenswrapper[4930]: I1012 05:58:24.515670 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578598f949-m7h9x" event={"ID":"48b62183-7a23-4f98-aef5-4b6b6f89bd53","Type":"ContainerStarted","Data":"59e9df2e208b82763c4f091c6436715552273d7c5303eb5475ace56cfeb0aedf"}
Oct 12 05:58:24 crc kubenswrapper[4930]: I1012 05:58:24.521538 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bg9r7" event={"ID":"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed","Type":"ContainerStarted","Data":"54ff6d0a1a07f334ab917b1b33b541bf3c3f61db974736985b8f6f060e7866c3"}
Oct 12 05:58:24 crc kubenswrapper[4930]: I1012 05:58:24.530806 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6df4dd4f95-cwhwt" event={"ID":"f549aa26-b902-4497-838b-6b80e635897c","Type":"ContainerStarted","Data":"f8af6f942697e7e8f483d761214ec737acd6905fd522d134004bde7c580d7459"}
Oct 12 05:58:24 crc kubenswrapper[4930]: I1012 05:58:24.535697 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-qfmkt" podStartSLOduration=3.53567988 podStartE2EDuration="3.53567988s" podCreationTimestamp="2025-10-12 05:58:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:58:24.525867705 +0000 UTC m=+1037.067969470" watchObservedRunningTime="2025-10-12 05:58:24.53567988 +0000 UTC m=+1037.077781645"
Oct 12 05:58:24 crc kubenswrapper[4930]: I1012 05:58:24.537063 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-sdmls" event={"ID":"ea4a96dd-8928-4248-b294-1b0b6413abef","Type":"ContainerStarted","Data":"87821483aa2c5cd6a81aa58ed500ecf9a06b15d28bc2905dbe17a2ff794f3a7c"}
Oct 12 05:58:24 crc kubenswrapper[4930]: I1012 05:58:24.552575 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e","Type":"ContainerStarted","Data":"2e692fae791c683fc477c82b235c0199958dbe386c5ca7b83a54dc4179b5cbdb"}
Oct 12 05:58:24 crc kubenswrapper[4930]: I1012 05:58:24.552615 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e","Type":"ContainerStarted","Data":"430654705aed322c284d1cc300957fda8b0edbdd3ee95b03b58120323604b0b2"}
Oct 12 05:58:24 crc kubenswrapper[4930]: I1012 05:58:24.552624 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e","Type":"ContainerStarted","Data":"81cbd14d2bed27eafba8d27d628f41f54d6c93b983d72d2dd96030b3f9d62184"}
Oct 12 05:58:24 crc kubenswrapper[4930]: I1012 05:58:24.598985 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=3.598966789 podStartE2EDuration="3.598966789s" podCreationTimestamp="2025-10-12 05:58:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:58:24.583041262 +0000 UTC m=+1037.125143027" watchObservedRunningTime="2025-10-12 05:58:24.598966789 +0000 UTC m=+1037.141068554"
UTC" observedRunningTime="2025-10-12 05:58:24.583041262 +0000 UTC m=+1037.125143027" watchObservedRunningTime="2025-10-12 05:58:24.598966789 +0000 UTC m=+1037.141068554" Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.051615 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.131203 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6df4dd4f95-cwhwt"] Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.163424 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-86b8d48789-7rd2m"] Oct 12 05:58:25 crc kubenswrapper[4930]: E1012 05:58:25.163872 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a43020f9-02f4-4d52-b00f-74fd6da4e46b" containerName="init" Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.163890 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="a43020f9-02f4-4d52-b00f-74fd6da4e46b" containerName="init" Oct 12 05:58:25 crc kubenswrapper[4930]: E1012 05:58:25.163919 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a43020f9-02f4-4d52-b00f-74fd6da4e46b" containerName="dnsmasq-dns" Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.163926 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="a43020f9-02f4-4d52-b00f-74fd6da4e46b" containerName="dnsmasq-dns" Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.164097 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="a43020f9-02f4-4d52-b00f-74fd6da4e46b" containerName="dnsmasq-dns" Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.165261 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86b8d48789-7rd2m" Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.168148 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86b8d48789-7rd2m"] Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.178915 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.240910 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9dd57cc3-6793-42f9-b938-620f968192c3-scripts\") pod \"horizon-86b8d48789-7rd2m\" (UID: \"9dd57cc3-6793-42f9-b938-620f968192c3\") " pod="openstack/horizon-86b8d48789-7rd2m" Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.240995 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dd57cc3-6793-42f9-b938-620f968192c3-logs\") pod \"horizon-86b8d48789-7rd2m\" (UID: \"9dd57cc3-6793-42f9-b938-620f968192c3\") " pod="openstack/horizon-86b8d48789-7rd2m" Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.241071 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9dd57cc3-6793-42f9-b938-620f968192c3-horizon-secret-key\") pod \"horizon-86b8d48789-7rd2m\" (UID: \"9dd57cc3-6793-42f9-b938-620f968192c3\") " pod="openstack/horizon-86b8d48789-7rd2m" Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.241092 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9dd57cc3-6793-42f9-b938-620f968192c3-config-data\") pod 
\"horizon-86b8d48789-7rd2m\" (UID: \"9dd57cc3-6793-42f9-b938-620f968192c3\") " pod="openstack/horizon-86b8d48789-7rd2m" Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.241111 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx59v\" (UniqueName: \"kubernetes.io/projected/9dd57cc3-6793-42f9-b938-620f968192c3-kube-api-access-rx59v\") pod \"horizon-86b8d48789-7rd2m\" (UID: \"9dd57cc3-6793-42f9-b938-620f968192c3\") " pod="openstack/horizon-86b8d48789-7rd2m" Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.342907 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dd57cc3-6793-42f9-b938-620f968192c3-logs\") pod \"horizon-86b8d48789-7rd2m\" (UID: \"9dd57cc3-6793-42f9-b938-620f968192c3\") " pod="openstack/horizon-86b8d48789-7rd2m" Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.343027 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9dd57cc3-6793-42f9-b938-620f968192c3-horizon-secret-key\") pod \"horizon-86b8d48789-7rd2m\" (UID: \"9dd57cc3-6793-42f9-b938-620f968192c3\") " pod="openstack/horizon-86b8d48789-7rd2m" Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.343058 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9dd57cc3-6793-42f9-b938-620f968192c3-config-data\") pod \"horizon-86b8d48789-7rd2m\" (UID: \"9dd57cc3-6793-42f9-b938-620f968192c3\") " pod="openstack/horizon-86b8d48789-7rd2m" Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.343078 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx59v\" (UniqueName: \"kubernetes.io/projected/9dd57cc3-6793-42f9-b938-620f968192c3-kube-api-access-rx59v\") pod \"horizon-86b8d48789-7rd2m\" (UID: \"9dd57cc3-6793-42f9-b938-620f968192c3\") " pod="openstack/horizon-86b8d48789-7rd2m" Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.343138 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9dd57cc3-6793-42f9-b938-620f968192c3-scripts\") pod \"horizon-86b8d48789-7rd2m\" (UID: \"9dd57cc3-6793-42f9-b938-620f968192c3\") " pod="openstack/horizon-86b8d48789-7rd2m" Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.344616 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dd57cc3-6793-42f9-b938-620f968192c3-logs\") pod \"horizon-86b8d48789-7rd2m\" (UID: \"9dd57cc3-6793-42f9-b938-620f968192c3\") " pod="openstack/horizon-86b8d48789-7rd2m" Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.345327 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9dd57cc3-6793-42f9-b938-620f968192c3-config-data\") pod \"horizon-86b8d48789-7rd2m\" (UID: \"9dd57cc3-6793-42f9-b938-620f968192c3\") " pod="openstack/horizon-86b8d48789-7rd2m" Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.345557 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9dd57cc3-6793-42f9-b938-620f968192c3-scripts\") pod \"horizon-86b8d48789-7rd2m\" (UID: \"9dd57cc3-6793-42f9-b938-620f968192c3\") " pod="openstack/horizon-86b8d48789-7rd2m" Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 
Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.361201 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx59v\" (UniqueName: \"kubernetes.io/projected/9dd57cc3-6793-42f9-b938-620f968192c3-kube-api-access-rx59v\") pod \"horizon-86b8d48789-7rd2m\" (UID: \"9dd57cc3-6793-42f9-b938-620f968192c3\") " pod="openstack/horizon-86b8d48789-7rd2m"
Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.514277 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86b8d48789-7rd2m"
Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.574598 4930 generic.go:334] "Generic (PLEG): container finished" podID="48b62183-7a23-4f98-aef5-4b6b6f89bd53" containerID="1fda28ffc05e5ae6a1e31aaa8d875bb93ce176fa2c5d3bafa5cdee2f7273ca3a" exitCode=0
Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.574658 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578598f949-m7h9x" event={"ID":"48b62183-7a23-4f98-aef5-4b6b6f89bd53","Type":"ContainerDied","Data":"1fda28ffc05e5ae6a1e31aaa8d875bb93ce176fa2c5d3bafa5cdee2f7273ca3a"}
Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.599105 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-96nsw" event={"ID":"995795c9-befc-4ce9-8a38-8791ba628061","Type":"ContainerStarted","Data":"30ebdfc31a962560651811b3199c242f5f32817e99142ec937334cfc7c99609f"}
Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.600770 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Oct 12 05:58:25 crc kubenswrapper[4930]: I1012 05:58:25.632058 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-96nsw" podStartSLOduration=4.632040407 podStartE2EDuration="4.632040407s" podCreationTimestamp="2025-10-12 05:58:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:58:25.618422018 +0000 UTC m=+1038.160523783" watchObservedRunningTime="2025-10-12 05:58:25.632040407 +0000 UTC m=+1038.174142172"
Oct 12 05:58:26 crc kubenswrapper[4930]: I1012 05:58:26.615934 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="799201c0-a8e7-40f8-bce3-8d43fc5f9a9e" containerName="watcher-api-log" containerID="cri-o://430654705aed322c284d1cc300957fda8b0edbdd3ee95b03b58120323604b0b2" gracePeriod=30
Oct 12 05:58:26 crc kubenswrapper[4930]: I1012 05:58:26.616515 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="799201c0-a8e7-40f8-bce3-8d43fc5f9a9e" containerName="watcher-api" containerID="cri-o://2e692fae791c683fc477c82b235c0199958dbe386c5ca7b83a54dc4179b5cbdb" gracePeriod=30
Oct 12 05:58:26 crc kubenswrapper[4930]: I1012 05:58:26.626478 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="799201c0-a8e7-40f8-bce3-8d43fc5f9a9e" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.152:9322/\": EOF"
Oct 12 05:58:26 crc kubenswrapper[4930]: I1012 05:58:26.630795 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="799201c0-a8e7-40f8-bce3-8d43fc5f9a9e" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.152:9322/\": EOF"
Oct 12 05:58:27 crc kubenswrapper[4930]: I1012 05:58:27.195708 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Oct 12 05:58:27 crc kubenswrapper[4930]: I1012 05:58:27.632974 4930 generic.go:334] "Generic (PLEG): container finished" podID="799201c0-a8e7-40f8-bce3-8d43fc5f9a9e" containerID="430654705aed322c284d1cc300957fda8b0edbdd3ee95b03b58120323604b0b2" exitCode=143
Oct 12 05:58:27 crc kubenswrapper[4930]: I1012 05:58:27.633017 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e","Type":"ContainerDied","Data":"430654705aed322c284d1cc300957fda8b0edbdd3ee95b03b58120323604b0b2"}
Oct 12 05:58:27 crc kubenswrapper[4930]: I1012 05:58:27.980185 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58bbf48b7f-dlk7f"
Oct 12 05:58:28 crc kubenswrapper[4930]: I1012 05:58:28.012570 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxwkg\" (UniqueName: \"kubernetes.io/projected/ce5f0598-e80c-49d3-bd51-3bbd167395db-kube-api-access-lxwkg\") pod \"ce5f0598-e80c-49d3-bd51-3bbd167395db\" (UID: \"ce5f0598-e80c-49d3-bd51-3bbd167395db\") "
Oct 12 05:58:28 crc kubenswrapper[4930]: I1012 05:58:28.012658 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-config\") pod \"ce5f0598-e80c-49d3-bd51-3bbd167395db\" (UID: \"ce5f0598-e80c-49d3-bd51-3bbd167395db\") "
Oct 12 05:58:28 crc kubenswrapper[4930]: I1012 05:58:28.012679 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-ovsdbserver-sb\") pod \"ce5f0598-e80c-49d3-bd51-3bbd167395db\" (UID: \"ce5f0598-e80c-49d3-bd51-3bbd167395db\") "
Oct 12 05:58:28 crc kubenswrapper[4930]: I1012 05:58:28.012796 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-dns-swift-storage-0\") pod \"ce5f0598-e80c-49d3-bd51-3bbd167395db\" (UID: \"ce5f0598-e80c-49d3-bd51-3bbd167395db\") "
Oct 12 05:58:28 crc kubenswrapper[4930]: I1012 05:58:28.012866 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-dns-svc\") pod \"ce5f0598-e80c-49d3-bd51-3bbd167395db\" (UID: \"ce5f0598-e80c-49d3-bd51-3bbd167395db\") "
Oct 12 05:58:28 crc kubenswrapper[4930]: I1012 05:58:28.012909 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-ovsdbserver-nb\") pod \"ce5f0598-e80c-49d3-bd51-3bbd167395db\" (UID: \"ce5f0598-e80c-49d3-bd51-3bbd167395db\") "
Oct 12 05:58:28 crc kubenswrapper[4930]: I1012 05:58:28.040442 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce5f0598-e80c-49d3-bd51-3bbd167395db-kube-api-access-lxwkg" (OuterVolumeSpecName: "kube-api-access-lxwkg") pod "ce5f0598-e80c-49d3-bd51-3bbd167395db" (UID: "ce5f0598-e80c-49d3-bd51-3bbd167395db"). InnerVolumeSpecName "kube-api-access-lxwkg". PluginName "kubernetes.io/projected", VolumeGidValue ""
"ce5f0598-e80c-49d3-bd51-3bbd167395db" (UID: "ce5f0598-e80c-49d3-bd51-3bbd167395db"). InnerVolumeSpecName "kube-api-access-lxwkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:58:28 crc kubenswrapper[4930]: I1012 05:58:28.058944 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ce5f0598-e80c-49d3-bd51-3bbd167395db" (UID: "ce5f0598-e80c-49d3-bd51-3bbd167395db"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:58:28 crc kubenswrapper[4930]: I1012 05:58:28.120976 4930 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:28 crc kubenswrapper[4930]: I1012 05:58:28.121027 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxwkg\" (UniqueName: \"kubernetes.io/projected/ce5f0598-e80c-49d3-bd51-3bbd167395db-kube-api-access-lxwkg\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:28 crc kubenswrapper[4930]: I1012 05:58:28.162192 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce5f0598-e80c-49d3-bd51-3bbd167395db" (UID: "ce5f0598-e80c-49d3-bd51-3bbd167395db"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:58:28 crc kubenswrapper[4930]: I1012 05:58:28.201969 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ce5f0598-e80c-49d3-bd51-3bbd167395db" (UID: "ce5f0598-e80c-49d3-bd51-3bbd167395db"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:58:28 crc kubenswrapper[4930]: I1012 05:58:28.202530 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-config" (OuterVolumeSpecName: "config") pod "ce5f0598-e80c-49d3-bd51-3bbd167395db" (UID: "ce5f0598-e80c-49d3-bd51-3bbd167395db"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:58:28 crc kubenswrapper[4930]: I1012 05:58:28.220524 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ce5f0598-e80c-49d3-bd51-3bbd167395db" (UID: "ce5f0598-e80c-49d3-bd51-3bbd167395db"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:58:28 crc kubenswrapper[4930]: I1012 05:58:28.224530 4930 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:28 crc kubenswrapper[4930]: I1012 05:58:28.224788 4930 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:28 crc kubenswrapper[4930]: I1012 05:58:28.224799 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:28 crc kubenswrapper[4930]: I1012 05:58:28.224812 4930 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce5f0598-e80c-49d3-bd51-3bbd167395db-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:28 crc kubenswrapper[4930]: I1012 05:58:28.525618 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86b8d48789-7rd2m"] Oct 12 05:58:28 crc kubenswrapper[4930]: I1012 05:58:28.672501 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58bbf48b7f-dlk7f" event={"ID":"ce5f0598-e80c-49d3-bd51-3bbd167395db","Type":"ContainerDied","Data":"753134b915841de7d174b99f9344952a47f4fb84a851ac000c9680df343f6849"} Oct 12 05:58:28 crc kubenswrapper[4930]: I1012 05:58:28.672551 4930 scope.go:117] "RemoveContainer" containerID="9cf604c1d27f31ce1d596e2e97147d60ed3882ae389270e2626ca879bb2ccb8c" Oct 12 05:58:28 crc kubenswrapper[4930]: I1012 05:58:28.672668 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58bbf48b7f-dlk7f" Oct 12 05:58:28 crc kubenswrapper[4930]: I1012 05:58:28.765489 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58bbf48b7f-dlk7f"] Oct 12 05:58:28 crc kubenswrapper[4930]: I1012 05:58:28.771566 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58bbf48b7f-dlk7f"] Oct 12 05:58:30 crc kubenswrapper[4930]: I1012 05:58:30.151997 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce5f0598-e80c-49d3-bd51-3bbd167395db" path="/var/lib/kubelet/pods/ce5f0598-e80c-49d3-bd51-3bbd167395db/volumes" Oct 12 05:58:30 crc kubenswrapper[4930]: I1012 05:58:30.368645 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="799201c0-a8e7-40f8-bce3-8d43fc5f9a9e" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.152:9322/\": read tcp 10.217.0.2:56294->10.217.0.152:9322: read: connection reset by peer" Oct 12 05:58:30 crc kubenswrapper[4930]: I1012 05:58:30.685745 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-84fd96956f-5sqc8"] Oct 12 05:58:30 crc kubenswrapper[4930]: I1012 05:58:30.800923 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6778cd8bb8-9zhz5"] Oct 12 05:58:30 crc kubenswrapper[4930]: E1012 05:58:30.801576 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce5f0598-e80c-49d3-bd51-3bbd167395db" containerName="init" Oct 12 05:58:30 crc kubenswrapper[4930]: I1012 05:58:30.801592 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce5f0598-e80c-49d3-bd51-3bbd167395db" containerName="init" Oct 12 05:58:30 crc kubenswrapper[4930]: I1012 05:58:30.801906 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce5f0598-e80c-49d3-bd51-3bbd167395db" containerName="init" Oct 12 05:58:30 crc kubenswrapper[4930]: I1012 05:58:30.829106 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:58:30 crc kubenswrapper[4930]: I1012 05:58:30.832746 4930 generic.go:334] "Generic (PLEG): container finished" podID="799201c0-a8e7-40f8-bce3-8d43fc5f9a9e" containerID="2e692fae791c683fc477c82b235c0199958dbe386c5ca7b83a54dc4179b5cbdb" exitCode=0 Oct 12 05:58:30 crc kubenswrapper[4930]: I1012 05:58:30.832841 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6778cd8bb8-9zhz5"] Oct 12 05:58:30 crc kubenswrapper[4930]: I1012 05:58:30.832873 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e","Type":"ContainerDied","Data":"2e692fae791c683fc477c82b235c0199958dbe386c5ca7b83a54dc4179b5cbdb"} Oct 12 05:58:30 crc kubenswrapper[4930]: I1012 05:58:30.842223 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 12 05:58:30 crc kubenswrapper[4930]: I1012 05:58:30.878972 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86b8d48789-7rd2m"] Oct 12 05:58:30 crc kubenswrapper[4930]: I1012 05:58:30.985871 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6d76466876-jf9t8"] Oct 12 05:58:30 crc kubenswrapper[4930]: I1012 05:58:30.987394 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.005570 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d76466876-jf9t8"] Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.024572 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30049dfb-04f2-455c-a949-08bd6ff892d0-combined-ca-bundle\") pod \"horizon-6778cd8bb8-9zhz5\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.025278 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/30049dfb-04f2-455c-a949-08bd6ff892d0-horizon-tls-certs\") pod \"horizon-6778cd8bb8-9zhz5\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.025458 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/30049dfb-04f2-455c-a949-08bd6ff892d0-horizon-secret-key\") pod \"horizon-6778cd8bb8-9zhz5\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.025580 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30049dfb-04f2-455c-a949-08bd6ff892d0-config-data\") pod \"horizon-6778cd8bb8-9zhz5\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.025749 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30049dfb-04f2-455c-a949-08bd6ff892d0-scripts\") pod \"horizon-6778cd8bb8-9zhz5\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.025928 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fqrn\" (UniqueName: \"kubernetes.io/projected/30049dfb-04f2-455c-a949-08bd6ff892d0-kube-api-access-2fqrn\") pod \"horizon-6778cd8bb8-9zhz5\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.026059 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30049dfb-04f2-455c-a949-08bd6ff892d0-logs\") pod \"horizon-6778cd8bb8-9zhz5\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.127678 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a97771f5-bcbe-42d8-bdd8-41b43f8899a0-horizon-tls-certs\") pod \"horizon-6d76466876-jf9t8\" (UID: \"a97771f5-bcbe-42d8-bdd8-41b43f8899a0\") " pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.128049 4930 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a97771f5-bcbe-42d8-bdd8-41b43f8899a0-logs\") pod \"horizon-6d76466876-jf9t8\" (UID: \"a97771f5-bcbe-42d8-bdd8-41b43f8899a0\") " pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.128072 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30049dfb-04f2-455c-a949-08bd6ff892d0-scripts\") pod \"horizon-6778cd8bb8-9zhz5\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.128092 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a97771f5-bcbe-42d8-bdd8-41b43f8899a0-combined-ca-bundle\") pod \"horizon-6d76466876-jf9t8\" (UID: \"a97771f5-bcbe-42d8-bdd8-41b43f8899a0\") " pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.128140 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fqrn\" (UniqueName: \"kubernetes.io/projected/30049dfb-04f2-455c-a949-08bd6ff892d0-kube-api-access-2fqrn\") pod \"horizon-6778cd8bb8-9zhz5\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.128177 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30049dfb-04f2-455c-a949-08bd6ff892d0-logs\") pod \"horizon-6778cd8bb8-9zhz5\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.128277 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30049dfb-04f2-455c-a949-08bd6ff892d0-combined-ca-bundle\") pod \"horizon-6778cd8bb8-9zhz5\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.128352 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a97771f5-bcbe-42d8-bdd8-41b43f8899a0-horizon-secret-key\") pod \"horizon-6d76466876-jf9t8\" (UID: \"a97771f5-bcbe-42d8-bdd8-41b43f8899a0\") " pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.128374 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjpr9\" (UniqueName: \"kubernetes.io/projected/a97771f5-bcbe-42d8-bdd8-41b43f8899a0-kube-api-access-bjpr9\") pod \"horizon-6d76466876-jf9t8\" (UID: \"a97771f5-bcbe-42d8-bdd8-41b43f8899a0\") " pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.128450 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/30049dfb-04f2-455c-a949-08bd6ff892d0-horizon-tls-certs\") pod \"horizon-6778cd8bb8-9zhz5\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.128514 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/30049dfb-04f2-455c-a949-08bd6ff892d0-horizon-secret-key\") pod \"horizon-6778cd8bb8-9zhz5\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.128538 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a97771f5-bcbe-42d8-bdd8-41b43f8899a0-scripts\") pod \"horizon-6d76466876-jf9t8\" (UID: \"a97771f5-bcbe-42d8-bdd8-41b43f8899a0\") " pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.128564 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a97771f5-bcbe-42d8-bdd8-41b43f8899a0-config-data\") pod \"horizon-6d76466876-jf9t8\" (UID: \"a97771f5-bcbe-42d8-bdd8-41b43f8899a0\") " pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.128590 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30049dfb-04f2-455c-a949-08bd6ff892d0-config-data\") pod \"horizon-6778cd8bb8-9zhz5\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.129668 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30049dfb-04f2-455c-a949-08bd6ff892d0-config-data\") pod \"horizon-6778cd8bb8-9zhz5\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.130446 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30049dfb-04f2-455c-a949-08bd6ff892d0-scripts\") pod \"horizon-6778cd8bb8-9zhz5\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.131688 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30049dfb-04f2-455c-a949-08bd6ff892d0-logs\") pod \"horizon-6778cd8bb8-9zhz5\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.140058 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30049dfb-04f2-455c-a949-08bd6ff892d0-combined-ca-bundle\") pod \"horizon-6778cd8bb8-9zhz5\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.140180 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/30049dfb-04f2-455c-a949-08bd6ff892d0-horizon-tls-certs\") pod \"horizon-6778cd8bb8-9zhz5\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.149114 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/30049dfb-04f2-455c-a949-08bd6ff892d0-horizon-secret-key\") pod \"horizon-6778cd8bb8-9zhz5\" (UID: 
\"30049dfb-04f2-455c-a949-08bd6ff892d0\") " pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.149382 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fqrn\" (UniqueName: \"kubernetes.io/projected/30049dfb-04f2-455c-a949-08bd6ff892d0-kube-api-access-2fqrn\") pod \"horizon-6778cd8bb8-9zhz5\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.189449 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.230445 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a97771f5-bcbe-42d8-bdd8-41b43f8899a0-horizon-secret-key\") pod \"horizon-6d76466876-jf9t8\" (UID: \"a97771f5-bcbe-42d8-bdd8-41b43f8899a0\") " pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.230481 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjpr9\" (UniqueName: \"kubernetes.io/projected/a97771f5-bcbe-42d8-bdd8-41b43f8899a0-kube-api-access-bjpr9\") pod \"horizon-6d76466876-jf9t8\" (UID: \"a97771f5-bcbe-42d8-bdd8-41b43f8899a0\") " pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.230562 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a97771f5-bcbe-42d8-bdd8-41b43f8899a0-scripts\") pod \"horizon-6d76466876-jf9t8\" (UID: \"a97771f5-bcbe-42d8-bdd8-41b43f8899a0\") " pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.230583 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a97771f5-bcbe-42d8-bdd8-41b43f8899a0-config-data\") pod \"horizon-6d76466876-jf9t8\" (UID: \"a97771f5-bcbe-42d8-bdd8-41b43f8899a0\") " pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.230620 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a97771f5-bcbe-42d8-bdd8-41b43f8899a0-horizon-tls-certs\") pod \"horizon-6d76466876-jf9t8\" (UID: \"a97771f5-bcbe-42d8-bdd8-41b43f8899a0\") " pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.230646 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a97771f5-bcbe-42d8-bdd8-41b43f8899a0-logs\") pod \"horizon-6d76466876-jf9t8\" (UID: \"a97771f5-bcbe-42d8-bdd8-41b43f8899a0\") " pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.230666 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a97771f5-bcbe-42d8-bdd8-41b43f8899a0-combined-ca-bundle\") pod \"horizon-6d76466876-jf9t8\" (UID: \"a97771f5-bcbe-42d8-bdd8-41b43f8899a0\") " pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.233603 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a97771f5-bcbe-42d8-bdd8-41b43f8899a0-scripts\") pod 
\"horizon-6d76466876-jf9t8\" (UID: \"a97771f5-bcbe-42d8-bdd8-41b43f8899a0\") " pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.234224 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a97771f5-bcbe-42d8-bdd8-41b43f8899a0-config-data\") pod \"horizon-6d76466876-jf9t8\" (UID: \"a97771f5-bcbe-42d8-bdd8-41b43f8899a0\") " pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.234637 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a97771f5-bcbe-42d8-bdd8-41b43f8899a0-logs\") pod \"horizon-6d76466876-jf9t8\" (UID: \"a97771f5-bcbe-42d8-bdd8-41b43f8899a0\") " pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.235918 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a97771f5-bcbe-42d8-bdd8-41b43f8899a0-combined-ca-bundle\") pod \"horizon-6d76466876-jf9t8\" (UID: \"a97771f5-bcbe-42d8-bdd8-41b43f8899a0\") " pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.236913 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a97771f5-bcbe-42d8-bdd8-41b43f8899a0-horizon-secret-key\") pod \"horizon-6d76466876-jf9t8\" (UID: \"a97771f5-bcbe-42d8-bdd8-41b43f8899a0\") " pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.239642 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a97771f5-bcbe-42d8-bdd8-41b43f8899a0-horizon-tls-certs\") pod \"horizon-6d76466876-jf9t8\" (UID: \"a97771f5-bcbe-42d8-bdd8-41b43f8899a0\") " pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.250432 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjpr9\" (UniqueName: \"kubernetes.io/projected/a97771f5-bcbe-42d8-bdd8-41b43f8899a0-kube-api-access-bjpr9\") pod \"horizon-6d76466876-jf9t8\" (UID: \"a97771f5-bcbe-42d8-bdd8-41b43f8899a0\") " pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.309238 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.851765 4930 generic.go:334] "Generic (PLEG): container finished" podID="f8e573a1-8105-4b4a-9ba8-dac6aef25de2" containerID="452b6aabed2e3099d896db9eed8281d48b7281c884e3b2385020e0b8bf256332" exitCode=0 Oct 12 05:58:31 crc kubenswrapper[4930]: I1012 05:58:31.851803 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qfmkt" event={"ID":"f8e573a1-8105-4b4a-9ba8-dac6aef25de2","Type":"ContainerDied","Data":"452b6aabed2e3099d896db9eed8281d48b7281c884e3b2385020e0b8bf256332"} Oct 12 05:58:32 crc kubenswrapper[4930]: I1012 05:58:32.196816 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="799201c0-a8e7-40f8-bce3-8d43fc5f9a9e" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.152:9322/\": dial tcp 10.217.0.152:9322: connect: connection refused" Oct 12 05:58:32 crc kubenswrapper[4930]: I1012 05:58:32.862046 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86b8d48789-7rd2m" event={"ID":"9dd57cc3-6793-42f9-b938-620f968192c3","Type":"ContainerStarted","Data":"0c62560e93c7a21b11e0e81a3eb1c9724b0fa387c2eb476a96c580480c31c257"} Oct 12 05:58:37 crc kubenswrapper[4930]: I1012 05:58:37.196049 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="799201c0-a8e7-40f8-bce3-8d43fc5f9a9e" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.152:9322/\": dial tcp 10.217.0.152:9322: connect: connection refused" Oct 12 05:58:38 crc kubenswrapper[4930]: I1012 05:58:38.920370 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qfmkt" Oct 12 05:58:38 crc kubenswrapper[4930]: I1012 05:58:38.931525 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qfmkt" event={"ID":"f8e573a1-8105-4b4a-9ba8-dac6aef25de2","Type":"ContainerDied","Data":"f7ca2e2062aa8bf990b0d5c491edea8210645d04d426d3c6aab7824167ba43f5"} Oct 12 05:58:38 crc kubenswrapper[4930]: I1012 05:58:38.931562 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7ca2e2062aa8bf990b0d5c491edea8210645d04d426d3c6aab7824167ba43f5" Oct 12 05:58:38 crc kubenswrapper[4930]: I1012 05:58:38.931569 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qfmkt" Oct 12 05:58:38 crc kubenswrapper[4930]: I1012 05:58:38.953083 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-credential-keys\") pod \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\" (UID: \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\") " Oct 12 05:58:38 crc kubenswrapper[4930]: I1012 05:58:38.953272 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-config-data\") pod \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\" (UID: \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\") " Oct 12 05:58:38 crc kubenswrapper[4930]: I1012 05:58:38.953296 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-combined-ca-bundle\") pod \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\" (UID: \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\") " Oct 12 05:58:38 crc kubenswrapper[4930]: I1012 05:58:38.953324 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmchd\" (UniqueName: \"kubernetes.io/projected/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-kube-api-access-lmchd\") pod \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\" (UID: \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\") " Oct 12 05:58:38 crc kubenswrapper[4930]: I1012 05:58:38.953370 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-scripts\") pod \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\" (UID: \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\") " Oct 12 05:58:38 crc kubenswrapper[4930]: I1012 05:58:38.953406 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-fernet-keys\") pod \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\" (UID: \"f8e573a1-8105-4b4a-9ba8-dac6aef25de2\") " Oct 12 05:58:38 crc kubenswrapper[4930]: I1012 05:58:38.962125 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f8e573a1-8105-4b4a-9ba8-dac6aef25de2" (UID: "f8e573a1-8105-4b4a-9ba8-dac6aef25de2"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:58:38 crc kubenswrapper[4930]: I1012 05:58:38.962142 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-scripts" (OuterVolumeSpecName: "scripts") pod "f8e573a1-8105-4b4a-9ba8-dac6aef25de2" (UID: "f8e573a1-8105-4b4a-9ba8-dac6aef25de2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:58:38 crc kubenswrapper[4930]: I1012 05:58:38.962251 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f8e573a1-8105-4b4a-9ba8-dac6aef25de2" (UID: "f8e573a1-8105-4b4a-9ba8-dac6aef25de2"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:58:38 crc kubenswrapper[4930]: I1012 05:58:38.967429 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-kube-api-access-lmchd" (OuterVolumeSpecName: "kube-api-access-lmchd") pod "f8e573a1-8105-4b4a-9ba8-dac6aef25de2" (UID: "f8e573a1-8105-4b4a-9ba8-dac6aef25de2"). InnerVolumeSpecName "kube-api-access-lmchd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:58:38 crc kubenswrapper[4930]: I1012 05:58:38.989926 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-config-data" (OuterVolumeSpecName: "config-data") pod "f8e573a1-8105-4b4a-9ba8-dac6aef25de2" (UID: "f8e573a1-8105-4b4a-9ba8-dac6aef25de2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:58:39 crc kubenswrapper[4930]: I1012 05:58:39.001460 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8e573a1-8105-4b4a-9ba8-dac6aef25de2" (UID: "f8e573a1-8105-4b4a-9ba8-dac6aef25de2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:58:39 crc kubenswrapper[4930]: I1012 05:58:39.059373 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:39 crc kubenswrapper[4930]: I1012 05:58:39.059429 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:39 crc kubenswrapper[4930]: I1012 05:58:39.059447 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmchd\" (UniqueName: \"kubernetes.io/projected/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-kube-api-access-lmchd\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:39 crc kubenswrapper[4930]: I1012 05:58:39.059460 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:39 crc kubenswrapper[4930]: I1012 05:58:39.059472 4930 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:39 crc kubenswrapper[4930]: I1012 05:58:39.059485 4930 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f8e573a1-8105-4b4a-9ba8-dac6aef25de2-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.024789 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-qfmkt"] Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.034210 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-qfmkt"] Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.101280 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-h2tnf"] Oct 12 05:58:40 crc kubenswrapper[4930]: E1012 05:58:40.102186 4930 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f8e573a1-8105-4b4a-9ba8-dac6aef25de2" containerName="keystone-bootstrap" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.102205 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e573a1-8105-4b4a-9ba8-dac6aef25de2" containerName="keystone-bootstrap" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.102447 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e573a1-8105-4b4a-9ba8-dac6aef25de2" containerName="keystone-bootstrap" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.103399 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-h2tnf" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.109177 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-hkmzw" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.109449 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.110048 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.110131 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.122557 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-h2tnf"] Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.146412 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8e573a1-8105-4b4a-9ba8-dac6aef25de2" path="/var/lib/kubelet/pods/f8e573a1-8105-4b4a-9ba8-dac6aef25de2/volumes" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.184670 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-config-data\") pod \"keystone-bootstrap-h2tnf\" (UID: \"b228afd2-c418-49a3-97d4-35a298e324a6\") " pod="openstack/keystone-bootstrap-h2tnf" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.184737 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-fernet-keys\") pod \"keystone-bootstrap-h2tnf\" (UID: \"b228afd2-c418-49a3-97d4-35a298e324a6\") " pod="openstack/keystone-bootstrap-h2tnf" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.184881 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nkgr\" (UniqueName: \"kubernetes.io/projected/b228afd2-c418-49a3-97d4-35a298e324a6-kube-api-access-4nkgr\") pod \"keystone-bootstrap-h2tnf\" (UID: \"b228afd2-c418-49a3-97d4-35a298e324a6\") " pod="openstack/keystone-bootstrap-h2tnf" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.184930 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-scripts\") pod \"keystone-bootstrap-h2tnf\" (UID: \"b228afd2-c418-49a3-97d4-35a298e324a6\") " pod="openstack/keystone-bootstrap-h2tnf" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.184948 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-combined-ca-bundle\") pod \"keystone-bootstrap-h2tnf\" (UID: \"b228afd2-c418-49a3-97d4-35a298e324a6\") " pod="openstack/keystone-bootstrap-h2tnf" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.185001 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-credential-keys\") pod \"keystone-bootstrap-h2tnf\" (UID: \"b228afd2-c418-49a3-97d4-35a298e324a6\") " pod="openstack/keystone-bootstrap-h2tnf" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.286344 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-config-data\") pod \"keystone-bootstrap-h2tnf\" (UID: \"b228afd2-c418-49a3-97d4-35a298e324a6\") " pod="openstack/keystone-bootstrap-h2tnf" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.286403 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-fernet-keys\") pod \"keystone-bootstrap-h2tnf\" (UID: \"b228afd2-c418-49a3-97d4-35a298e324a6\") " pod="openstack/keystone-bootstrap-h2tnf" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.286466 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nkgr\" (UniqueName: \"kubernetes.io/projected/b228afd2-c418-49a3-97d4-35a298e324a6-kube-api-access-4nkgr\") pod \"keystone-bootstrap-h2tnf\" (UID: \"b228afd2-c418-49a3-97d4-35a298e324a6\") " pod="openstack/keystone-bootstrap-h2tnf" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.286504 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-scripts\") pod \"keystone-bootstrap-h2tnf\" (UID: \"b228afd2-c418-49a3-97d4-35a298e324a6\") " pod="openstack/keystone-bootstrap-h2tnf" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.286522 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-combined-ca-bundle\") pod \"keystone-bootstrap-h2tnf\" (UID: \"b228afd2-c418-49a3-97d4-35a298e324a6\") " pod="openstack/keystone-bootstrap-h2tnf" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.286539 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-credential-keys\") pod \"keystone-bootstrap-h2tnf\" (UID: \"b228afd2-c418-49a3-97d4-35a298e324a6\") " pod="openstack/keystone-bootstrap-h2tnf" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.292423 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-credential-keys\") pod \"keystone-bootstrap-h2tnf\" (UID: \"b228afd2-c418-49a3-97d4-35a298e324a6\") " pod="openstack/keystone-bootstrap-h2tnf" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.292504 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-scripts\") pod \"keystone-bootstrap-h2tnf\" (UID: 
\"b228afd2-c418-49a3-97d4-35a298e324a6\") " pod="openstack/keystone-bootstrap-h2tnf" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.292588 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-config-data\") pod \"keystone-bootstrap-h2tnf\" (UID: \"b228afd2-c418-49a3-97d4-35a298e324a6\") " pod="openstack/keystone-bootstrap-h2tnf" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.294497 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-combined-ca-bundle\") pod \"keystone-bootstrap-h2tnf\" (UID: \"b228afd2-c418-49a3-97d4-35a298e324a6\") " pod="openstack/keystone-bootstrap-h2tnf" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.299896 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-fernet-keys\") pod \"keystone-bootstrap-h2tnf\" (UID: \"b228afd2-c418-49a3-97d4-35a298e324a6\") " pod="openstack/keystone-bootstrap-h2tnf" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.301159 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nkgr\" (UniqueName: \"kubernetes.io/projected/b228afd2-c418-49a3-97d4-35a298e324a6-kube-api-access-4nkgr\") pod \"keystone-bootstrap-h2tnf\" (UID: \"b228afd2-c418-49a3-97d4-35a298e324a6\") " pod="openstack/keystone-bootstrap-h2tnf" Oct 12 05:58:40 crc kubenswrapper[4930]: I1012 05:58:40.418913 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-h2tnf" Oct 12 05:58:42 crc kubenswrapper[4930]: E1012 05:58:42.378781 4930 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-placement-api:current" Oct 12 05:58:42 crc kubenswrapper[4930]: E1012 05:58:42.379393 4930 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-placement-api:current" Oct 12 05:58:42 crc kubenswrapper[4930]: E1012 05:58:42.379558 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-placement-api:current,Command:[/bin/bash],Args:[-c 
Oct 12 05:58:42 crc kubenswrapper[4930]: E1012 05:58:42.378781 4930 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-placement-api:current" Oct 12 05:58:42 crc kubenswrapper[4930]: E1012 05:58:42.379393 4930 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-placement-api:current" Oct 12 05:58:42 crc kubenswrapper[4930]: E1012 05:58:42.379558 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-placement-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbhtq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-sdmls_openstack(ea4a96dd-8928-4248-b294-1b0b6413abef): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 05:58:42 crc kubenswrapper[4930]: E1012 05:58:42.381115 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-sdmls" podUID="ea4a96dd-8928-4248-b294-1b0b6413abef" Oct 12 05:58:42 crc kubenswrapper[4930]: E1012 05:58:42.866105 4930 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current" Oct 12 05:58:42 crc kubenswrapper[4930]: E1012 05:58:42.866151 4930 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current" Oct 12 05:58:42 crc kubenswrapper[4930]: E1012 05:58:42.866300 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs &&
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n69h4h665h65bh8h547h57h56bhffhfbhfbh9h64bh74h5f9h67dh98h699h65bh5d6h587h694h74h5b9h58bh56ch86h567h58chcbhc8h67bq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fsmjv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(7b25d9dd-3bce-4f8d-9219-1d6ce75878ed): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 05:58:42 crc kubenswrapper[4930]: E1012 05:58:42.979983 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-placement-api:current\\\"\"" pod="openstack/placement-db-sync-sdmls" podUID="ea4a96dd-8928-4248-b294-1b0b6413abef" Oct 12 05:58:47 crc kubenswrapper[4930]: I1012 05:58:47.197065 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="799201c0-a8e7-40f8-bce3-8d43fc5f9a9e" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.152:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 12 05:58:49 crc kubenswrapper[4930]: I1012 05:58:49.070531 4930 generic.go:334] "Generic (PLEG): container finished" podID="995795c9-befc-4ce9-8a38-8791ba628061" containerID="30ebdfc31a962560651811b3199c242f5f32817e99142ec937334cfc7c99609f" exitCode=0 Oct 12 05:58:49 crc kubenswrapper[4930]: I1012 05:58:49.070630 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-96nsw" 
event={"ID":"995795c9-befc-4ce9-8a38-8791ba628061","Type":"ContainerDied","Data":"30ebdfc31a962560651811b3199c242f5f32817e99142ec937334cfc7c99609f"} Oct 12 05:58:52 crc kubenswrapper[4930]: I1012 05:58:52.198385 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="799201c0-a8e7-40f8-bce3-8d43fc5f9a9e" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.152:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 12 05:58:52 crc kubenswrapper[4930]: E1012 05:58:52.651608 4930 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-horizon:current" Oct 12 05:58:52 crc kubenswrapper[4930]: E1012 05:58:52.651675 4930 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-horizon:current" Oct 12 05:58:52 crc kubenswrapper[4930]: E1012 05:58:52.651865 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.rdoproject.org/podified-master-centos10/openstack-horizon:current,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5fbhd5h65h575h649h66hdch664h5fh647h699h59dh694h5fbh9bh85h664h687h65fh87h54fh67chdch7ch5cdh587h5ddh56bh5c8h665h5fch576q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rx59v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-86b8d48789-7rd2m_openstack(9dd57cc3-6793-42f9-b938-620f968192c3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 05:58:52 crc kubenswrapper[4930]: E1012 05:58:52.703014 4930 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-glance-api:current" Oct 12 
05:58:52 crc kubenswrapper[4930]: E1012 05:58:52.703078 4930 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-glance-api:current" Oct 12 05:58:52 crc kubenswrapper[4930]: E1012 05:58:52.703230 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-glance-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lr9gg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-jj2m5_openstack(448db83a-c0af-4680-890f-24b8d8da1088): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 05:58:52 crc kubenswrapper[4930]: E1012 05:58:52.704479 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-jj2m5" podUID="448db83a-c0af-4680-890f-24b8d8da1088" Oct 12 05:58:53 crc kubenswrapper[4930]: E1012 05:58:53.116275 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-glance-api:current\\\"\"" pod="openstack/glance-db-sync-jj2m5" podUID="448db83a-c0af-4680-890f-24b8d8da1088" Oct 12 05:58:53 crc kubenswrapper[4930]: E1012 05:58:53.200519 4930 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.rdoproject.org/podified-master-centos10/openstack-barbican-api:current" Oct 12 05:58:53 crc kubenswrapper[4930]: E1012 05:58:53.200572 4930 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-barbican-api:current" Oct 12 05:58:53 crc kubenswrapper[4930]: E1012 05:58:53.200670 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-barbican-api:current,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g882n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-dmtxt_openstack(ce22649f-fe81-4c02-a26a-15e45e306b82): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 05:58:53 crc kubenswrapper[4930]: E1012 05:58:53.201957 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-dmtxt" podUID="ce22649f-fe81-4c02-a26a-15e45e306b82" Oct 12 05:58:53 crc kubenswrapper[4930]: I1012 05:58:53.300685 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 12 05:58:53 crc kubenswrapper[4930]: I1012 05:58:53.367320 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-logs\") pod \"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e\" (UID: \"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e\") " Oct 12 05:58:53 crc kubenswrapper[4930]: I1012 05:58:53.367576 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmbhd\" (UniqueName: \"kubernetes.io/projected/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-kube-api-access-mmbhd\") pod \"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e\" (UID: \"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e\") " Oct 12 05:58:53 crc kubenswrapper[4930]: I1012 05:58:53.367811 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-custom-prometheus-ca\") pod \"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e\" (UID: \"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e\") " Oct 12 05:58:53 crc kubenswrapper[4930]: I1012 05:58:53.367911 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-combined-ca-bundle\") pod \"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e\" (UID: \"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e\") " Oct 12 05:58:53 crc kubenswrapper[4930]: I1012 05:58:53.367915 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-logs" (OuterVolumeSpecName: "logs") pod "799201c0-a8e7-40f8-bce3-8d43fc5f9a9e" (UID: "799201c0-a8e7-40f8-bce3-8d43fc5f9a9e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:58:53 crc kubenswrapper[4930]: I1012 05:58:53.368011 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-config-data\") pod \"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e\" (UID: \"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e\") " Oct 12 05:58:53 crc kubenswrapper[4930]: I1012 05:58:53.368628 4930 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-logs\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:53 crc kubenswrapper[4930]: I1012 05:58:53.376651 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-kube-api-access-mmbhd" (OuterVolumeSpecName: "kube-api-access-mmbhd") pod "799201c0-a8e7-40f8-bce3-8d43fc5f9a9e" (UID: "799201c0-a8e7-40f8-bce3-8d43fc5f9a9e"). InnerVolumeSpecName "kube-api-access-mmbhd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:58:53 crc kubenswrapper[4930]: I1012 05:58:53.403493 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "799201c0-a8e7-40f8-bce3-8d43fc5f9a9e" (UID: "799201c0-a8e7-40f8-bce3-8d43fc5f9a9e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:58:53 crc kubenswrapper[4930]: I1012 05:58:53.408814 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "799201c0-a8e7-40f8-bce3-8d43fc5f9a9e" (UID: "799201c0-a8e7-40f8-bce3-8d43fc5f9a9e"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:58:53 crc kubenswrapper[4930]: I1012 05:58:53.433911 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-config-data" (OuterVolumeSpecName: "config-data") pod "799201c0-a8e7-40f8-bce3-8d43fc5f9a9e" (UID: "799201c0-a8e7-40f8-bce3-8d43fc5f9a9e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:58:53 crc kubenswrapper[4930]: I1012 05:58:53.470686 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmbhd\" (UniqueName: \"kubernetes.io/projected/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-kube-api-access-mmbhd\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:53 crc kubenswrapper[4930]: I1012 05:58:53.470763 4930 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:53 crc kubenswrapper[4930]: I1012 05:58:53.470784 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:53 crc kubenswrapper[4930]: I1012 05:58:53.470806 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.130871 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.130877 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"799201c0-a8e7-40f8-bce3-8d43fc5f9a9e","Type":"ContainerDied","Data":"81cbd14d2bed27eafba8d27d628f41f54d6c93b983d72d2dd96030b3f9d62184"} Oct 12 05:58:54 crc kubenswrapper[4930]: E1012 05:58:54.133083 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-barbican-api:current\\\"\"" pod="openstack/barbican-db-sync-dmtxt" podUID="ce22649f-fe81-4c02-a26a-15e45e306b82" Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.196307 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.214055 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.220557 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Oct 12 05:58:54 crc kubenswrapper[4930]: E1012 05:58:54.221164 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="799201c0-a8e7-40f8-bce3-8d43fc5f9a9e" containerName="watcher-api-log" Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.221259 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="799201c0-a8e7-40f8-bce3-8d43fc5f9a9e" containerName="watcher-api-log" Oct 12 05:58:54 crc kubenswrapper[4930]: E1012 05:58:54.223106 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="799201c0-a8e7-40f8-bce3-8d43fc5f9a9e" containerName="watcher-api" Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.223324 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="799201c0-a8e7-40f8-bce3-8d43fc5f9a9e" containerName="watcher-api" Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.224154 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="799201c0-a8e7-40f8-bce3-8d43fc5f9a9e" containerName="watcher-api-log" Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.224374 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="799201c0-a8e7-40f8-bce3-8d43fc5f9a9e" containerName="watcher-api" Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.226519 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.226898 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.233027 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.284026 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f64aa33-3396-4700-bb0f-57b16e39e368-logs\") pod \"watcher-api-0\" (UID: \"8f64aa33-3396-4700-bb0f-57b16e39e368\") " pod="openstack/watcher-api-0" Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.284074 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgkvv\" (UniqueName: \"kubernetes.io/projected/8f64aa33-3396-4700-bb0f-57b16e39e368-kube-api-access-lgkvv\") pod \"watcher-api-0\" (UID: \"8f64aa33-3396-4700-bb0f-57b16e39e368\") " pod="openstack/watcher-api-0" Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.284110 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f64aa33-3396-4700-bb0f-57b16e39e368-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"8f64aa33-3396-4700-bb0f-57b16e39e368\") " pod="openstack/watcher-api-0" Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.284141 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8f64aa33-3396-4700-bb0f-57b16e39e368-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"8f64aa33-3396-4700-bb0f-57b16e39e368\") " pod="openstack/watcher-api-0" Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.284189 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f64aa33-3396-4700-bb0f-57b16e39e368-config-data\") pod \"watcher-api-0\" (UID: \"8f64aa33-3396-4700-bb0f-57b16e39e368\") " pod="openstack/watcher-api-0" Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.385798 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f64aa33-3396-4700-bb0f-57b16e39e368-config-data\") pod \"watcher-api-0\" (UID: \"8f64aa33-3396-4700-bb0f-57b16e39e368\") " pod="openstack/watcher-api-0" Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.385964 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f64aa33-3396-4700-bb0f-57b16e39e368-logs\") pod \"watcher-api-0\" (UID: \"8f64aa33-3396-4700-bb0f-57b16e39e368\") " pod="openstack/watcher-api-0" Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.386266 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgkvv\" (UniqueName: \"kubernetes.io/projected/8f64aa33-3396-4700-bb0f-57b16e39e368-kube-api-access-lgkvv\") pod \"watcher-api-0\" (UID: \"8f64aa33-3396-4700-bb0f-57b16e39e368\") " pod="openstack/watcher-api-0" Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.386636 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f64aa33-3396-4700-bb0f-57b16e39e368-logs\") 
pod \"watcher-api-0\" (UID: \"8f64aa33-3396-4700-bb0f-57b16e39e368\") " pod="openstack/watcher-api-0" Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.386949 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f64aa33-3396-4700-bb0f-57b16e39e368-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"8f64aa33-3396-4700-bb0f-57b16e39e368\") " pod="openstack/watcher-api-0" Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.386984 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8f64aa33-3396-4700-bb0f-57b16e39e368-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"8f64aa33-3396-4700-bb0f-57b16e39e368\") " pod="openstack/watcher-api-0" Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.390982 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f64aa33-3396-4700-bb0f-57b16e39e368-config-data\") pod \"watcher-api-0\" (UID: \"8f64aa33-3396-4700-bb0f-57b16e39e368\") " pod="openstack/watcher-api-0" Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.405116 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8f64aa33-3396-4700-bb0f-57b16e39e368-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"8f64aa33-3396-4700-bb0f-57b16e39e368\") " pod="openstack/watcher-api-0" Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.405774 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f64aa33-3396-4700-bb0f-57b16e39e368-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"8f64aa33-3396-4700-bb0f-57b16e39e368\") " pod="openstack/watcher-api-0" Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.407631 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgkvv\" (UniqueName: \"kubernetes.io/projected/8f64aa33-3396-4700-bb0f-57b16e39e368-kube-api-access-lgkvv\") pod \"watcher-api-0\" (UID: \"8f64aa33-3396-4700-bb0f-57b16e39e368\") " pod="openstack/watcher-api-0" Oct 12 05:58:54 crc kubenswrapper[4930]: I1012 05:58:54.543118 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 12 05:58:56 crc kubenswrapper[4930]: I1012 05:58:56.155150 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="799201c0-a8e7-40f8-bce3-8d43fc5f9a9e" path="/var/lib/kubelet/pods/799201c0-a8e7-40f8-bce3-8d43fc5f9a9e/volumes" Oct 12 05:58:57 crc kubenswrapper[4930]: I1012 05:58:57.199698 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="799201c0-a8e7-40f8-bce3-8d43fc5f9a9e" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.152:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 12 05:59:00 crc kubenswrapper[4930]: I1012 05:59:00.958715 4930 scope.go:117] "RemoveContainer" containerID="2e692fae791c683fc477c82b235c0199958dbe386c5ca7b83a54dc4179b5cbdb" Oct 12 05:59:01 crc kubenswrapper[4930]: I1012 05:59:01.109684 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-96nsw" Oct 12 05:59:01 crc kubenswrapper[4930]: I1012 05:59:01.211274 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg46m\" (UniqueName: \"kubernetes.io/projected/995795c9-befc-4ce9-8a38-8791ba628061-kube-api-access-jg46m\") pod \"995795c9-befc-4ce9-8a38-8791ba628061\" (UID: \"995795c9-befc-4ce9-8a38-8791ba628061\") " Oct 12 05:59:01 crc kubenswrapper[4930]: I1012 05:59:01.211462 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/995795c9-befc-4ce9-8a38-8791ba628061-config\") pod \"995795c9-befc-4ce9-8a38-8791ba628061\" (UID: \"995795c9-befc-4ce9-8a38-8791ba628061\") " Oct 12 05:59:01 crc kubenswrapper[4930]: I1012 05:59:01.211607 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/995795c9-befc-4ce9-8a38-8791ba628061-combined-ca-bundle\") pod \"995795c9-befc-4ce9-8a38-8791ba628061\" (UID: \"995795c9-befc-4ce9-8a38-8791ba628061\") " Oct 12 05:59:01 crc kubenswrapper[4930]: I1012 05:59:01.217943 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/995795c9-befc-4ce9-8a38-8791ba628061-kube-api-access-jg46m" (OuterVolumeSpecName: "kube-api-access-jg46m") pod "995795c9-befc-4ce9-8a38-8791ba628061" (UID: "995795c9-befc-4ce9-8a38-8791ba628061"). InnerVolumeSpecName "kube-api-access-jg46m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:59:01 crc kubenswrapper[4930]: I1012 05:59:01.220119 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-96nsw" event={"ID":"995795c9-befc-4ce9-8a38-8791ba628061","Type":"ContainerDied","Data":"7ac18e15dea8e2a93d519b9426cd780512681ad422d0dfa569d49cb212ef7d13"} Oct 12 05:59:01 crc kubenswrapper[4930]: I1012 05:59:01.220155 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ac18e15dea8e2a93d519b9426cd780512681ad422d0dfa569d49cb212ef7d13" Oct 12 05:59:01 crc kubenswrapper[4930]: I1012 05:59:01.220184 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-96nsw" Oct 12 05:59:01 crc kubenswrapper[4930]: I1012 05:59:01.245904 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/995795c9-befc-4ce9-8a38-8791ba628061-config" (OuterVolumeSpecName: "config") pod "995795c9-befc-4ce9-8a38-8791ba628061" (UID: "995795c9-befc-4ce9-8a38-8791ba628061"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:01 crc kubenswrapper[4930]: I1012 05:59:01.254358 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/995795c9-befc-4ce9-8a38-8791ba628061-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "995795c9-befc-4ce9-8a38-8791ba628061" (UID: "995795c9-befc-4ce9-8a38-8791ba628061"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:01 crc kubenswrapper[4930]: I1012 05:59:01.314110 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg46m\" (UniqueName: \"kubernetes.io/projected/995795c9-befc-4ce9-8a38-8791ba628061-kube-api-access-jg46m\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:01 crc kubenswrapper[4930]: I1012 05:59:01.314471 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/995795c9-befc-4ce9-8a38-8791ba628061-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:01 crc kubenswrapper[4930]: I1012 05:59:01.314636 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/995795c9-befc-4ce9-8a38-8791ba628061-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.422292 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-578598f949-m7h9x"] Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.465945 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7648c6b969-m7fgp"] Oct 12 05:59:02 crc kubenswrapper[4930]: E1012 05:59:02.466437 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="995795c9-befc-4ce9-8a38-8791ba628061" containerName="neutron-db-sync" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.466463 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="995795c9-befc-4ce9-8a38-8791ba628061" containerName="neutron-db-sync" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.466697 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="995795c9-befc-4ce9-8a38-8791ba628061" containerName="neutron-db-sync" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.467923 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.487129 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7648c6b969-m7fgp"] Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.515544 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-578d784664-rp79z"] Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.517855 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-578d784664-rp79z" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.520715 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xtxgk" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.522522 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.522602 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.522798 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.532579 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-578d784664-rp79z"] Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.649271 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-config\") pod \"dnsmasq-dns-7648c6b969-m7fgp\" (UID: \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\") " pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.649317 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9000baf-8bd7-4d76-a2c8-b98ad280728d-combined-ca-bundle\") pod \"neutron-578d784664-rp79z\" (UID: \"c9000baf-8bd7-4d76-a2c8-b98ad280728d\") " pod="openstack/neutron-578d784664-rp79z" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.649388 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqp9m\" (UniqueName: \"kubernetes.io/projected/c9000baf-8bd7-4d76-a2c8-b98ad280728d-kube-api-access-lqp9m\") pod \"neutron-578d784664-rp79z\" (UID: \"c9000baf-8bd7-4d76-a2c8-b98ad280728d\") " pod="openstack/neutron-578d784664-rp79z" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.649428 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-dns-swift-storage-0\") pod \"dnsmasq-dns-7648c6b969-m7fgp\" (UID: \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\") " pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.649481 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-ovsdbserver-nb\") pod \"dnsmasq-dns-7648c6b969-m7fgp\" (UID: \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\") " pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.649502 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c9000baf-8bd7-4d76-a2c8-b98ad280728d-httpd-config\") pod \"neutron-578d784664-rp79z\" (UID: \"c9000baf-8bd7-4d76-a2c8-b98ad280728d\") " pod="openstack/neutron-578d784664-rp79z" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.649537 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-ovsdbserver-sb\") pod \"dnsmasq-dns-7648c6b969-m7fgp\" (UID: \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\") " pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.649599 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c9000baf-8bd7-4d76-a2c8-b98ad280728d-config\") pod \"neutron-578d784664-rp79z\" (UID: \"c9000baf-8bd7-4d76-a2c8-b98ad280728d\") " pod="openstack/neutron-578d784664-rp79z" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.649627 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-dns-svc\") pod \"dnsmasq-dns-7648c6b969-m7fgp\" (UID: \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\") " pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.649665 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9000baf-8bd7-4d76-a2c8-b98ad280728d-ovndb-tls-certs\") pod \"neutron-578d784664-rp79z\" (UID: \"c9000baf-8bd7-4d76-a2c8-b98ad280728d\") " pod="openstack/neutron-578d784664-rp79z" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.649707 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6wkn\" (UniqueName: \"kubernetes.io/projected/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-kube-api-access-s6wkn\") pod \"dnsmasq-dns-7648c6b969-m7fgp\" (UID: \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\") " pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.750781 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-ovsdbserver-nb\") pod \"dnsmasq-dns-7648c6b969-m7fgp\" (UID: \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\") " pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.751112 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c9000baf-8bd7-4d76-a2c8-b98ad280728d-httpd-config\") pod \"neutron-578d784664-rp79z\" (UID: \"c9000baf-8bd7-4d76-a2c8-b98ad280728d\") " pod="openstack/neutron-578d784664-rp79z" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.751142 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-ovsdbserver-sb\") pod \"dnsmasq-dns-7648c6b969-m7fgp\" (UID: \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\") " pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.751162 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c9000baf-8bd7-4d76-a2c8-b98ad280728d-config\") pod \"neutron-578d784664-rp79z\" (UID: \"c9000baf-8bd7-4d76-a2c8-b98ad280728d\") " pod="openstack/neutron-578d784664-rp79z" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.751181 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-dns-svc\") pod \"dnsmasq-dns-7648c6b969-m7fgp\" (UID: \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\") " pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.751211 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9000baf-8bd7-4d76-a2c8-b98ad280728d-ovndb-tls-certs\") pod \"neutron-578d784664-rp79z\" (UID: \"c9000baf-8bd7-4d76-a2c8-b98ad280728d\") " pod="openstack/neutron-578d784664-rp79z" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.751243 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6wkn\" (UniqueName: \"kubernetes.io/projected/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-kube-api-access-s6wkn\") pod \"dnsmasq-dns-7648c6b969-m7fgp\" (UID: \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\") " pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.751288 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-config\") pod \"dnsmasq-dns-7648c6b969-m7fgp\" (UID: \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\") " pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.751314 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9000baf-8bd7-4d76-a2c8-b98ad280728d-combined-ca-bundle\") pod \"neutron-578d784664-rp79z\" (UID: \"c9000baf-8bd7-4d76-a2c8-b98ad280728d\") " pod="openstack/neutron-578d784664-rp79z" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.751373 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqp9m\" (UniqueName: \"kubernetes.io/projected/c9000baf-8bd7-4d76-a2c8-b98ad280728d-kube-api-access-lqp9m\") pod \"neutron-578d784664-rp79z\" (UID: \"c9000baf-8bd7-4d76-a2c8-b98ad280728d\") " pod="openstack/neutron-578d784664-rp79z" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.751419 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-dns-swift-storage-0\") pod \"dnsmasq-dns-7648c6b969-m7fgp\" (UID: \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\") " pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.752506 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-dns-swift-storage-0\") pod \"dnsmasq-dns-7648c6b969-m7fgp\" (UID: \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\") " pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.753084 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-ovsdbserver-nb\") pod \"dnsmasq-dns-7648c6b969-m7fgp\" (UID: \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\") " pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.758228 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-config\") pod 
\"dnsmasq-dns-7648c6b969-m7fgp\" (UID: \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\") " pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.759281 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-dns-svc\") pod \"dnsmasq-dns-7648c6b969-m7fgp\" (UID: \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\") " pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.764883 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9000baf-8bd7-4d76-a2c8-b98ad280728d-combined-ca-bundle\") pod \"neutron-578d784664-rp79z\" (UID: \"c9000baf-8bd7-4d76-a2c8-b98ad280728d\") " pod="openstack/neutron-578d784664-rp79z" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.765082 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c9000baf-8bd7-4d76-a2c8-b98ad280728d-config\") pod \"neutron-578d784664-rp79z\" (UID: \"c9000baf-8bd7-4d76-a2c8-b98ad280728d\") " pod="openstack/neutron-578d784664-rp79z" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.765305 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9000baf-8bd7-4d76-a2c8-b98ad280728d-ovndb-tls-certs\") pod \"neutron-578d784664-rp79z\" (UID: \"c9000baf-8bd7-4d76-a2c8-b98ad280728d\") " pod="openstack/neutron-578d784664-rp79z" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.777888 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-ovsdbserver-sb\") pod \"dnsmasq-dns-7648c6b969-m7fgp\" (UID: \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\") " pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.778648 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c9000baf-8bd7-4d76-a2c8-b98ad280728d-httpd-config\") pod \"neutron-578d784664-rp79z\" (UID: \"c9000baf-8bd7-4d76-a2c8-b98ad280728d\") " pod="openstack/neutron-578d784664-rp79z" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.781817 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6wkn\" (UniqueName: \"kubernetes.io/projected/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-kube-api-access-s6wkn\") pod \"dnsmasq-dns-7648c6b969-m7fgp\" (UID: \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\") " pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.782858 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqp9m\" (UniqueName: \"kubernetes.io/projected/c9000baf-8bd7-4d76-a2c8-b98ad280728d-kube-api-access-lqp9m\") pod \"neutron-578d784664-rp79z\" (UID: \"c9000baf-8bd7-4d76-a2c8-b98ad280728d\") " pod="openstack/neutron-578d784664-rp79z" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.783690 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" Oct 12 05:59:02 crc kubenswrapper[4930]: E1012 05:59:02.791849 4930 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cinder-api:current" Oct 12 05:59:02 crc kubenswrapper[4930]: E1012 05:59:02.791915 4930 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cinder-api:current" Oct 12 05:59:02 crc kubenswrapper[4930]: E1012 05:59:02.793676 4930 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cinder-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-66mtr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-bg9r7_openstack(99ff807c-d810-4ff9-9ed3-3b2d37d3fbed): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 12 05:59:02 crc kubenswrapper[4930]: E1012 05:59:02.795122 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying 
config: context canceled\"" pod="openstack/cinder-db-sync-bg9r7" podUID="99ff807c-d810-4ff9-9ed3-3b2d37d3fbed" Oct 12 05:59:02 crc kubenswrapper[4930]: I1012 05:59:02.839349 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-578d784664-rp79z" Oct 12 05:59:03 crc kubenswrapper[4930]: I1012 05:59:03.045590 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6778cd8bb8-9zhz5"] Oct 12 05:59:03 crc kubenswrapper[4930]: E1012 05:59:03.248705 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cinder-api:current\\\"\"" pod="openstack/cinder-db-sync-bg9r7" podUID="99ff807c-d810-4ff9-9ed3-3b2d37d3fbed" Oct 12 05:59:04 crc kubenswrapper[4930]: I1012 05:59:04.911461 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6844c9655c-rvdcz"] Oct 12 05:59:04 crc kubenswrapper[4930]: I1012 05:59:04.913404 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6844c9655c-rvdcz" Oct 12 05:59:04 crc kubenswrapper[4930]: I1012 05:59:04.915844 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 12 05:59:04 crc kubenswrapper[4930]: I1012 05:59:04.917604 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 12 05:59:04 crc kubenswrapper[4930]: I1012 05:59:04.926225 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6844c9655c-rvdcz"] Oct 12 05:59:05 crc kubenswrapper[4930]: I1012 05:59:05.102844 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q78p2\" (UniqueName: \"kubernetes.io/projected/532bae95-f9fc-4633-b53c-2f398cbb8bd2-kube-api-access-q78p2\") pod \"neutron-6844c9655c-rvdcz\" (UID: \"532bae95-f9fc-4633-b53c-2f398cbb8bd2\") " pod="openstack/neutron-6844c9655c-rvdcz" Oct 12 05:59:05 crc kubenswrapper[4930]: I1012 05:59:05.103186 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/532bae95-f9fc-4633-b53c-2f398cbb8bd2-internal-tls-certs\") pod \"neutron-6844c9655c-rvdcz\" (UID: \"532bae95-f9fc-4633-b53c-2f398cbb8bd2\") " pod="openstack/neutron-6844c9655c-rvdcz" Oct 12 05:59:05 crc kubenswrapper[4930]: I1012 05:59:05.103237 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/532bae95-f9fc-4633-b53c-2f398cbb8bd2-ovndb-tls-certs\") pod \"neutron-6844c9655c-rvdcz\" (UID: \"532bae95-f9fc-4633-b53c-2f398cbb8bd2\") " pod="openstack/neutron-6844c9655c-rvdcz" Oct 12 05:59:05 crc kubenswrapper[4930]: I1012 05:59:05.103269 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/532bae95-f9fc-4633-b53c-2f398cbb8bd2-config\") pod \"neutron-6844c9655c-rvdcz\" (UID: \"532bae95-f9fc-4633-b53c-2f398cbb8bd2\") " pod="openstack/neutron-6844c9655c-rvdcz" Oct 12 05:59:05 crc kubenswrapper[4930]: I1012 05:59:05.103320 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/532bae95-f9fc-4633-b53c-2f398cbb8bd2-combined-ca-bundle\") pod \"neutron-6844c9655c-rvdcz\" (UID: \"532bae95-f9fc-4633-b53c-2f398cbb8bd2\") " pod="openstack/neutron-6844c9655c-rvdcz" Oct 12 05:59:05 crc kubenswrapper[4930]: I1012 05:59:05.103392 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/532bae95-f9fc-4633-b53c-2f398cbb8bd2-public-tls-certs\") pod \"neutron-6844c9655c-rvdcz\" (UID: \"532bae95-f9fc-4633-b53c-2f398cbb8bd2\") " pod="openstack/neutron-6844c9655c-rvdcz" Oct 12 05:59:05 crc kubenswrapper[4930]: I1012 05:59:05.103454 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/532bae95-f9fc-4633-b53c-2f398cbb8bd2-httpd-config\") pod \"neutron-6844c9655c-rvdcz\" (UID: \"532bae95-f9fc-4633-b53c-2f398cbb8bd2\") " pod="openstack/neutron-6844c9655c-rvdcz" Oct 12 05:59:05 crc kubenswrapper[4930]: I1012 05:59:05.204825 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/532bae95-f9fc-4633-b53c-2f398cbb8bd2-public-tls-certs\") pod \"neutron-6844c9655c-rvdcz\" (UID: \"532bae95-f9fc-4633-b53c-2f398cbb8bd2\") " pod="openstack/neutron-6844c9655c-rvdcz" Oct 12 05:59:05 crc kubenswrapper[4930]: I1012 05:59:05.204898 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/532bae95-f9fc-4633-b53c-2f398cbb8bd2-httpd-config\") pod \"neutron-6844c9655c-rvdcz\" (UID: \"532bae95-f9fc-4633-b53c-2f398cbb8bd2\") " pod="openstack/neutron-6844c9655c-rvdcz" Oct 12 05:59:05 crc kubenswrapper[4930]: I1012 05:59:05.204949 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q78p2\" (UniqueName: \"kubernetes.io/projected/532bae95-f9fc-4633-b53c-2f398cbb8bd2-kube-api-access-q78p2\") pod \"neutron-6844c9655c-rvdcz\" (UID: \"532bae95-f9fc-4633-b53c-2f398cbb8bd2\") " pod="openstack/neutron-6844c9655c-rvdcz" Oct 12 05:59:05 crc kubenswrapper[4930]: I1012 05:59:05.204986 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/532bae95-f9fc-4633-b53c-2f398cbb8bd2-internal-tls-certs\") pod \"neutron-6844c9655c-rvdcz\" (UID: \"532bae95-f9fc-4633-b53c-2f398cbb8bd2\") " pod="openstack/neutron-6844c9655c-rvdcz" Oct 12 05:59:05 crc kubenswrapper[4930]: I1012 05:59:05.205019 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/532bae95-f9fc-4633-b53c-2f398cbb8bd2-ovndb-tls-certs\") pod \"neutron-6844c9655c-rvdcz\" (UID: \"532bae95-f9fc-4633-b53c-2f398cbb8bd2\") " pod="openstack/neutron-6844c9655c-rvdcz" Oct 12 05:59:05 crc kubenswrapper[4930]: I1012 05:59:05.205040 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/532bae95-f9fc-4633-b53c-2f398cbb8bd2-config\") pod \"neutron-6844c9655c-rvdcz\" (UID: \"532bae95-f9fc-4633-b53c-2f398cbb8bd2\") " pod="openstack/neutron-6844c9655c-rvdcz" Oct 12 05:59:05 crc kubenswrapper[4930]: I1012 05:59:05.205057 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/532bae95-f9fc-4633-b53c-2f398cbb8bd2-combined-ca-bundle\") pod 
\"neutron-6844c9655c-rvdcz\" (UID: \"532bae95-f9fc-4633-b53c-2f398cbb8bd2\") " pod="openstack/neutron-6844c9655c-rvdcz" Oct 12 05:59:05 crc kubenswrapper[4930]: I1012 05:59:05.211124 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/532bae95-f9fc-4633-b53c-2f398cbb8bd2-internal-tls-certs\") pod \"neutron-6844c9655c-rvdcz\" (UID: \"532bae95-f9fc-4633-b53c-2f398cbb8bd2\") " pod="openstack/neutron-6844c9655c-rvdcz" Oct 12 05:59:05 crc kubenswrapper[4930]: I1012 05:59:05.212091 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/532bae95-f9fc-4633-b53c-2f398cbb8bd2-public-tls-certs\") pod \"neutron-6844c9655c-rvdcz\" (UID: \"532bae95-f9fc-4633-b53c-2f398cbb8bd2\") " pod="openstack/neutron-6844c9655c-rvdcz" Oct 12 05:59:05 crc kubenswrapper[4930]: I1012 05:59:05.212823 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/532bae95-f9fc-4633-b53c-2f398cbb8bd2-httpd-config\") pod \"neutron-6844c9655c-rvdcz\" (UID: \"532bae95-f9fc-4633-b53c-2f398cbb8bd2\") " pod="openstack/neutron-6844c9655c-rvdcz" Oct 12 05:59:05 crc kubenswrapper[4930]: I1012 05:59:05.213255 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/532bae95-f9fc-4633-b53c-2f398cbb8bd2-combined-ca-bundle\") pod \"neutron-6844c9655c-rvdcz\" (UID: \"532bae95-f9fc-4633-b53c-2f398cbb8bd2\") " pod="openstack/neutron-6844c9655c-rvdcz" Oct 12 05:59:05 crc kubenswrapper[4930]: I1012 05:59:05.213379 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/532bae95-f9fc-4633-b53c-2f398cbb8bd2-config\") pod \"neutron-6844c9655c-rvdcz\" (UID: \"532bae95-f9fc-4633-b53c-2f398cbb8bd2\") " pod="openstack/neutron-6844c9655c-rvdcz" Oct 12 05:59:05 crc kubenswrapper[4930]: I1012 05:59:05.214489 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/532bae95-f9fc-4633-b53c-2f398cbb8bd2-ovndb-tls-certs\") pod \"neutron-6844c9655c-rvdcz\" (UID: \"532bae95-f9fc-4633-b53c-2f398cbb8bd2\") " pod="openstack/neutron-6844c9655c-rvdcz" Oct 12 05:59:05 crc kubenswrapper[4930]: I1012 05:59:05.234482 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q78p2\" (UniqueName: \"kubernetes.io/projected/532bae95-f9fc-4633-b53c-2f398cbb8bd2-kube-api-access-q78p2\") pod \"neutron-6844c9655c-rvdcz\" (UID: \"532bae95-f9fc-4633-b53c-2f398cbb8bd2\") " pod="openstack/neutron-6844c9655c-rvdcz" Oct 12 05:59:05 crc kubenswrapper[4930]: I1012 05:59:05.534713 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6844c9655c-rvdcz" Oct 12 05:59:07 crc kubenswrapper[4930]: W1012 05:59:07.081052 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30049dfb_04f2_455c_a949_08bd6ff892d0.slice/crio-aee8af9e74229c0d66d27ff2d19312dda66138fb638443ac3e6fee244267604a WatchSource:0}: Error finding container aee8af9e74229c0d66d27ff2d19312dda66138fb638443ac3e6fee244267604a: Status 404 returned error can't find the container with id aee8af9e74229c0d66d27ff2d19312dda66138fb638443ac3e6fee244267604a Oct 12 05:59:07 crc kubenswrapper[4930]: I1012 05:59:07.152020 4930 scope.go:117] "RemoveContainer" containerID="430654705aed322c284d1cc300957fda8b0edbdd3ee95b03b58120323604b0b2" Oct 12 05:59:07 crc kubenswrapper[4930]: I1012 05:59:07.300782 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6778cd8bb8-9zhz5" event={"ID":"30049dfb-04f2-455c-a949-08bd6ff892d0","Type":"ContainerStarted","Data":"aee8af9e74229c0d66d27ff2d19312dda66138fb638443ac3e6fee244267604a"} Oct 12 05:59:07 crc kubenswrapper[4930]: I1012 05:59:07.307885 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-578598f949-m7h9x" podUID="48b62183-7a23-4f98-aef5-4b6b6f89bd53" containerName="dnsmasq-dns" containerID="cri-o://19232370521bd46be2071f0bc663623c9c9933624af52d1c1c4a6258a19863f8" gracePeriod=10 Oct 12 05:59:07 crc kubenswrapper[4930]: I1012 05:59:07.308440 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-578598f949-m7h9x" Oct 12 05:59:07 crc kubenswrapper[4930]: I1012 05:59:07.329833 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-578598f949-m7h9x" podStartSLOduration=46.329818328 podStartE2EDuration="46.329818328s" podCreationTimestamp="2025-10-12 05:58:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:59:07.329593203 +0000 UTC m=+1079.871694988" watchObservedRunningTime="2025-10-12 05:59:07.329818328 +0000 UTC m=+1079.871920093" Oct 12 05:59:07 crc kubenswrapper[4930]: I1012 05:59:07.504689 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d76466876-jf9t8"] Oct 12 05:59:07 crc kubenswrapper[4930]: W1012 05:59:07.572914 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda97771f5_bcbe_42d8_bdd8_41b43f8899a0.slice/crio-62c33a302d0697c68a30f906bd01591c49b1813d7ea7fa49376f99bece3a2ae2 WatchSource:0}: Error finding container 62c33a302d0697c68a30f906bd01591c49b1813d7ea7fa49376f99bece3a2ae2: Status 404 returned error can't find the container with id 62c33a302d0697c68a30f906bd01591c49b1813d7ea7fa49376f99bece3a2ae2 Oct 12 05:59:07 crc kubenswrapper[4930]: I1012 05:59:07.715035 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-h2tnf"] Oct 12 05:59:07 crc kubenswrapper[4930]: I1012 05:59:07.742679 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 12 05:59:07 crc kubenswrapper[4930]: W1012 05:59:07.753988 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb228afd2_c418_49a3_97d4_35a298e324a6.slice/crio-e8580725cb515cd77f4046ccb47fb4986e937f01de9bdf11aa3e74580113c403 WatchSource:0}: Error 
finding container e8580725cb515cd77f4046ccb47fb4986e937f01de9bdf11aa3e74580113c403: Status 404 returned error can't find the container with id e8580725cb515cd77f4046ccb47fb4986e937f01de9bdf11aa3e74580113c403 Oct 12 05:59:07 crc kubenswrapper[4930]: W1012 05:59:07.755581 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f64aa33_3396_4700_bb0f_57b16e39e368.slice/crio-8a152e29d5ae29833f14b5cc8de4880c5e8cda4535d42bb3040c5018fe681ec2 WatchSource:0}: Error finding container 8a152e29d5ae29833f14b5cc8de4880c5e8cda4535d42bb3040c5018fe681ec2: Status 404 returned error can't find the container with id 8a152e29d5ae29833f14b5cc8de4880c5e8cda4535d42bb3040c5018fe681ec2 Oct 12 05:59:07 crc kubenswrapper[4930]: I1012 05:59:07.800039 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-578598f949-m7h9x" Oct 12 05:59:07 crc kubenswrapper[4930]: E1012 05:59:07.820051 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/horizon-86b8d48789-7rd2m" podUID="9dd57cc3-6793-42f9-b938-620f968192c3" Oct 12 05:59:07 crc kubenswrapper[4930]: I1012 05:59:07.912716 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7648c6b969-m7fgp"] Oct 12 05:59:07 crc kubenswrapper[4930]: I1012 05:59:07.966165 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-ovsdbserver-sb\") pod \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\" (UID: \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\") " Oct 12 05:59:07 crc kubenswrapper[4930]: I1012 05:59:07.966232 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-dns-swift-storage-0\") pod \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\" (UID: \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\") " Oct 12 05:59:07 crc kubenswrapper[4930]: I1012 05:59:07.966296 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hwnm\" (UniqueName: \"kubernetes.io/projected/48b62183-7a23-4f98-aef5-4b6b6f89bd53-kube-api-access-8hwnm\") pod \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\" (UID: \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\") " Oct 12 05:59:07 crc kubenswrapper[4930]: I1012 05:59:07.966316 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-config\") pod \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\" (UID: \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\") " Oct 12 05:59:07 crc kubenswrapper[4930]: I1012 05:59:07.966334 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-dns-svc\") pod \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\" (UID: \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\") " Oct 12 05:59:07 crc kubenswrapper[4930]: I1012 05:59:07.966426 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-ovsdbserver-nb\") pod \"48b62183-7a23-4f98-aef5-4b6b6f89bd53\" (UID: 
\"48b62183-7a23-4f98-aef5-4b6b6f89bd53\") " Oct 12 05:59:07 crc kubenswrapper[4930]: I1012 05:59:07.993689 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48b62183-7a23-4f98-aef5-4b6b6f89bd53-kube-api-access-8hwnm" (OuterVolumeSpecName: "kube-api-access-8hwnm") pod "48b62183-7a23-4f98-aef5-4b6b6f89bd53" (UID: "48b62183-7a23-4f98-aef5-4b6b6f89bd53"). InnerVolumeSpecName "kube-api-access-8hwnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.007680 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-578d784664-rp79z"] Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.068574 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hwnm\" (UniqueName: \"kubernetes.io/projected/48b62183-7a23-4f98-aef5-4b6b6f89bd53-kube-api-access-8hwnm\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.310725 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "48b62183-7a23-4f98-aef5-4b6b6f89bd53" (UID: "48b62183-7a23-4f98-aef5-4b6b6f89bd53"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.377335 4930 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.380216 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "48b62183-7a23-4f98-aef5-4b6b6f89bd53" (UID: "48b62183-7a23-4f98-aef5-4b6b6f89bd53"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.395386 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "48b62183-7a23-4f98-aef5-4b6b6f89bd53" (UID: "48b62183-7a23-4f98-aef5-4b6b6f89bd53"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.399807 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-86b8d48789-7rd2m" podUID="9dd57cc3-6793-42f9-b938-620f968192c3" containerName="horizon" containerID="cri-o://8392226d6c0e7f43f3b4908ab41afde31479dee244db52b006703321a58fafe6" gracePeriod=30 Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.403917 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=27.595116141 podStartE2EDuration="47.403899566s" podCreationTimestamp="2025-10-12 05:58:21 +0000 UTC" firstStartedPulling="2025-10-12 05:58:23.03885961 +0000 UTC m=+1035.580961375" lastFinishedPulling="2025-10-12 05:58:42.847642995 +0000 UTC m=+1055.389744800" observedRunningTime="2025-10-12 05:59:08.391129458 +0000 UTC m=+1080.933231233" watchObservedRunningTime="2025-10-12 05:59:08.403899566 +0000 UTC m=+1080.946001331" Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.406423 4930 generic.go:334] "Generic (PLEG): container finished" podID="48b62183-7a23-4f98-aef5-4b6b6f89bd53" containerID="19232370521bd46be2071f0bc663623c9c9933624af52d1c1c4a6258a19863f8" exitCode=0 Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.406479 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "48b62183-7a23-4f98-aef5-4b6b6f89bd53" (UID: "48b62183-7a23-4f98-aef5-4b6b6f89bd53"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.406567 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-578598f949-m7h9x" Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.420891 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-config" (OuterVolumeSpecName: "config") pod "48b62183-7a23-4f98-aef5-4b6b6f89bd53" (UID: "48b62183-7a23-4f98-aef5-4b6b6f89bd53"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.443479 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=42.685310307 podStartE2EDuration="47.443459741s" podCreationTimestamp="2025-10-12 05:58:21 +0000 UTC" firstStartedPulling="2025-10-12 05:58:23.070625283 +0000 UTC m=+1035.612727048" lastFinishedPulling="2025-10-12 05:58:27.828774717 +0000 UTC m=+1040.370876482" observedRunningTime="2025-10-12 05:59:08.440181769 +0000 UTC m=+1080.982283534" watchObservedRunningTime="2025-10-12 05:59:08.443459741 +0000 UTC m=+1080.985561506" Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.479991 4930 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.481089 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.481118 4930 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.481131 4930 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48b62183-7a23-4f98-aef5-4b6b6f89bd53-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.511885 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6844c9655c-rvdcz"] Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.511962 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-h2tnf" event={"ID":"b228afd2-c418-49a3-97d4-35a298e324a6","Type":"ContainerStarted","Data":"e8580725cb515cd77f4046ccb47fb4986e937f01de9bdf11aa3e74580113c403"} Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.511986 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d76466876-jf9t8" event={"ID":"a97771f5-bcbe-42d8-bdd8-41b43f8899a0","Type":"ContainerStarted","Data":"62c33a302d0697c68a30f906bd01591c49b1813d7ea7fa49376f99bece3a2ae2"} Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.512024 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6df4dd4f95-cwhwt" event={"ID":"f549aa26-b902-4497-838b-6b80e635897c","Type":"ContainerStarted","Data":"8e39d8ecb42b78b16591060ef87ea5c3018f7d42f7b006b9f645b2a67c37e569"} Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.512037 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"934c43b8-93c5-442d-ba52-2e3bfc2d8f35","Type":"ContainerStarted","Data":"aeffaa8d356815b890e1800cedd6d8c251abf5043b47dcbe6b2830f01ca890c8"} Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.512054 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-sdmls" event={"ID":"ea4a96dd-8928-4248-b294-1b0b6413abef","Type":"ContainerStarted","Data":"1479b3928a1c33b74735ec97bccec250012d2096e39d5f6687fc2fb1e8bcd0d1"} Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.512115 4930 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/horizon-84fd96956f-5sqc8" event={"ID":"b69d2329-3396-4e67-9880-940dacef7e56","Type":"ContainerStarted","Data":"47c711d16000c5d14e73813bcff6b60f98928393a79b11066202e3ecdf688e9e"} Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.512130 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" event={"ID":"39903c9a-2fa6-4fb3-8868-b2c055bdd11c","Type":"ContainerStarted","Data":"64db9d9334415a54f565981fde31abf2adda6d6feef6076aa66d8db067199175"} Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.512145 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6844c9655c-rvdcz" event={"ID":"532bae95-f9fc-4633-b53c-2f398cbb8bd2","Type":"ContainerStarted","Data":"7662eab89cf41f2b4b09e18a7d08ed1e32cf130381310b4c4f18863b8081509b"} Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.513691 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6778cd8bb8-9zhz5" event={"ID":"30049dfb-04f2-455c-a949-08bd6ff892d0","Type":"ContainerStarted","Data":"ac8297145e46be91604457a14764e05292e58540ab5b16ca150f8ac1bf5273b1"} Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.513704 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86b8d48789-7rd2m" event={"ID":"9dd57cc3-6793-42f9-b938-620f968192c3","Type":"ContainerStarted","Data":"8392226d6c0e7f43f3b4908ab41afde31479dee244db52b006703321a58fafe6"} Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.513715 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578598f949-m7h9x" event={"ID":"48b62183-7a23-4f98-aef5-4b6b6f89bd53","Type":"ContainerDied","Data":"19232370521bd46be2071f0bc663623c9c9933624af52d1c1c4a6258a19863f8"} Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.513731 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578598f949-m7h9x" event={"ID":"48b62183-7a23-4f98-aef5-4b6b6f89bd53","Type":"ContainerDied","Data":"59e9df2e208b82763c4f091c6436715552273d7c5303eb5475ace56cfeb0aedf"} Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.513754 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"8f64aa33-3396-4700-bb0f-57b16e39e368","Type":"ContainerStarted","Data":"8a152e29d5ae29833f14b5cc8de4880c5e8cda4535d42bb3040c5018fe681ec2"} Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.513764 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed","Type":"ContainerStarted","Data":"42eb9ff39af53c03159df9698802c263efb99e17ab668023f32c7e55890df52f"} Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.513777 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-578d784664-rp79z" event={"ID":"c9000baf-8bd7-4d76-a2c8-b98ad280728d","Type":"ContainerStarted","Data":"4cdb28b7d3b50e5ce551554a7ab24221224103e0f0e792b646717bca1024b489"} Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.513787 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"dda9904e-b764-43a5-83d2-5c993023f740","Type":"ContainerStarted","Data":"967ac9781fbc3313b269597b25fe58509982553e5823ced10136d6d47186803c"} Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.513809 4930 scope.go:117] "RemoveContainer" containerID="19232370521bd46be2071f0bc663623c9c9933624af52d1c1c4a6258a19863f8" Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.559379 
4930 scope.go:117] "RemoveContainer" containerID="1fda28ffc05e5ae6a1e31aaa8d875bb93ce176fa2c5d3bafa5cdee2f7273ca3a" Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.605165 4930 scope.go:117] "RemoveContainer" containerID="19232370521bd46be2071f0bc663623c9c9933624af52d1c1c4a6258a19863f8" Oct 12 05:59:08 crc kubenswrapper[4930]: E1012 05:59:08.605589 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19232370521bd46be2071f0bc663623c9c9933624af52d1c1c4a6258a19863f8\": container with ID starting with 19232370521bd46be2071f0bc663623c9c9933624af52d1c1c4a6258a19863f8 not found: ID does not exist" containerID="19232370521bd46be2071f0bc663623c9c9933624af52d1c1c4a6258a19863f8" Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.605635 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19232370521bd46be2071f0bc663623c9c9933624af52d1c1c4a6258a19863f8"} err="failed to get container status \"19232370521bd46be2071f0bc663623c9c9933624af52d1c1c4a6258a19863f8\": rpc error: code = NotFound desc = could not find container \"19232370521bd46be2071f0bc663623c9c9933624af52d1c1c4a6258a19863f8\": container with ID starting with 19232370521bd46be2071f0bc663623c9c9933624af52d1c1c4a6258a19863f8 not found: ID does not exist" Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.605658 4930 scope.go:117] "RemoveContainer" containerID="1fda28ffc05e5ae6a1e31aaa8d875bb93ce176fa2c5d3bafa5cdee2f7273ca3a" Oct 12 05:59:08 crc kubenswrapper[4930]: E1012 05:59:08.605871 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fda28ffc05e5ae6a1e31aaa8d875bb93ce176fa2c5d3bafa5cdee2f7273ca3a\": container with ID starting with 1fda28ffc05e5ae6a1e31aaa8d875bb93ce176fa2c5d3bafa5cdee2f7273ca3a not found: ID does not exist" containerID="1fda28ffc05e5ae6a1e31aaa8d875bb93ce176fa2c5d3bafa5cdee2f7273ca3a" Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.605908 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fda28ffc05e5ae6a1e31aaa8d875bb93ce176fa2c5d3bafa5cdee2f7273ca3a"} err="failed to get container status \"1fda28ffc05e5ae6a1e31aaa8d875bb93ce176fa2c5d3bafa5cdee2f7273ca3a\": rpc error: code = NotFound desc = could not find container \"1fda28ffc05e5ae6a1e31aaa8d875bb93ce176fa2c5d3bafa5cdee2f7273ca3a\": container with ID starting with 1fda28ffc05e5ae6a1e31aaa8d875bb93ce176fa2c5d3bafa5cdee2f7273ca3a not found: ID does not exist" Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.756987 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-sdmls" podStartSLOduration=3.449184883 podStartE2EDuration="46.756970275s" podCreationTimestamp="2025-10-12 05:58:22 +0000 UTC" firstStartedPulling="2025-10-12 05:58:24.082376438 +0000 UTC m=+1036.624478203" lastFinishedPulling="2025-10-12 05:59:07.39016183 +0000 UTC m=+1079.932263595" observedRunningTime="2025-10-12 05:59:08.53904263 +0000 UTC m=+1081.081144395" watchObservedRunningTime="2025-10-12 05:59:08.756970275 +0000 UTC m=+1081.299072040" Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.758574 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-578598f949-m7h9x"] Oct 12 05:59:08 crc kubenswrapper[4930]: I1012 05:59:08.765899 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-578598f949-m7h9x"] Oct 12 05:59:09 
crc kubenswrapper[4930]: I1012 05:59:09.432526 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6844c9655c-rvdcz" event={"ID":"532bae95-f9fc-4633-b53c-2f398cbb8bd2","Type":"ContainerStarted","Data":"4df9ad6bdabac7d2107da19314ef0958c629e4348515d5a5b59abbd7fd1db770"} Oct 12 05:59:09 crc kubenswrapper[4930]: I1012 05:59:09.434423 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d76466876-jf9t8" event={"ID":"a97771f5-bcbe-42d8-bdd8-41b43f8899a0","Type":"ContainerStarted","Data":"20d53d3e26c17f4738661e085e461e1ccc8ad0958130e9ac7a0837b1b0e69ebf"} Oct 12 05:59:10 crc kubenswrapper[4930]: I1012 05:59:10.160115 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48b62183-7a23-4f98-aef5-4b6b6f89bd53" path="/var/lib/kubelet/pods/48b62183-7a23-4f98-aef5-4b6b6f89bd53/volumes" Oct 12 05:59:11 crc kubenswrapper[4930]: I1012 05:59:11.463594 4930 generic.go:334] "Generic (PLEG): container finished" podID="39903c9a-2fa6-4fb3-8868-b2c055bdd11c" containerID="2e2dc4254cc94c380ea3d9e5b59ed4c5af7e5369610b130b2ff2049d1bf5c49e" exitCode=0 Oct 12 05:59:11 crc kubenswrapper[4930]: I1012 05:59:11.463698 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" event={"ID":"39903c9a-2fa6-4fb3-8868-b2c055bdd11c","Type":"ContainerDied","Data":"2e2dc4254cc94c380ea3d9e5b59ed4c5af7e5369610b130b2ff2049d1bf5c49e"} Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.107475 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.108965 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.200913 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.201050 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.219984 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.477465 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"8f64aa33-3396-4700-bb0f-57b16e39e368","Type":"ContainerStarted","Data":"2d144f341e028cb128d6fc363b8ab8768ff57f1a0e734f0e3a9ab7fead6f1aae"} Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.480404 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84fd96956f-5sqc8" event={"ID":"b69d2329-3396-4e67-9880-940dacef7e56","Type":"ContainerStarted","Data":"d3b1d7f06ed0fc2481170337c30d12182ca4b0c017afe2dc33827d41b0c934d2"} Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.480538 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-84fd96956f-5sqc8" podUID="b69d2329-3396-4e67-9880-940dacef7e56" containerName="horizon-log" containerID="cri-o://47c711d16000c5d14e73813bcff6b60f98928393a79b11066202e3ecdf688e9e" gracePeriod=30 Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.480572 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-84fd96956f-5sqc8" podUID="b69d2329-3396-4e67-9880-940dacef7e56" containerName="horizon" 
containerID="cri-o://d3b1d7f06ed0fc2481170337c30d12182ca4b0c017afe2dc33827d41b0c934d2" gracePeriod=30 Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.482611 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-h2tnf" event={"ID":"b228afd2-c418-49a3-97d4-35a298e324a6","Type":"ContainerStarted","Data":"3f438248d7c6cec3c44e79f882a5dd6fa35ec572adbbd2750232f6baa074b758"} Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.485676 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6844c9655c-rvdcz" event={"ID":"532bae95-f9fc-4633-b53c-2f398cbb8bd2","Type":"ContainerStarted","Data":"875d5bb2728b4c2063add5800b1a3b9d0cc1cd1048eff011fded1c5370ca9f8d"} Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.485933 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6844c9655c-rvdcz" Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.488902 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6778cd8bb8-9zhz5" event={"ID":"30049dfb-04f2-455c-a949-08bd6ff892d0","Type":"ContainerStarted","Data":"61889e483f0c5377a242095541e1e60b08683d4d632d130c12343e4cefb25513"} Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.490548 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-578d784664-rp79z" event={"ID":"c9000baf-8bd7-4d76-a2c8-b98ad280728d","Type":"ContainerStarted","Data":"ed5390836965969e0f55b579f3d611d101a852bbae842bf2a578de46b26a78e6"} Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.492624 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6df4dd4f95-cwhwt" event={"ID":"f549aa26-b902-4497-838b-6b80e635897c","Type":"ContainerStarted","Data":"7383ee320f6575bed0c7fd6bd7510da07b91c2256f049f463e141453b7afb156"} Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.492699 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6df4dd4f95-cwhwt" podUID="f549aa26-b902-4497-838b-6b80e635897c" containerName="horizon" containerID="cri-o://7383ee320f6575bed0c7fd6bd7510da07b91c2256f049f463e141453b7afb156" gracePeriod=30 Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.492874 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6df4dd4f95-cwhwt" podUID="f549aa26-b902-4497-838b-6b80e635897c" containerName="horizon-log" containerID="cri-o://8e39d8ecb42b78b16591060ef87ea5c3018f7d42f7b006b9f645b2a67c37e569" gracePeriod=30 Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.502024 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-84fd96956f-5sqc8" podStartSLOduration=13.914938908 podStartE2EDuration="51.502006222s" podCreationTimestamp="2025-10-12 05:58:21 +0000 UTC" firstStartedPulling="2025-10-12 05:58:23.394129646 +0000 UTC m=+1035.936231411" lastFinishedPulling="2025-10-12 05:59:00.98119694 +0000 UTC m=+1073.523298725" observedRunningTime="2025-10-12 05:59:12.496361381 +0000 UTC m=+1085.038463146" watchObservedRunningTime="2025-10-12 05:59:12.502006222 +0000 UTC m=+1085.044107987" Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.505485 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jj2m5" event={"ID":"448db83a-c0af-4680-890f-24b8d8da1088","Type":"ContainerStarted","Data":"1b9aba71eeec6c8e02392ccc8b7f58218c5cc9ebe67585f1ff3bd806865a7174"} Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.508808 4930 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d76466876-jf9t8" event={"ID":"a97771f5-bcbe-42d8-bdd8-41b43f8899a0","Type":"ContainerStarted","Data":"fcf0fd6a6c8508bec65a26311025a8d2d5b30d016bd0e661420a0021c108b4ad"} Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.509605 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.538904 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6778cd8bb8-9zhz5" podStartSLOduration=42.53888493 podStartE2EDuration="42.53888493s" podCreationTimestamp="2025-10-12 05:58:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:59:12.52362923 +0000 UTC m=+1085.065730995" watchObservedRunningTime="2025-10-12 05:59:12.53888493 +0000 UTC m=+1085.080986695" Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.548233 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6df4dd4f95-cwhwt" podStartSLOduration=8.101828003 podStartE2EDuration="51.548216322s" podCreationTimestamp="2025-10-12 05:58:21 +0000 UTC" firstStartedPulling="2025-10-12 05:58:23.78304494 +0000 UTC m=+1036.325146705" lastFinishedPulling="2025-10-12 05:59:07.229433259 +0000 UTC m=+1079.771535024" observedRunningTime="2025-10-12 05:59:12.538672774 +0000 UTC m=+1085.080774539" watchObservedRunningTime="2025-10-12 05:59:12.548216322 +0000 UTC m=+1085.090318077" Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.553573 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.566868 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6844c9655c-rvdcz" podStartSLOduration=8.566847736 podStartE2EDuration="8.566847736s" podCreationTimestamp="2025-10-12 05:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:59:12.558596 +0000 UTC m=+1085.100697765" watchObservedRunningTime="2025-10-12 05:59:12.566847736 +0000 UTC m=+1085.108949501" Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.573865 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-h2tnf" podStartSLOduration=32.57384028 podStartE2EDuration="32.57384028s" podCreationTimestamp="2025-10-12 05:58:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:59:12.573475721 +0000 UTC m=+1085.115577486" watchObservedRunningTime="2025-10-12 05:59:12.57384028 +0000 UTC m=+1085.115942045" Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.575939 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6df4dd4f95-cwhwt" Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.605786 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.612101 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.621441 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/horizon-6d76466876-jf9t8" podStartSLOduration=42.621426954 podStartE2EDuration="42.621426954s" podCreationTimestamp="2025-10-12 05:58:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:59:12.614139283 +0000 UTC m=+1085.156241048" watchObservedRunningTime="2025-10-12 05:59:12.621426954 +0000 UTC m=+1085.163528719" Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.637425 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-jj2m5" podStartSLOduration=7.075054252 podStartE2EDuration="53.637408922s" podCreationTimestamp="2025-10-12 05:58:19 +0000 UTC" firstStartedPulling="2025-10-12 05:58:21.041453969 +0000 UTC m=+1033.583555734" lastFinishedPulling="2025-10-12 05:59:07.603808639 +0000 UTC m=+1080.145910404" observedRunningTime="2025-10-12 05:59:12.635272309 +0000 UTC m=+1085.177374074" watchObservedRunningTime="2025-10-12 05:59:12.637408922 +0000 UTC m=+1085.179510677" Oct 12 05:59:12 crc kubenswrapper[4930]: I1012 05:59:12.686903 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Oct 12 05:59:13 crc kubenswrapper[4930]: I1012 05:59:13.519892 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" event={"ID":"39903c9a-2fa6-4fb3-8868-b2c055bdd11c","Type":"ContainerStarted","Data":"108bb256a26485ad47d064e75e6eebb87735e2755bc7af4a1c379f1e6d10355b"} Oct 12 05:59:13 crc kubenswrapper[4930]: I1012 05:59:13.520407 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" Oct 12 05:59:13 crc kubenswrapper[4930]: I1012 05:59:13.522581 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dmtxt" event={"ID":"ce22649f-fe81-4c02-a26a-15e45e306b82","Type":"ContainerStarted","Data":"148e80f03a3b8ca1088059a87fb924230d189b4c9191ed927d45cd9215c559c8"} Oct 12 05:59:13 crc kubenswrapper[4930]: I1012 05:59:13.526053 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-578d784664-rp79z" event={"ID":"c9000baf-8bd7-4d76-a2c8-b98ad280728d","Type":"ContainerStarted","Data":"64524e145955403b7c027d759d117e90588d66231377c4ce4125c8a86458ba4b"} Oct 12 05:59:13 crc kubenswrapper[4930]: I1012 05:59:13.526124 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-578d784664-rp79z" Oct 12 05:59:13 crc kubenswrapper[4930]: I1012 05:59:13.528245 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"8f64aa33-3396-4700-bb0f-57b16e39e368","Type":"ContainerStarted","Data":"9b4e5324382ab8226f2902432ecafac57ec1761f1c6be848d7be65b1937d48bb"} Oct 12 05:59:13 crc kubenswrapper[4930]: I1012 05:59:13.553884 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" podStartSLOduration=11.553866076 podStartE2EDuration="11.553866076s" podCreationTimestamp="2025-10-12 05:59:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:59:13.548550464 +0000 UTC m=+1086.090652229" watchObservedRunningTime="2025-10-12 05:59:13.553866076 +0000 UTC m=+1086.095967841" Oct 12 05:59:13 crc kubenswrapper[4930]: I1012 05:59:13.567764 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-578d784664-rp79z" 
podStartSLOduration=11.567730361 podStartE2EDuration="11.567730361s" podCreationTimestamp="2025-10-12 05:59:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:59:13.563033474 +0000 UTC m=+1086.105135239" watchObservedRunningTime="2025-10-12 05:59:13.567730361 +0000 UTC m=+1086.109832116" Oct 12 05:59:13 crc kubenswrapper[4930]: I1012 05:59:13.588590 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=19.58857618 podStartE2EDuration="19.58857618s" podCreationTimestamp="2025-10-12 05:58:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:59:13.584594111 +0000 UTC m=+1086.126695876" watchObservedRunningTime="2025-10-12 05:59:13.58857618 +0000 UTC m=+1086.130677945" Oct 12 05:59:13 crc kubenswrapper[4930]: I1012 05:59:13.605055 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-dmtxt" podStartSLOduration=3.404881379 podStartE2EDuration="52.60503372s" podCreationTimestamp="2025-10-12 05:58:21 +0000 UTC" firstStartedPulling="2025-10-12 05:58:23.766415335 +0000 UTC m=+1036.308517100" lastFinishedPulling="2025-10-12 05:59:12.966567676 +0000 UTC m=+1085.508669441" observedRunningTime="2025-10-12 05:59:13.597592905 +0000 UTC m=+1086.139694660" watchObservedRunningTime="2025-10-12 05:59:13.60503372 +0000 UTC m=+1086.147135485" Oct 12 05:59:14 crc kubenswrapper[4930]: I1012 05:59:14.544499 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 12 05:59:14 crc kubenswrapper[4930]: I1012 05:59:14.544816 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Oct 12 05:59:14 crc kubenswrapper[4930]: I1012 05:59:14.544831 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 12 05:59:14 crc kubenswrapper[4930]: I1012 05:59:14.555962 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="934c43b8-93c5-442d-ba52-2e3bfc2d8f35" containerName="watcher-applier" containerID="cri-o://aeffaa8d356815b890e1800cedd6d8c251abf5043b47dcbe6b2830f01ca890c8" gracePeriod=30 Oct 12 05:59:14 crc kubenswrapper[4930]: I1012 05:59:14.560656 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="dda9904e-b764-43a5-83d2-5c993023f740" containerName="watcher-decision-engine" containerID="cri-o://967ac9781fbc3313b269597b25fe58509982553e5823ced10136d6d47186803c" gracePeriod=30 Oct 12 05:59:15 crc kubenswrapper[4930]: I1012 05:59:15.514994 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-86b8d48789-7rd2m" Oct 12 05:59:15 crc kubenswrapper[4930]: I1012 05:59:15.569592 4930 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 12 05:59:15 crc kubenswrapper[4930]: I1012 05:59:15.588480 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/watcher-api-0" podUID="8f64aa33-3396-4700-bb0f-57b16e39e368" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 12 05:59:16 crc kubenswrapper[4930]: I1012 05:59:16.581421 4930 generic.go:334] "Generic 
(PLEG): container finished" podID="dda9904e-b764-43a5-83d2-5c993023f740" containerID="967ac9781fbc3313b269597b25fe58509982553e5823ced10136d6d47186803c" exitCode=1 Oct 12 05:59:16 crc kubenswrapper[4930]: I1012 05:59:16.581568 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"dda9904e-b764-43a5-83d2-5c993023f740","Type":"ContainerDied","Data":"967ac9781fbc3313b269597b25fe58509982553e5823ced10136d6d47186803c"} Oct 12 05:59:16 crc kubenswrapper[4930]: I1012 05:59:16.591865 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bg9r7" event={"ID":"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed","Type":"ContainerStarted","Data":"5c5db18e20e927f2b3f8fc81b314b0d8b00783743c0b89453be15b7b032c587b"} Oct 12 05:59:16 crc kubenswrapper[4930]: I1012 05:59:16.598114 4930 generic.go:334] "Generic (PLEG): container finished" podID="934c43b8-93c5-442d-ba52-2e3bfc2d8f35" containerID="aeffaa8d356815b890e1800cedd6d8c251abf5043b47dcbe6b2830f01ca890c8" exitCode=0 Oct 12 05:59:16 crc kubenswrapper[4930]: I1012 05:59:16.598179 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"934c43b8-93c5-442d-ba52-2e3bfc2d8f35","Type":"ContainerDied","Data":"aeffaa8d356815b890e1800cedd6d8c251abf5043b47dcbe6b2830f01ca890c8"} Oct 12 05:59:16 crc kubenswrapper[4930]: I1012 05:59:16.604404 4930 generic.go:334] "Generic (PLEG): container finished" podID="ea4a96dd-8928-4248-b294-1b0b6413abef" containerID="1479b3928a1c33b74735ec97bccec250012d2096e39d5f6687fc2fb1e8bcd0d1" exitCode=0 Oct 12 05:59:16 crc kubenswrapper[4930]: I1012 05:59:16.604439 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-sdmls" event={"ID":"ea4a96dd-8928-4248-b294-1b0b6413abef","Type":"ContainerDied","Data":"1479b3928a1c33b74735ec97bccec250012d2096e39d5f6687fc2fb1e8bcd0d1"} Oct 12 05:59:16 crc kubenswrapper[4930]: I1012 05:59:16.619664 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-bg9r7" podStartSLOduration=5.200539569 podStartE2EDuration="55.619646264s" podCreationTimestamp="2025-10-12 05:58:21 +0000 UTC" firstStartedPulling="2025-10-12 05:58:23.932213862 +0000 UTC m=+1036.474315627" lastFinishedPulling="2025-10-12 05:59:14.351320557 +0000 UTC m=+1086.893422322" observedRunningTime="2025-10-12 05:59:16.614982868 +0000 UTC m=+1089.157084633" watchObservedRunningTime="2025-10-12 05:59:16.619646264 +0000 UTC m=+1089.161748029" Oct 12 05:59:17 crc kubenswrapper[4930]: E1012 05:59:17.108006 4930 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of aeffaa8d356815b890e1800cedd6d8c251abf5043b47dcbe6b2830f01ca890c8 is running failed: container process not found" containerID="aeffaa8d356815b890e1800cedd6d8c251abf5043b47dcbe6b2830f01ca890c8" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 12 05:59:17 crc kubenswrapper[4930]: E1012 05:59:17.108834 4930 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of aeffaa8d356815b890e1800cedd6d8c251abf5043b47dcbe6b2830f01ca890c8 is running failed: container process not found" containerID="aeffaa8d356815b890e1800cedd6d8c251abf5043b47dcbe6b2830f01ca890c8" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 12 05:59:17 crc kubenswrapper[4930]: E1012 05:59:17.109136 4930 log.go:32] "ExecSync cmd from runtime 
service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of aeffaa8d356815b890e1800cedd6d8c251abf5043b47dcbe6b2830f01ca890c8 is running failed: container process not found" containerID="aeffaa8d356815b890e1800cedd6d8c251abf5043b47dcbe6b2830f01ca890c8" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 12 05:59:17 crc kubenswrapper[4930]: E1012 05:59:17.109203 4930 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of aeffaa8d356815b890e1800cedd6d8c251abf5043b47dcbe6b2830f01ca890c8 is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="934c43b8-93c5-442d-ba52-2e3bfc2d8f35" containerName="watcher-applier" Oct 12 05:59:17 crc kubenswrapper[4930]: I1012 05:59:17.408384 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.341239 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-sdmls" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.455677 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea4a96dd-8928-4248-b294-1b0b6413abef-combined-ca-bundle\") pod \"ea4a96dd-8928-4248-b294-1b0b6413abef\" (UID: \"ea4a96dd-8928-4248-b294-1b0b6413abef\") " Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.455728 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea4a96dd-8928-4248-b294-1b0b6413abef-config-data\") pod \"ea4a96dd-8928-4248-b294-1b0b6413abef\" (UID: \"ea4a96dd-8928-4248-b294-1b0b6413abef\") " Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.455788 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbhtq\" (UniqueName: \"kubernetes.io/projected/ea4a96dd-8928-4248-b294-1b0b6413abef-kube-api-access-mbhtq\") pod \"ea4a96dd-8928-4248-b294-1b0b6413abef\" (UID: \"ea4a96dd-8928-4248-b294-1b0b6413abef\") " Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.455828 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea4a96dd-8928-4248-b294-1b0b6413abef-logs\") pod \"ea4a96dd-8928-4248-b294-1b0b6413abef\" (UID: \"ea4a96dd-8928-4248-b294-1b0b6413abef\") " Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.455945 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea4a96dd-8928-4248-b294-1b0b6413abef-scripts\") pod \"ea4a96dd-8928-4248-b294-1b0b6413abef\" (UID: \"ea4a96dd-8928-4248-b294-1b0b6413abef\") " Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.456936 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea4a96dd-8928-4248-b294-1b0b6413abef-logs" (OuterVolumeSpecName: "logs") pod "ea4a96dd-8928-4248-b294-1b0b6413abef" (UID: "ea4a96dd-8928-4248-b294-1b0b6413abef"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.471857 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea4a96dd-8928-4248-b294-1b0b6413abef-scripts" (OuterVolumeSpecName: "scripts") pod "ea4a96dd-8928-4248-b294-1b0b6413abef" (UID: "ea4a96dd-8928-4248-b294-1b0b6413abef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.474518 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea4a96dd-8928-4248-b294-1b0b6413abef-kube-api-access-mbhtq" (OuterVolumeSpecName: "kube-api-access-mbhtq") pod "ea4a96dd-8928-4248-b294-1b0b6413abef" (UID: "ea4a96dd-8928-4248-b294-1b0b6413abef"). InnerVolumeSpecName "kube-api-access-mbhtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.512454 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea4a96dd-8928-4248-b294-1b0b6413abef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea4a96dd-8928-4248-b294-1b0b6413abef" (UID: "ea4a96dd-8928-4248-b294-1b0b6413abef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.514667 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea4a96dd-8928-4248-b294-1b0b6413abef-config-data" (OuterVolumeSpecName: "config-data") pod "ea4a96dd-8928-4248-b294-1b0b6413abef" (UID: "ea4a96dd-8928-4248-b294-1b0b6413abef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.558247 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea4a96dd-8928-4248-b294-1b0b6413abef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.558294 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea4a96dd-8928-4248-b294-1b0b6413abef-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.558307 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbhtq\" (UniqueName: \"kubernetes.io/projected/ea4a96dd-8928-4248-b294-1b0b6413abef-kube-api-access-mbhtq\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.558321 4930 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea4a96dd-8928-4248-b294-1b0b6413abef-logs\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.558331 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea4a96dd-8928-4248-b294-1b0b6413abef-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.620691 4930 generic.go:334] "Generic (PLEG): container finished" podID="b228afd2-c418-49a3-97d4-35a298e324a6" containerID="3f438248d7c6cec3c44e79f882a5dd6fa35ec572adbbd2750232f6baa074b758" exitCode=0 Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.620785 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-h2tnf" 
event={"ID":"b228afd2-c418-49a3-97d4-35a298e324a6","Type":"ContainerDied","Data":"3f438248d7c6cec3c44e79f882a5dd6fa35ec572adbbd2750232f6baa074b758"} Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.622500 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-sdmls" event={"ID":"ea4a96dd-8928-4248-b294-1b0b6413abef","Type":"ContainerDied","Data":"87821483aa2c5cd6a81aa58ed500ecf9a06b15d28bc2905dbe17a2ff794f3a7c"} Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.622529 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87821483aa2c5cd6a81aa58ed500ecf9a06b15d28bc2905dbe17a2ff794f3a7c" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.622582 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-sdmls" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.717106 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-544d94f45b-79l8m"] Oct 12 05:59:18 crc kubenswrapper[4930]: E1012 05:59:18.717517 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4a96dd-8928-4248-b294-1b0b6413abef" containerName="placement-db-sync" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.717532 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4a96dd-8928-4248-b294-1b0b6413abef" containerName="placement-db-sync" Oct 12 05:59:18 crc kubenswrapper[4930]: E1012 05:59:18.717564 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48b62183-7a23-4f98-aef5-4b6b6f89bd53" containerName="dnsmasq-dns" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.717571 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="48b62183-7a23-4f98-aef5-4b6b6f89bd53" containerName="dnsmasq-dns" Oct 12 05:59:18 crc kubenswrapper[4930]: E1012 05:59:18.717591 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48b62183-7a23-4f98-aef5-4b6b6f89bd53" containerName="init" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.717598 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="48b62183-7a23-4f98-aef5-4b6b6f89bd53" containerName="init" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.717823 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea4a96dd-8928-4248-b294-1b0b6413abef" containerName="placement-db-sync" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.717840 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="48b62183-7a23-4f98-aef5-4b6b6f89bd53" containerName="dnsmasq-dns" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.718966 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-544d94f45b-79l8m" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.726867 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.727530 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.727754 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.727891 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.728012 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7gdgz" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.749848 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-544d94f45b-79l8m"] Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.863754 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cfa2a2e-ac4f-415b-9dd2-dabf059ad679-scripts\") pod \"placement-544d94f45b-79l8m\" (UID: \"8cfa2a2e-ac4f-415b-9dd2-dabf059ad679\") " pod="openstack/placement-544d94f45b-79l8m" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.863816 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cfa2a2e-ac4f-415b-9dd2-dabf059ad679-config-data\") pod \"placement-544d94f45b-79l8m\" (UID: \"8cfa2a2e-ac4f-415b-9dd2-dabf059ad679\") " pod="openstack/placement-544d94f45b-79l8m" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.863848 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cfa2a2e-ac4f-415b-9dd2-dabf059ad679-internal-tls-certs\") pod \"placement-544d94f45b-79l8m\" (UID: \"8cfa2a2e-ac4f-415b-9dd2-dabf059ad679\") " pod="openstack/placement-544d94f45b-79l8m" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.863914 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmvrr\" (UniqueName: \"kubernetes.io/projected/8cfa2a2e-ac4f-415b-9dd2-dabf059ad679-kube-api-access-fmvrr\") pod \"placement-544d94f45b-79l8m\" (UID: \"8cfa2a2e-ac4f-415b-9dd2-dabf059ad679\") " pod="openstack/placement-544d94f45b-79l8m" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.863940 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cfa2a2e-ac4f-415b-9dd2-dabf059ad679-logs\") pod \"placement-544d94f45b-79l8m\" (UID: \"8cfa2a2e-ac4f-415b-9dd2-dabf059ad679\") " pod="openstack/placement-544d94f45b-79l8m" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.863954 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cfa2a2e-ac4f-415b-9dd2-dabf059ad679-public-tls-certs\") pod \"placement-544d94f45b-79l8m\" (UID: \"8cfa2a2e-ac4f-415b-9dd2-dabf059ad679\") " pod="openstack/placement-544d94f45b-79l8m" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.863990 4930 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cfa2a2e-ac4f-415b-9dd2-dabf059ad679-combined-ca-bundle\") pod \"placement-544d94f45b-79l8m\" (UID: \"8cfa2a2e-ac4f-415b-9dd2-dabf059ad679\") " pod="openstack/placement-544d94f45b-79l8m" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.968076 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cfa2a2e-ac4f-415b-9dd2-dabf059ad679-scripts\") pod \"placement-544d94f45b-79l8m\" (UID: \"8cfa2a2e-ac4f-415b-9dd2-dabf059ad679\") " pod="openstack/placement-544d94f45b-79l8m" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.968142 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cfa2a2e-ac4f-415b-9dd2-dabf059ad679-config-data\") pod \"placement-544d94f45b-79l8m\" (UID: \"8cfa2a2e-ac4f-415b-9dd2-dabf059ad679\") " pod="openstack/placement-544d94f45b-79l8m" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.968171 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cfa2a2e-ac4f-415b-9dd2-dabf059ad679-internal-tls-certs\") pod \"placement-544d94f45b-79l8m\" (UID: \"8cfa2a2e-ac4f-415b-9dd2-dabf059ad679\") " pod="openstack/placement-544d94f45b-79l8m" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.968232 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmvrr\" (UniqueName: \"kubernetes.io/projected/8cfa2a2e-ac4f-415b-9dd2-dabf059ad679-kube-api-access-fmvrr\") pod \"placement-544d94f45b-79l8m\" (UID: \"8cfa2a2e-ac4f-415b-9dd2-dabf059ad679\") " pod="openstack/placement-544d94f45b-79l8m" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.968263 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cfa2a2e-ac4f-415b-9dd2-dabf059ad679-logs\") pod \"placement-544d94f45b-79l8m\" (UID: \"8cfa2a2e-ac4f-415b-9dd2-dabf059ad679\") " pod="openstack/placement-544d94f45b-79l8m" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.968277 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cfa2a2e-ac4f-415b-9dd2-dabf059ad679-public-tls-certs\") pod \"placement-544d94f45b-79l8m\" (UID: \"8cfa2a2e-ac4f-415b-9dd2-dabf059ad679\") " pod="openstack/placement-544d94f45b-79l8m" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.968313 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cfa2a2e-ac4f-415b-9dd2-dabf059ad679-combined-ca-bundle\") pod \"placement-544d94f45b-79l8m\" (UID: \"8cfa2a2e-ac4f-415b-9dd2-dabf059ad679\") " pod="openstack/placement-544d94f45b-79l8m" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.969024 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cfa2a2e-ac4f-415b-9dd2-dabf059ad679-logs\") pod \"placement-544d94f45b-79l8m\" (UID: \"8cfa2a2e-ac4f-415b-9dd2-dabf059ad679\") " pod="openstack/placement-544d94f45b-79l8m" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.974091 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8cfa2a2e-ac4f-415b-9dd2-dabf059ad679-combined-ca-bundle\") pod \"placement-544d94f45b-79l8m\" (UID: \"8cfa2a2e-ac4f-415b-9dd2-dabf059ad679\") " pod="openstack/placement-544d94f45b-79l8m" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.974132 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cfa2a2e-ac4f-415b-9dd2-dabf059ad679-public-tls-certs\") pod \"placement-544d94f45b-79l8m\" (UID: \"8cfa2a2e-ac4f-415b-9dd2-dabf059ad679\") " pod="openstack/placement-544d94f45b-79l8m" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.974554 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cfa2a2e-ac4f-415b-9dd2-dabf059ad679-config-data\") pod \"placement-544d94f45b-79l8m\" (UID: \"8cfa2a2e-ac4f-415b-9dd2-dabf059ad679\") " pod="openstack/placement-544d94f45b-79l8m" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.986965 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cfa2a2e-ac4f-415b-9dd2-dabf059ad679-scripts\") pod \"placement-544d94f45b-79l8m\" (UID: \"8cfa2a2e-ac4f-415b-9dd2-dabf059ad679\") " pod="openstack/placement-544d94f45b-79l8m" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.991477 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cfa2a2e-ac4f-415b-9dd2-dabf059ad679-internal-tls-certs\") pod \"placement-544d94f45b-79l8m\" (UID: \"8cfa2a2e-ac4f-415b-9dd2-dabf059ad679\") " pod="openstack/placement-544d94f45b-79l8m" Oct 12 05:59:18 crc kubenswrapper[4930]: I1012 05:59:18.991933 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmvrr\" (UniqueName: \"kubernetes.io/projected/8cfa2a2e-ac4f-415b-9dd2-dabf059ad679-kube-api-access-fmvrr\") pod \"placement-544d94f45b-79l8m\" (UID: \"8cfa2a2e-ac4f-415b-9dd2-dabf059ad679\") " pod="openstack/placement-544d94f45b-79l8m" Oct 12 05:59:19 crc kubenswrapper[4930]: I1012 05:59:19.044510 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-544d94f45b-79l8m" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.189312 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.189564 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.314886 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.314935 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.459020 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.469630 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-h2tnf" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.642912 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/934c43b8-93c5-442d-ba52-2e3bfc2d8f35-config-data\") pod \"934c43b8-93c5-442d-ba52-2e3bfc2d8f35\" (UID: \"934c43b8-93c5-442d-ba52-2e3bfc2d8f35\") " Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.642977 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-credential-keys\") pod \"b228afd2-c418-49a3-97d4-35a298e324a6\" (UID: \"b228afd2-c418-49a3-97d4-35a298e324a6\") " Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.643041 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-combined-ca-bundle\") pod \"b228afd2-c418-49a3-97d4-35a298e324a6\" (UID: \"b228afd2-c418-49a3-97d4-35a298e324a6\") " Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.643090 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/934c43b8-93c5-442d-ba52-2e3bfc2d8f35-combined-ca-bundle\") pod \"934c43b8-93c5-442d-ba52-2e3bfc2d8f35\" (UID: \"934c43b8-93c5-442d-ba52-2e3bfc2d8f35\") " Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.643156 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27cst\" (UniqueName: \"kubernetes.io/projected/934c43b8-93c5-442d-ba52-2e3bfc2d8f35-kube-api-access-27cst\") pod \"934c43b8-93c5-442d-ba52-2e3bfc2d8f35\" (UID: \"934c43b8-93c5-442d-ba52-2e3bfc2d8f35\") " Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.643176 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-config-data\") pod \"b228afd2-c418-49a3-97d4-35a298e324a6\" (UID: \"b228afd2-c418-49a3-97d4-35a298e324a6\") " Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.643225 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/934c43b8-93c5-442d-ba52-2e3bfc2d8f35-logs\") pod \"934c43b8-93c5-442d-ba52-2e3bfc2d8f35\" (UID: \"934c43b8-93c5-442d-ba52-2e3bfc2d8f35\") " Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.643241 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-fernet-keys\") pod \"b228afd2-c418-49a3-97d4-35a298e324a6\" (UID: \"b228afd2-c418-49a3-97d4-35a298e324a6\") " Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.643261 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nkgr\" (UniqueName: \"kubernetes.io/projected/b228afd2-c418-49a3-97d4-35a298e324a6-kube-api-access-4nkgr\") pod \"b228afd2-c418-49a3-97d4-35a298e324a6\" (UID: \"b228afd2-c418-49a3-97d4-35a298e324a6\") " Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.643333 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-scripts\") pod \"b228afd2-c418-49a3-97d4-35a298e324a6\" (UID: 
\"b228afd2-c418-49a3-97d4-35a298e324a6\") " Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.644959 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/934c43b8-93c5-442d-ba52-2e3bfc2d8f35-logs" (OuterVolumeSpecName: "logs") pod "934c43b8-93c5-442d-ba52-2e3bfc2d8f35" (UID: "934c43b8-93c5-442d-ba52-2e3bfc2d8f35"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.652023 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/934c43b8-93c5-442d-ba52-2e3bfc2d8f35-kube-api-access-27cst" (OuterVolumeSpecName: "kube-api-access-27cst") pod "934c43b8-93c5-442d-ba52-2e3bfc2d8f35" (UID: "934c43b8-93c5-442d-ba52-2e3bfc2d8f35"). InnerVolumeSpecName "kube-api-access-27cst". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.652478 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-scripts" (OuterVolumeSpecName: "scripts") pod "b228afd2-c418-49a3-97d4-35a298e324a6" (UID: "b228afd2-c418-49a3-97d4-35a298e324a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.653497 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b228afd2-c418-49a3-97d4-35a298e324a6" (UID: "b228afd2-c418-49a3-97d4-35a298e324a6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.655855 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b228afd2-c418-49a3-97d4-35a298e324a6-kube-api-access-4nkgr" (OuterVolumeSpecName: "kube-api-access-4nkgr") pod "b228afd2-c418-49a3-97d4-35a298e324a6" (UID: "b228afd2-c418-49a3-97d4-35a298e324a6"). InnerVolumeSpecName "kube-api-access-4nkgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.681672 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b228afd2-c418-49a3-97d4-35a298e324a6" (UID: "b228afd2-c418-49a3-97d4-35a298e324a6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.702675 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"934c43b8-93c5-442d-ba52-2e3bfc2d8f35","Type":"ContainerDied","Data":"d4cd79f0aac633d971ece62da40822881b6244052665f3617ae36f8870cea846"} Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.702755 4930 scope.go:117] "RemoveContainer" containerID="aeffaa8d356815b890e1800cedd6d8c251abf5043b47dcbe6b2830f01ca890c8" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.702903 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.705956 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-h2tnf" event={"ID":"b228afd2-c418-49a3-97d4-35a298e324a6","Type":"ContainerDied","Data":"e8580725cb515cd77f4046ccb47fb4986e937f01de9bdf11aa3e74580113c403"} Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.706004 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8580725cb515cd77f4046ccb47fb4986e937f01de9bdf11aa3e74580113c403" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.706086 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-h2tnf" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.719355 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b228afd2-c418-49a3-97d4-35a298e324a6" (UID: "b228afd2-c418-49a3-97d4-35a298e324a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.724393 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/934c43b8-93c5-442d-ba52-2e3bfc2d8f35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "934c43b8-93c5-442d-ba52-2e3bfc2d8f35" (UID: "934c43b8-93c5-442d-ba52-2e3bfc2d8f35"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.734927 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-config-data" (OuterVolumeSpecName: "config-data") pod "b228afd2-c418-49a3-97d4-35a298e324a6" (UID: "b228afd2-c418-49a3-97d4-35a298e324a6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.745014 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.745040 4930 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.745049 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.745059 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/934c43b8-93c5-442d-ba52-2e3bfc2d8f35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.745068 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27cst\" (UniqueName: \"kubernetes.io/projected/934c43b8-93c5-442d-ba52-2e3bfc2d8f35-kube-api-access-27cst\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.745077 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.745087 4930 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/934c43b8-93c5-442d-ba52-2e3bfc2d8f35-logs\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.745095 4930 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b228afd2-c418-49a3-97d4-35a298e324a6-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.745103 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nkgr\" (UniqueName: \"kubernetes.io/projected/b228afd2-c418-49a3-97d4-35a298e324a6-kube-api-access-4nkgr\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.746658 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/934c43b8-93c5-442d-ba52-2e3bfc2d8f35-config-data" (OuterVolumeSpecName: "config-data") pod "934c43b8-93c5-442d-ba52-2e3bfc2d8f35" (UID: "934c43b8-93c5-442d-ba52-2e3bfc2d8f35"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:21 crc kubenswrapper[4930]: I1012 05:59:21.847344 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/934c43b8-93c5-442d-ba52-2e3bfc2d8f35-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.047337 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.057934 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"] Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.073456 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Oct 12 05:59:22 crc kubenswrapper[4930]: E1012 05:59:22.074580 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="934c43b8-93c5-442d-ba52-2e3bfc2d8f35" containerName="watcher-applier" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.074604 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="934c43b8-93c5-442d-ba52-2e3bfc2d8f35" containerName="watcher-applier" Oct 12 05:59:22 crc kubenswrapper[4930]: E1012 05:59:22.074628 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b228afd2-c418-49a3-97d4-35a298e324a6" containerName="keystone-bootstrap" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.074636 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="b228afd2-c418-49a3-97d4-35a298e324a6" containerName="keystone-bootstrap" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.074877 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="934c43b8-93c5-442d-ba52-2e3bfc2d8f35" containerName="watcher-applier" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.074898 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="b228afd2-c418-49a3-97d4-35a298e324a6" containerName="keystone-bootstrap" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.075728 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.079782 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.083604 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 12 05:59:22 crc kubenswrapper[4930]: E1012 05:59:22.142120 4930 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 967ac9781fbc3313b269597b25fe58509982553e5823ced10136d6d47186803c is running failed: container process not found" containerID="967ac9781fbc3313b269597b25fe58509982553e5823ced10136d6d47186803c" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Oct 12 05:59:22 crc kubenswrapper[4930]: E1012 05:59:22.143205 4930 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 967ac9781fbc3313b269597b25fe58509982553e5823ced10136d6d47186803c is running failed: container process not found" containerID="967ac9781fbc3313b269597b25fe58509982553e5823ced10136d6d47186803c" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Oct 12 05:59:22 crc kubenswrapper[4930]: E1012 05:59:22.145085 4930 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 967ac9781fbc3313b269597b25fe58509982553e5823ced10136d6d47186803c is running failed: container process not found" containerID="967ac9781fbc3313b269597b25fe58509982553e5823ced10136d6d47186803c" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Oct 12 05:59:22 crc kubenswrapper[4930]: E1012 05:59:22.145121 4930 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 967ac9781fbc3313b269597b25fe58509982553e5823ced10136d6d47186803c is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-decision-engine-0" podUID="dda9904e-b764-43a5-83d2-5c993023f740" containerName="watcher-decision-engine" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.147271 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="934c43b8-93c5-442d-ba52-2e3bfc2d8f35" path="/var/lib/kubelet/pods/934c43b8-93c5-442d-ba52-2e3bfc2d8f35/volumes" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.153223 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/079031ef-591d-44a8-9a65-fdc0eaea1a0d-logs\") pod \"watcher-applier-0\" (UID: \"079031ef-591d-44a8-9a65-fdc0eaea1a0d\") " pod="openstack/watcher-applier-0" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.153280 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfq2b\" (UniqueName: \"kubernetes.io/projected/079031ef-591d-44a8-9a65-fdc0eaea1a0d-kube-api-access-hfq2b\") pod \"watcher-applier-0\" (UID: \"079031ef-591d-44a8-9a65-fdc0eaea1a0d\") " pod="openstack/watcher-applier-0" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.153311 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/079031ef-591d-44a8-9a65-fdc0eaea1a0d-config-data\") pod 
\"watcher-applier-0\" (UID: \"079031ef-591d-44a8-9a65-fdc0eaea1a0d\") " pod="openstack/watcher-applier-0" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.153341 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079031ef-591d-44a8-9a65-fdc0eaea1a0d-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"079031ef-591d-44a8-9a65-fdc0eaea1a0d\") " pod="openstack/watcher-applier-0" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.174975 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-84fd96956f-5sqc8" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.254510 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/079031ef-591d-44a8-9a65-fdc0eaea1a0d-logs\") pod \"watcher-applier-0\" (UID: \"079031ef-591d-44a8-9a65-fdc0eaea1a0d\") " pod="openstack/watcher-applier-0" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.254584 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfq2b\" (UniqueName: \"kubernetes.io/projected/079031ef-591d-44a8-9a65-fdc0eaea1a0d-kube-api-access-hfq2b\") pod \"watcher-applier-0\" (UID: \"079031ef-591d-44a8-9a65-fdc0eaea1a0d\") " pod="openstack/watcher-applier-0" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.254619 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/079031ef-591d-44a8-9a65-fdc0eaea1a0d-config-data\") pod \"watcher-applier-0\" (UID: \"079031ef-591d-44a8-9a65-fdc0eaea1a0d\") " pod="openstack/watcher-applier-0" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.254642 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079031ef-591d-44a8-9a65-fdc0eaea1a0d-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"079031ef-591d-44a8-9a65-fdc0eaea1a0d\") " pod="openstack/watcher-applier-0" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.255214 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/079031ef-591d-44a8-9a65-fdc0eaea1a0d-logs\") pod \"watcher-applier-0\" (UID: \"079031ef-591d-44a8-9a65-fdc0eaea1a0d\") " pod="openstack/watcher-applier-0" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.260489 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079031ef-591d-44a8-9a65-fdc0eaea1a0d-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"079031ef-591d-44a8-9a65-fdc0eaea1a0d\") " pod="openstack/watcher-applier-0" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.263916 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/079031ef-591d-44a8-9a65-fdc0eaea1a0d-config-data\") pod \"watcher-applier-0\" (UID: \"079031ef-591d-44a8-9a65-fdc0eaea1a0d\") " pod="openstack/watcher-applier-0" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.286701 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfq2b\" (UniqueName: \"kubernetes.io/projected/079031ef-591d-44a8-9a65-fdc0eaea1a0d-kube-api-access-hfq2b\") pod \"watcher-applier-0\" (UID: \"079031ef-591d-44a8-9a65-fdc0eaea1a0d\") " pod="openstack/watcher-applier-0" 
Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.392701 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.612055 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-546b85cb56-ln9lt"] Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.613379 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.622383 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.622647 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.622785 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-hkmzw" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.622944 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.623401 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.623514 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.648851 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-546b85cb56-ln9lt"] Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.665684 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/456742ca-6f3a-485a-81ee-2a4d84df38c8-public-tls-certs\") pod \"keystone-546b85cb56-ln9lt\" (UID: \"456742ca-6f3a-485a-81ee-2a4d84df38c8\") " pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.665801 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/456742ca-6f3a-485a-81ee-2a4d84df38c8-internal-tls-certs\") pod \"keystone-546b85cb56-ln9lt\" (UID: \"456742ca-6f3a-485a-81ee-2a4d84df38c8\") " pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.665851 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9456t\" (UniqueName: \"kubernetes.io/projected/456742ca-6f3a-485a-81ee-2a4d84df38c8-kube-api-access-9456t\") pod \"keystone-546b85cb56-ln9lt\" (UID: \"456742ca-6f3a-485a-81ee-2a4d84df38c8\") " pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.665866 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/456742ca-6f3a-485a-81ee-2a4d84df38c8-credential-keys\") pod \"keystone-546b85cb56-ln9lt\" (UID: \"456742ca-6f3a-485a-81ee-2a4d84df38c8\") " pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.665890 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/456742ca-6f3a-485a-81ee-2a4d84df38c8-combined-ca-bundle\") pod \"keystone-546b85cb56-ln9lt\" (UID: \"456742ca-6f3a-485a-81ee-2a4d84df38c8\") " pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.665926 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/456742ca-6f3a-485a-81ee-2a4d84df38c8-scripts\") pod \"keystone-546b85cb56-ln9lt\" (UID: \"456742ca-6f3a-485a-81ee-2a4d84df38c8\") " pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.665947 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/456742ca-6f3a-485a-81ee-2a4d84df38c8-fernet-keys\") pod \"keystone-546b85cb56-ln9lt\" (UID: \"456742ca-6f3a-485a-81ee-2a4d84df38c8\") " pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.665969 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/456742ca-6f3a-485a-81ee-2a4d84df38c8-config-data\") pod \"keystone-546b85cb56-ln9lt\" (UID: \"456742ca-6f3a-485a-81ee-2a4d84df38c8\") " pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.767295 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/456742ca-6f3a-485a-81ee-2a4d84df38c8-public-tls-certs\") pod \"keystone-546b85cb56-ln9lt\" (UID: \"456742ca-6f3a-485a-81ee-2a4d84df38c8\") " pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.767526 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/456742ca-6f3a-485a-81ee-2a4d84df38c8-internal-tls-certs\") pod \"keystone-546b85cb56-ln9lt\" (UID: \"456742ca-6f3a-485a-81ee-2a4d84df38c8\") " pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.767570 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9456t\" (UniqueName: \"kubernetes.io/projected/456742ca-6f3a-485a-81ee-2a4d84df38c8-kube-api-access-9456t\") pod \"keystone-546b85cb56-ln9lt\" (UID: \"456742ca-6f3a-485a-81ee-2a4d84df38c8\") " pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.767587 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/456742ca-6f3a-485a-81ee-2a4d84df38c8-credential-keys\") pod \"keystone-546b85cb56-ln9lt\" (UID: \"456742ca-6f3a-485a-81ee-2a4d84df38c8\") " pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.767613 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/456742ca-6f3a-485a-81ee-2a4d84df38c8-combined-ca-bundle\") pod \"keystone-546b85cb56-ln9lt\" (UID: \"456742ca-6f3a-485a-81ee-2a4d84df38c8\") " pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.767649 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/456742ca-6f3a-485a-81ee-2a4d84df38c8-scripts\") pod \"keystone-546b85cb56-ln9lt\" (UID: \"456742ca-6f3a-485a-81ee-2a4d84df38c8\") " pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.767668 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/456742ca-6f3a-485a-81ee-2a4d84df38c8-fernet-keys\") pod \"keystone-546b85cb56-ln9lt\" (UID: \"456742ca-6f3a-485a-81ee-2a4d84df38c8\") " pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.767690 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/456742ca-6f3a-485a-81ee-2a4d84df38c8-config-data\") pod \"keystone-546b85cb56-ln9lt\" (UID: \"456742ca-6f3a-485a-81ee-2a4d84df38c8\") " pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.785843 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.793929 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/456742ca-6f3a-485a-81ee-2a4d84df38c8-combined-ca-bundle\") pod \"keystone-546b85cb56-ln9lt\" (UID: \"456742ca-6f3a-485a-81ee-2a4d84df38c8\") " pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.799054 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/456742ca-6f3a-485a-81ee-2a4d84df38c8-config-data\") pod \"keystone-546b85cb56-ln9lt\" (UID: \"456742ca-6f3a-485a-81ee-2a4d84df38c8\") " pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.799103 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/456742ca-6f3a-485a-81ee-2a4d84df38c8-public-tls-certs\") pod \"keystone-546b85cb56-ln9lt\" (UID: \"456742ca-6f3a-485a-81ee-2a4d84df38c8\") " pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.799786 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/456742ca-6f3a-485a-81ee-2a4d84df38c8-scripts\") pod \"keystone-546b85cb56-ln9lt\" (UID: \"456742ca-6f3a-485a-81ee-2a4d84df38c8\") " pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.813299 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/456742ca-6f3a-485a-81ee-2a4d84df38c8-fernet-keys\") pod \"keystone-546b85cb56-ln9lt\" (UID: \"456742ca-6f3a-485a-81ee-2a4d84df38c8\") " pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.822021 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/456742ca-6f3a-485a-81ee-2a4d84df38c8-credential-keys\") pod \"keystone-546b85cb56-ln9lt\" (UID: \"456742ca-6f3a-485a-81ee-2a4d84df38c8\") " pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.823230 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9456t\" (UniqueName: 
\"kubernetes.io/projected/456742ca-6f3a-485a-81ee-2a4d84df38c8-kube-api-access-9456t\") pod \"keystone-546b85cb56-ln9lt\" (UID: \"456742ca-6f3a-485a-81ee-2a4d84df38c8\") " pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.824316 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/456742ca-6f3a-485a-81ee-2a4d84df38c8-internal-tls-certs\") pod \"keystone-546b85cb56-ln9lt\" (UID: \"456742ca-6f3a-485a-81ee-2a4d84df38c8\") " pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.827437 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.828119 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.904674 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.905279 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"dda9904e-b764-43a5-83d2-5c993023f740","Type":"ContainerDied","Data":"a303ddb5384b147b5928088f3723f1c39f9fc1dfd698abe27c3ed1e825a82c69"} Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.905313 4930 scope.go:117] "RemoveContainer" containerID="967ac9781fbc3313b269597b25fe58509982553e5823ced10136d6d47186803c" Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.954911 4930 generic.go:334] "Generic (PLEG): container finished" podID="ce22649f-fe81-4c02-a26a-15e45e306b82" containerID="148e80f03a3b8ca1088059a87fb924230d189b4c9191ed927d45cd9215c559c8" exitCode=0 Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.954950 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dmtxt" event={"ID":"ce22649f-fe81-4c02-a26a-15e45e306b82","Type":"ContainerDied","Data":"148e80f03a3b8ca1088059a87fb924230d189b4c9191ed927d45cd9215c559c8"} Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.974706 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76f9c4c8bc-6hjqx"] Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.974966 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" podUID="c538bd3d-6ead-4b75-a12e-327b70390f9c" containerName="dnsmasq-dns" containerID="cri-o://f74502473d6f97afe0249a1d38c4e959958a619305d15a4309737b260c87bd64" gracePeriod=10 Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.980485 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dda9904e-b764-43a5-83d2-5c993023f740-logs\") pod \"dda9904e-b764-43a5-83d2-5c993023f740\" (UID: \"dda9904e-b764-43a5-83d2-5c993023f740\") " Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.980555 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dda9904e-b764-43a5-83d2-5c993023f740-combined-ca-bundle\") pod \"dda9904e-b764-43a5-83d2-5c993023f740\" (UID: \"dda9904e-b764-43a5-83d2-5c993023f740\") " Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.980590 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dda9904e-b764-43a5-83d2-5c993023f740-config-data\") pod \"dda9904e-b764-43a5-83d2-5c993023f740\" (UID: \"dda9904e-b764-43a5-83d2-5c993023f740\") " Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.980676 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvck7\" (UniqueName: \"kubernetes.io/projected/dda9904e-b764-43a5-83d2-5c993023f740-kube-api-access-mvck7\") pod \"dda9904e-b764-43a5-83d2-5c993023f740\" (UID: \"dda9904e-b764-43a5-83d2-5c993023f740\") " Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.980777 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/dda9904e-b764-43a5-83d2-5c993023f740-custom-prometheus-ca\") pod \"dda9904e-b764-43a5-83d2-5c993023f740\" (UID: \"dda9904e-b764-43a5-83d2-5c993023f740\") " Oct 12 05:59:22 crc kubenswrapper[4930]: I1012 05:59:22.984371 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dda9904e-b764-43a5-83d2-5c993023f740-logs" (OuterVolumeSpecName: "logs") pod "dda9904e-b764-43a5-83d2-5c993023f740" (UID: "dda9904e-b764-43a5-83d2-5c993023f740"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.034099 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dda9904e-b764-43a5-83d2-5c993023f740-kube-api-access-mvck7" (OuterVolumeSpecName: "kube-api-access-mvck7") pod "dda9904e-b764-43a5-83d2-5c993023f740" (UID: "dda9904e-b764-43a5-83d2-5c993023f740"). InnerVolumeSpecName "kube-api-access-mvck7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.076904 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dda9904e-b764-43a5-83d2-5c993023f740-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "dda9904e-b764-43a5-83d2-5c993023f740" (UID: "dda9904e-b764-43a5-83d2-5c993023f740"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.092366 4930 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dda9904e-b764-43a5-83d2-5c993023f740-logs\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.092400 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvck7\" (UniqueName: \"kubernetes.io/projected/dda9904e-b764-43a5-83d2-5c993023f740-kube-api-access-mvck7\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.092411 4930 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/dda9904e-b764-43a5-83d2-5c993023f740-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.108102 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dda9904e-b764-43a5-83d2-5c993023f740-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dda9904e-b764-43a5-83d2-5c993023f740" (UID: "dda9904e-b764-43a5-83d2-5c993023f740"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.137694 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dda9904e-b764-43a5-83d2-5c993023f740-config-data" (OuterVolumeSpecName: "config-data") pod "dda9904e-b764-43a5-83d2-5c993023f740" (UID: "dda9904e-b764-43a5-83d2-5c993023f740"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.194806 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dda9904e-b764-43a5-83d2-5c993023f740-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.195091 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dda9904e-b764-43a5-83d2-5c993023f740-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.255443 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-544d94f45b-79l8m"] Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.300090 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.316804 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.330783 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 12 05:59:23 crc kubenswrapper[4930]: E1012 05:59:23.331163 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda9904e-b764-43a5-83d2-5c993023f740" containerName="watcher-decision-engine" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.331182 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda9904e-b764-43a5-83d2-5c993023f740" containerName="watcher-decision-engine" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.331372 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="dda9904e-b764-43a5-83d2-5c993023f740" containerName="watcher-decision-engine" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.331972 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.334839 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.338566 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.399606 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f4bae4-1a84-449a-be72-e735294116e6-config-data\") pod \"watcher-decision-engine-0\" (UID: \"59f4bae4-1a84-449a-be72-e735294116e6\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.399671 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f4bae4-1a84-449a-be72-e735294116e6-logs\") pod \"watcher-decision-engine-0\" (UID: \"59f4bae4-1a84-449a-be72-e735294116e6\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.402130 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f4bae4-1a84-449a-be72-e735294116e6-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"59f4bae4-1a84-449a-be72-e735294116e6\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.402163 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgjbx\" (UniqueName: \"kubernetes.io/projected/59f4bae4-1a84-449a-be72-e735294116e6-kube-api-access-qgjbx\") pod \"watcher-decision-engine-0\" (UID: \"59f4bae4-1a84-449a-be72-e735294116e6\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.402185 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/59f4bae4-1a84-449a-be72-e735294116e6-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"59f4bae4-1a84-449a-be72-e735294116e6\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.425582 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.503019 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f4bae4-1a84-449a-be72-e735294116e6-logs\") pod \"watcher-decision-engine-0\" (UID: \"59f4bae4-1a84-449a-be72-e735294116e6\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.503131 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f4bae4-1a84-449a-be72-e735294116e6-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"59f4bae4-1a84-449a-be72-e735294116e6\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.503184 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgjbx\" (UniqueName: 
\"kubernetes.io/projected/59f4bae4-1a84-449a-be72-e735294116e6-kube-api-access-qgjbx\") pod \"watcher-decision-engine-0\" (UID: \"59f4bae4-1a84-449a-be72-e735294116e6\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.503208 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/59f4bae4-1a84-449a-be72-e735294116e6-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"59f4bae4-1a84-449a-be72-e735294116e6\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.503288 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f4bae4-1a84-449a-be72-e735294116e6-config-data\") pod \"watcher-decision-engine-0\" (UID: \"59f4bae4-1a84-449a-be72-e735294116e6\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.506159 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f4bae4-1a84-449a-be72-e735294116e6-logs\") pod \"watcher-decision-engine-0\" (UID: \"59f4bae4-1a84-449a-be72-e735294116e6\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.508064 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/59f4bae4-1a84-449a-be72-e735294116e6-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"59f4bae4-1a84-449a-be72-e735294116e6\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.511759 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f4bae4-1a84-449a-be72-e735294116e6-config-data\") pod \"watcher-decision-engine-0\" (UID: \"59f4bae4-1a84-449a-be72-e735294116e6\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.513252 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f4bae4-1a84-449a-be72-e735294116e6-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"59f4bae4-1a84-449a-be72-e735294116e6\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.520405 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgjbx\" (UniqueName: \"kubernetes.io/projected/59f4bae4-1a84-449a-be72-e735294116e6-kube-api-access-qgjbx\") pod \"watcher-decision-engine-0\" (UID: \"59f4bae4-1a84-449a-be72-e735294116e6\") " pod="openstack/watcher-decision-engine-0" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.684785 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.731764 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-546b85cb56-ln9lt"] Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.792530 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.909987 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c538bd3d-6ead-4b75-a12e-327b70390f9c-ovsdbserver-nb\") pod \"c538bd3d-6ead-4b75-a12e-327b70390f9c\" (UID: \"c538bd3d-6ead-4b75-a12e-327b70390f9c\") " Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.910032 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c538bd3d-6ead-4b75-a12e-327b70390f9c-dns-svc\") pod \"c538bd3d-6ead-4b75-a12e-327b70390f9c\" (UID: \"c538bd3d-6ead-4b75-a12e-327b70390f9c\") " Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.910200 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c538bd3d-6ead-4b75-a12e-327b70390f9c-config\") pod \"c538bd3d-6ead-4b75-a12e-327b70390f9c\" (UID: \"c538bd3d-6ead-4b75-a12e-327b70390f9c\") " Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.910374 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c538bd3d-6ead-4b75-a12e-327b70390f9c-ovsdbserver-sb\") pod \"c538bd3d-6ead-4b75-a12e-327b70390f9c\" (UID: \"c538bd3d-6ead-4b75-a12e-327b70390f9c\") " Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.910419 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkpp8\" (UniqueName: \"kubernetes.io/projected/c538bd3d-6ead-4b75-a12e-327b70390f9c-kube-api-access-wkpp8\") pod \"c538bd3d-6ead-4b75-a12e-327b70390f9c\" (UID: \"c538bd3d-6ead-4b75-a12e-327b70390f9c\") " Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.931139 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c538bd3d-6ead-4b75-a12e-327b70390f9c-kube-api-access-wkpp8" (OuterVolumeSpecName: "kube-api-access-wkpp8") pod "c538bd3d-6ead-4b75-a12e-327b70390f9c" (UID: "c538bd3d-6ead-4b75-a12e-327b70390f9c"). InnerVolumeSpecName "kube-api-access-wkpp8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.998129 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-544d94f45b-79l8m" event={"ID":"8cfa2a2e-ac4f-415b-9dd2-dabf059ad679","Type":"ContainerStarted","Data":"62726481ce9a294b11c7cc6dc23df5820a0559dd5fb2a9b9f442b2f27c2af795"} Oct 12 05:59:23 crc kubenswrapper[4930]: I1012 05:59:23.998470 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-544d94f45b-79l8m" event={"ID":"8cfa2a2e-ac4f-415b-9dd2-dabf059ad679","Type":"ContainerStarted","Data":"1407b16391ae8bc7a2ed2c00b352bde99511d24a75861f1b2d4db7acbf62702e"} Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.006555 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"079031ef-591d-44a8-9a65-fdc0eaea1a0d","Type":"ContainerStarted","Data":"b7c0671bf3b54dda18af8a12dbfd50038ebd1dce024cdd45a052d9c969c5f824"} Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.016129 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkpp8\" (UniqueName: \"kubernetes.io/projected/c538bd3d-6ead-4b75-a12e-327b70390f9c-kube-api-access-wkpp8\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.017158 4930 generic.go:334] "Generic (PLEG): container finished" podID="c538bd3d-6ead-4b75-a12e-327b70390f9c" containerID="f74502473d6f97afe0249a1d38c4e959958a619305d15a4309737b260c87bd64" exitCode=0 Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.017263 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.017367 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" event={"ID":"c538bd3d-6ead-4b75-a12e-327b70390f9c","Type":"ContainerDied","Data":"f74502473d6f97afe0249a1d38c4e959958a619305d15a4309737b260c87bd64"} Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.017447 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f9c4c8bc-6hjqx" event={"ID":"c538bd3d-6ead-4b75-a12e-327b70390f9c","Type":"ContainerDied","Data":"5796279102b5b07dfe44ffd6cc3f9770acebf998bfd8aae791f412e477c72acc"} Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.017512 4930 scope.go:117] "RemoveContainer" containerID="f74502473d6f97afe0249a1d38c4e959958a619305d15a4309737b260c87bd64" Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.019776 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-546b85cb56-ln9lt" event={"ID":"456742ca-6f3a-485a-81ee-2a4d84df38c8","Type":"ContainerStarted","Data":"8df6fe12c35b6a76c437e7346df15f969632b30f093aabd4f4cceb6de0873d70"} Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.039281 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed","Type":"ContainerStarted","Data":"3cb02172b934592377e139a2782717713272e8fb6014aad096c3e8ffee26c3a3"} Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.054250 4930 scope.go:117] "RemoveContainer" containerID="36b664bc1818cc9c2606d64e8a4ce29fc00199ab10c440540f3682d59a9d0153" Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.085728 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c538bd3d-6ead-4b75-a12e-327b70390f9c-ovsdbserver-nb" (OuterVolumeSpecName: 
"ovsdbserver-nb") pod "c538bd3d-6ead-4b75-a12e-327b70390f9c" (UID: "c538bd3d-6ead-4b75-a12e-327b70390f9c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.120149 4930 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c538bd3d-6ead-4b75-a12e-327b70390f9c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.144893 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c538bd3d-6ead-4b75-a12e-327b70390f9c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c538bd3d-6ead-4b75-a12e-327b70390f9c" (UID: "c538bd3d-6ead-4b75-a12e-327b70390f9c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.171154 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dda9904e-b764-43a5-83d2-5c993023f740" path="/var/lib/kubelet/pods/dda9904e-b764-43a5-83d2-5c993023f740/volumes" Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.171453 4930 scope.go:117] "RemoveContainer" containerID="f74502473d6f97afe0249a1d38c4e959958a619305d15a4309737b260c87bd64" Oct 12 05:59:24 crc kubenswrapper[4930]: E1012 05:59:24.171854 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f74502473d6f97afe0249a1d38c4e959958a619305d15a4309737b260c87bd64\": container with ID starting with f74502473d6f97afe0249a1d38c4e959958a619305d15a4309737b260c87bd64 not found: ID does not exist" containerID="f74502473d6f97afe0249a1d38c4e959958a619305d15a4309737b260c87bd64" Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.171891 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f74502473d6f97afe0249a1d38c4e959958a619305d15a4309737b260c87bd64"} err="failed to get container status \"f74502473d6f97afe0249a1d38c4e959958a619305d15a4309737b260c87bd64\": rpc error: code = NotFound desc = could not find container \"f74502473d6f97afe0249a1d38c4e959958a619305d15a4309737b260c87bd64\": container with ID starting with f74502473d6f97afe0249a1d38c4e959958a619305d15a4309737b260c87bd64 not found: ID does not exist" Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.171914 4930 scope.go:117] "RemoveContainer" containerID="36b664bc1818cc9c2606d64e8a4ce29fc00199ab10c440540f3682d59a9d0153" Oct 12 05:59:24 crc kubenswrapper[4930]: E1012 05:59:24.172125 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36b664bc1818cc9c2606d64e8a4ce29fc00199ab10c440540f3682d59a9d0153\": container with ID starting with 36b664bc1818cc9c2606d64e8a4ce29fc00199ab10c440540f3682d59a9d0153 not found: ID does not exist" containerID="36b664bc1818cc9c2606d64e8a4ce29fc00199ab10c440540f3682d59a9d0153" Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.172141 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36b664bc1818cc9c2606d64e8a4ce29fc00199ab10c440540f3682d59a9d0153"} err="failed to get container status \"36b664bc1818cc9c2606d64e8a4ce29fc00199ab10c440540f3682d59a9d0153\": rpc error: code = NotFound desc = could not find container \"36b664bc1818cc9c2606d64e8a4ce29fc00199ab10c440540f3682d59a9d0153\": container with ID starting with 
36b664bc1818cc9c2606d64e8a4ce29fc00199ab10c440540f3682d59a9d0153 not found: ID does not exist" Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.201430 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.218243 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c538bd3d-6ead-4b75-a12e-327b70390f9c-config" (OuterVolumeSpecName: "config") pod "c538bd3d-6ead-4b75-a12e-327b70390f9c" (UID: "c538bd3d-6ead-4b75-a12e-327b70390f9c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.222144 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c538bd3d-6ead-4b75-a12e-327b70390f9c-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.222182 4930 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c538bd3d-6ead-4b75-a12e-327b70390f9c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.228812 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c538bd3d-6ead-4b75-a12e-327b70390f9c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c538bd3d-6ead-4b75-a12e-327b70390f9c" (UID: "c538bd3d-6ead-4b75-a12e-327b70390f9c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.323375 4930 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c538bd3d-6ead-4b75-a12e-327b70390f9c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.528244 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-dmtxt" Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.535675 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76f9c4c8bc-6hjqx"] Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.542905 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76f9c4c8bc-6hjqx"] Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.564406 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.573514 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.631351 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g882n\" (UniqueName: \"kubernetes.io/projected/ce22649f-fe81-4c02-a26a-15e45e306b82-kube-api-access-g882n\") pod \"ce22649f-fe81-4c02-a26a-15e45e306b82\" (UID: \"ce22649f-fe81-4c02-a26a-15e45e306b82\") " Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.631540 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce22649f-fe81-4c02-a26a-15e45e306b82-combined-ca-bundle\") pod \"ce22649f-fe81-4c02-a26a-15e45e306b82\" (UID: \"ce22649f-fe81-4c02-a26a-15e45e306b82\") " Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.631701 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ce22649f-fe81-4c02-a26a-15e45e306b82-db-sync-config-data\") pod \"ce22649f-fe81-4c02-a26a-15e45e306b82\" (UID: \"ce22649f-fe81-4c02-a26a-15e45e306b82\") " Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.637869 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce22649f-fe81-4c02-a26a-15e45e306b82-kube-api-access-g882n" (OuterVolumeSpecName: "kube-api-access-g882n") pod "ce22649f-fe81-4c02-a26a-15e45e306b82" (UID: "ce22649f-fe81-4c02-a26a-15e45e306b82"). InnerVolumeSpecName "kube-api-access-g882n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.650887 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce22649f-fe81-4c02-a26a-15e45e306b82-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ce22649f-fe81-4c02-a26a-15e45e306b82" (UID: "ce22649f-fe81-4c02-a26a-15e45e306b82"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.664922 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce22649f-fe81-4c02-a26a-15e45e306b82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce22649f-fe81-4c02-a26a-15e45e306b82" (UID: "ce22649f-fe81-4c02-a26a-15e45e306b82"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.739994 4930 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ce22649f-fe81-4c02-a26a-15e45e306b82-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.740032 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g882n\" (UniqueName: \"kubernetes.io/projected/ce22649f-fe81-4c02-a26a-15e45e306b82-kube-api-access-g882n\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:24 crc kubenswrapper[4930]: I1012 05:59:24.740044 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce22649f-fe81-4c02-a26a-15e45e306b82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.055891 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-544d94f45b-79l8m" event={"ID":"8cfa2a2e-ac4f-415b-9dd2-dabf059ad679","Type":"ContainerStarted","Data":"b83458a18395ceefcd02f131b88a384540f6604ca6acc66dc98e161a0e0bc40b"} Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.056254 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-544d94f45b-79l8m" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.056270 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-544d94f45b-79l8m" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.059083 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"079031ef-591d-44a8-9a65-fdc0eaea1a0d","Type":"ContainerStarted","Data":"e4545ad64276b13b62f4060a451a8f854a86bd78c9c0e92ea11e79baf3ee9a2c"} Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.069675 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"59f4bae4-1a84-449a-be72-e735294116e6","Type":"ContainerStarted","Data":"2b5d1d70dd62ea91af5c3a935efb2c2f47ca551be7765052523abfd189021dc2"} Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.069711 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"59f4bae4-1a84-449a-be72-e735294116e6","Type":"ContainerStarted","Data":"11fe9aabaaefc071d24b067674968d325a6fcf6e69509ca6f1ddb3473f542555"} Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.074188 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-544d94f45b-79l8m" podStartSLOduration=7.074171644 podStartE2EDuration="7.074171644s" podCreationTimestamp="2025-10-12 05:59:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:59:25.073487937 +0000 UTC m=+1097.615589702" watchObservedRunningTime="2025-10-12 05:59:25.074171644 +0000 UTC m=+1097.616273409" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.086886 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dmtxt" event={"ID":"ce22649f-fe81-4c02-a26a-15e45e306b82","Type":"ContainerDied","Data":"d2027d3fa80b2349acd02c8da6dbcdcc634ce84d2c5fb887c0545866e1345a28"} Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.086925 4930 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="d2027d3fa80b2349acd02c8da6dbcdcc634ce84d2c5fb887c0545866e1345a28" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.086999 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-dmtxt" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.093863 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=3.093846024 podStartE2EDuration="3.093846024s" podCreationTimestamp="2025-10-12 05:59:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:59:25.091639109 +0000 UTC m=+1097.633740864" watchObservedRunningTime="2025-10-12 05:59:25.093846024 +0000 UTC m=+1097.635947789" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.095538 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-546b85cb56-ln9lt" event={"ID":"456742ca-6f3a-485a-81ee-2a4d84df38c8","Type":"ContainerStarted","Data":"fe0054b6bfc3a367aa6b877d5e66780023ace8e18278a17ec80621e0b6209c29"} Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.095575 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.123699 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.123675726 podStartE2EDuration="2.123675726s" podCreationTimestamp="2025-10-12 05:59:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:59:25.112803006 +0000 UTC m=+1097.654904771" watchObservedRunningTime="2025-10-12 05:59:25.123675726 +0000 UTC m=+1097.665777491" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.148927 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-546b85cb56-ln9lt" podStartSLOduration=3.148903774 podStartE2EDuration="3.148903774s" podCreationTimestamp="2025-10-12 05:59:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:59:25.139948971 +0000 UTC m=+1097.682050736" watchObservedRunningTime="2025-10-12 05:59:25.148903774 +0000 UTC m=+1097.691005539" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.244901 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-76d484677c-ptrh6"] Oct 12 05:59:25 crc kubenswrapper[4930]: E1012 05:59:25.245359 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c538bd3d-6ead-4b75-a12e-327b70390f9c" containerName="dnsmasq-dns" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.245371 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="c538bd3d-6ead-4b75-a12e-327b70390f9c" containerName="dnsmasq-dns" Oct 12 05:59:25 crc kubenswrapper[4930]: E1012 05:59:25.245402 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce22649f-fe81-4c02-a26a-15e45e306b82" containerName="barbican-db-sync" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.245408 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce22649f-fe81-4c02-a26a-15e45e306b82" containerName="barbican-db-sync" Oct 12 05:59:25 crc kubenswrapper[4930]: E1012 05:59:25.245420 4930 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c538bd3d-6ead-4b75-a12e-327b70390f9c" containerName="init" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.245428 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="c538bd3d-6ead-4b75-a12e-327b70390f9c" containerName="init" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.245592 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce22649f-fe81-4c02-a26a-15e45e306b82" containerName="barbican-db-sync" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.245607 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="c538bd3d-6ead-4b75-a12e-327b70390f9c" containerName="dnsmasq-dns" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.246619 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-76d484677c-ptrh6" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.258682 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.258921 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5rw7t" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.259295 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.259870 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-76d484677c-ptrh6"] Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.279799 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-587996ddf4-fcrwq"] Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.282118 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-587996ddf4-fcrwq" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.295035 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.303957 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-587996ddf4-fcrwq"] Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.354873 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e26aa90e-071d-46ff-8fa1-b86f43a70e01-combined-ca-bundle\") pod \"barbican-worker-76d484677c-ptrh6\" (UID: \"e26aa90e-071d-46ff-8fa1-b86f43a70e01\") " pod="openstack/barbican-worker-76d484677c-ptrh6" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.354945 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e26aa90e-071d-46ff-8fa1-b86f43a70e01-logs\") pod \"barbican-worker-76d484677c-ptrh6\" (UID: \"e26aa90e-071d-46ff-8fa1-b86f43a70e01\") " pod="openstack/barbican-worker-76d484677c-ptrh6" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.355009 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvc66\" (UniqueName: \"kubernetes.io/projected/e26aa90e-071d-46ff-8fa1-b86f43a70e01-kube-api-access-bvc66\") pod \"barbican-worker-76d484677c-ptrh6\" (UID: \"e26aa90e-071d-46ff-8fa1-b86f43a70e01\") " pod="openstack/barbican-worker-76d484677c-ptrh6" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.355028 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e26aa90e-071d-46ff-8fa1-b86f43a70e01-config-data\") pod \"barbican-worker-76d484677c-ptrh6\" (UID: \"e26aa90e-071d-46ff-8fa1-b86f43a70e01\") " pod="openstack/barbican-worker-76d484677c-ptrh6" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.355061 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e26aa90e-071d-46ff-8fa1-b86f43a70e01-config-data-custom\") pod \"barbican-worker-76d484677c-ptrh6\" (UID: \"e26aa90e-071d-46ff-8fa1-b86f43a70e01\") " pod="openstack/barbican-worker-76d484677c-ptrh6" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.368060 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764bcc8bff-dcspv"] Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.373885 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.394543 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764bcc8bff-dcspv"] Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.461465 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvc66\" (UniqueName: \"kubernetes.io/projected/e26aa90e-071d-46ff-8fa1-b86f43a70e01-kube-api-access-bvc66\") pod \"barbican-worker-76d484677c-ptrh6\" (UID: \"e26aa90e-071d-46ff-8fa1-b86f43a70e01\") " pod="openstack/barbican-worker-76d484677c-ptrh6" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.461791 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e26aa90e-071d-46ff-8fa1-b86f43a70e01-config-data\") pod \"barbican-worker-76d484677c-ptrh6\" (UID: \"e26aa90e-071d-46ff-8fa1-b86f43a70e01\") " pod="openstack/barbican-worker-76d484677c-ptrh6" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.461825 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e26aa90e-071d-46ff-8fa1-b86f43a70e01-config-data-custom\") pod \"barbican-worker-76d484677c-ptrh6\" (UID: \"e26aa90e-071d-46ff-8fa1-b86f43a70e01\") " pod="openstack/barbican-worker-76d484677c-ptrh6" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.461864 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f37233c9-4b67-4e63-949a-24fd340b334b-combined-ca-bundle\") pod \"barbican-keystone-listener-587996ddf4-fcrwq\" (UID: \"f37233c9-4b67-4e63-949a-24fd340b334b\") " pod="openstack/barbican-keystone-listener-587996ddf4-fcrwq" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.461886 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f37233c9-4b67-4e63-949a-24fd340b334b-config-data-custom\") pod \"barbican-keystone-listener-587996ddf4-fcrwq\" (UID: \"f37233c9-4b67-4e63-949a-24fd340b334b\") " pod="openstack/barbican-keystone-listener-587996ddf4-fcrwq" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.461901 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f37233c9-4b67-4e63-949a-24fd340b334b-config-data\") pod \"barbican-keystone-listener-587996ddf4-fcrwq\" (UID: \"f37233c9-4b67-4e63-949a-24fd340b334b\") " pod="openstack/barbican-keystone-listener-587996ddf4-fcrwq" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.461921 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rfmm\" (UniqueName: \"kubernetes.io/projected/f37233c9-4b67-4e63-949a-24fd340b334b-kube-api-access-4rfmm\") pod \"barbican-keystone-listener-587996ddf4-fcrwq\" (UID: \"f37233c9-4b67-4e63-949a-24fd340b334b\") " pod="openstack/barbican-keystone-listener-587996ddf4-fcrwq" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.461963 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e26aa90e-071d-46ff-8fa1-b86f43a70e01-combined-ca-bundle\") pod \"barbican-worker-76d484677c-ptrh6\" (UID: \"e26aa90e-071d-46ff-8fa1-b86f43a70e01\") " 
pod="openstack/barbican-worker-76d484677c-ptrh6" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.462002 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e26aa90e-071d-46ff-8fa1-b86f43a70e01-logs\") pod \"barbican-worker-76d484677c-ptrh6\" (UID: \"e26aa90e-071d-46ff-8fa1-b86f43a70e01\") " pod="openstack/barbican-worker-76d484677c-ptrh6" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.462028 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f37233c9-4b67-4e63-949a-24fd340b334b-logs\") pod \"barbican-keystone-listener-587996ddf4-fcrwq\" (UID: \"f37233c9-4b67-4e63-949a-24fd340b334b\") " pod="openstack/barbican-keystone-listener-587996ddf4-fcrwq" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.465201 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e26aa90e-071d-46ff-8fa1-b86f43a70e01-logs\") pod \"barbican-worker-76d484677c-ptrh6\" (UID: \"e26aa90e-071d-46ff-8fa1-b86f43a70e01\") " pod="openstack/barbican-worker-76d484677c-ptrh6" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.468853 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e26aa90e-071d-46ff-8fa1-b86f43a70e01-combined-ca-bundle\") pod \"barbican-worker-76d484677c-ptrh6\" (UID: \"e26aa90e-071d-46ff-8fa1-b86f43a70e01\") " pod="openstack/barbican-worker-76d484677c-ptrh6" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.489872 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e26aa90e-071d-46ff-8fa1-b86f43a70e01-config-data-custom\") pod \"barbican-worker-76d484677c-ptrh6\" (UID: \"e26aa90e-071d-46ff-8fa1-b86f43a70e01\") " pod="openstack/barbican-worker-76d484677c-ptrh6" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.490751 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e26aa90e-071d-46ff-8fa1-b86f43a70e01-config-data\") pod \"barbican-worker-76d484677c-ptrh6\" (UID: \"e26aa90e-071d-46ff-8fa1-b86f43a70e01\") " pod="openstack/barbican-worker-76d484677c-ptrh6" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.503515 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvc66\" (UniqueName: \"kubernetes.io/projected/e26aa90e-071d-46ff-8fa1-b86f43a70e01-kube-api-access-bvc66\") pod \"barbican-worker-76d484677c-ptrh6\" (UID: \"e26aa90e-071d-46ff-8fa1-b86f43a70e01\") " pod="openstack/barbican-worker-76d484677c-ptrh6" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.540058 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-86668db68d-p5ppf"] Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.543523 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-86668db68d-p5ppf" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.562298 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-86668db68d-p5ppf"] Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.571694 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f37233c9-4b67-4e63-949a-24fd340b334b-logs\") pod \"barbican-keystone-listener-587996ddf4-fcrwq\" (UID: \"f37233c9-4b67-4e63-949a-24fd340b334b\") " pod="openstack/barbican-keystone-listener-587996ddf4-fcrwq" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.571762 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-ovsdbserver-nb\") pod \"dnsmasq-dns-764bcc8bff-dcspv\" (UID: \"25638a0f-8164-4d69-91ec-305a67a6ffc8\") " pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.571804 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgghz\" (UniqueName: \"kubernetes.io/projected/86141d8f-1f9e-469a-9023-63d5a56ad33b-kube-api-access-rgghz\") pod \"barbican-api-86668db68d-p5ppf\" (UID: \"86141d8f-1f9e-469a-9023-63d5a56ad33b\") " pod="openstack/barbican-api-86668db68d-p5ppf" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.571837 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-config\") pod \"dnsmasq-dns-764bcc8bff-dcspv\" (UID: \"25638a0f-8164-4d69-91ec-305a67a6ffc8\") " pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.571864 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86141d8f-1f9e-469a-9023-63d5a56ad33b-combined-ca-bundle\") pod \"barbican-api-86668db68d-p5ppf\" (UID: \"86141d8f-1f9e-469a-9023-63d5a56ad33b\") " pod="openstack/barbican-api-86668db68d-p5ppf" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.571908 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-ovsdbserver-sb\") pod \"dnsmasq-dns-764bcc8bff-dcspv\" (UID: \"25638a0f-8164-4d69-91ec-305a67a6ffc8\") " pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.571978 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f37233c9-4b67-4e63-949a-24fd340b334b-combined-ca-bundle\") pod \"barbican-keystone-listener-587996ddf4-fcrwq\" (UID: \"f37233c9-4b67-4e63-949a-24fd340b334b\") " pod="openstack/barbican-keystone-listener-587996ddf4-fcrwq" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.572000 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f37233c9-4b67-4e63-949a-24fd340b334b-config-data-custom\") pod \"barbican-keystone-listener-587996ddf4-fcrwq\" (UID: \"f37233c9-4b67-4e63-949a-24fd340b334b\") " pod="openstack/barbican-keystone-listener-587996ddf4-fcrwq" Oct 12 05:59:25 crc 
kubenswrapper[4930]: I1012 05:59:25.572018 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-dns-swift-storage-0\") pod \"dnsmasq-dns-764bcc8bff-dcspv\" (UID: \"25638a0f-8164-4d69-91ec-305a67a6ffc8\") " pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.572037 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f37233c9-4b67-4e63-949a-24fd340b334b-config-data\") pod \"barbican-keystone-listener-587996ddf4-fcrwq\" (UID: \"f37233c9-4b67-4e63-949a-24fd340b334b\") " pod="openstack/barbican-keystone-listener-587996ddf4-fcrwq" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.572061 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rfmm\" (UniqueName: \"kubernetes.io/projected/f37233c9-4b67-4e63-949a-24fd340b334b-kube-api-access-4rfmm\") pod \"barbican-keystone-listener-587996ddf4-fcrwq\" (UID: \"f37233c9-4b67-4e63-949a-24fd340b334b\") " pod="openstack/barbican-keystone-listener-587996ddf4-fcrwq" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.572080 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86141d8f-1f9e-469a-9023-63d5a56ad33b-config-data-custom\") pod \"barbican-api-86668db68d-p5ppf\" (UID: \"86141d8f-1f9e-469a-9023-63d5a56ad33b\") " pod="openstack/barbican-api-86668db68d-p5ppf" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.572107 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86141d8f-1f9e-469a-9023-63d5a56ad33b-logs\") pod \"barbican-api-86668db68d-p5ppf\" (UID: \"86141d8f-1f9e-469a-9023-63d5a56ad33b\") " pod="openstack/barbican-api-86668db68d-p5ppf" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.572145 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-dns-svc\") pod \"dnsmasq-dns-764bcc8bff-dcspv\" (UID: \"25638a0f-8164-4d69-91ec-305a67a6ffc8\") " pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.572167 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86141d8f-1f9e-469a-9023-63d5a56ad33b-config-data\") pod \"barbican-api-86668db68d-p5ppf\" (UID: \"86141d8f-1f9e-469a-9023-63d5a56ad33b\") " pod="openstack/barbican-api-86668db68d-p5ppf" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.572232 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gt4s\" (UniqueName: \"kubernetes.io/projected/25638a0f-8164-4d69-91ec-305a67a6ffc8-kube-api-access-9gt4s\") pod \"dnsmasq-dns-764bcc8bff-dcspv\" (UID: \"25638a0f-8164-4d69-91ec-305a67a6ffc8\") " pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.572612 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f37233c9-4b67-4e63-949a-24fd340b334b-logs\") pod \"barbican-keystone-listener-587996ddf4-fcrwq\" (UID: 
\"f37233c9-4b67-4e63-949a-24fd340b334b\") " pod="openstack/barbican-keystone-listener-587996ddf4-fcrwq" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.574965 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.591574 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f37233c9-4b67-4e63-949a-24fd340b334b-combined-ca-bundle\") pod \"barbican-keystone-listener-587996ddf4-fcrwq\" (UID: \"f37233c9-4b67-4e63-949a-24fd340b334b\") " pod="openstack/barbican-keystone-listener-587996ddf4-fcrwq" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.595833 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f37233c9-4b67-4e63-949a-24fd340b334b-config-data-custom\") pod \"barbican-keystone-listener-587996ddf4-fcrwq\" (UID: \"f37233c9-4b67-4e63-949a-24fd340b334b\") " pod="openstack/barbican-keystone-listener-587996ddf4-fcrwq" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.596258 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-76d484677c-ptrh6" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.598646 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f37233c9-4b67-4e63-949a-24fd340b334b-config-data\") pod \"barbican-keystone-listener-587996ddf4-fcrwq\" (UID: \"f37233c9-4b67-4e63-949a-24fd340b334b\") " pod="openstack/barbican-keystone-listener-587996ddf4-fcrwq" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.625457 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rfmm\" (UniqueName: \"kubernetes.io/projected/f37233c9-4b67-4e63-949a-24fd340b334b-kube-api-access-4rfmm\") pod \"barbican-keystone-listener-587996ddf4-fcrwq\" (UID: \"f37233c9-4b67-4e63-949a-24fd340b334b\") " pod="openstack/barbican-keystone-listener-587996ddf4-fcrwq" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.673153 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgghz\" (UniqueName: \"kubernetes.io/projected/86141d8f-1f9e-469a-9023-63d5a56ad33b-kube-api-access-rgghz\") pod \"barbican-api-86668db68d-p5ppf\" (UID: \"86141d8f-1f9e-469a-9023-63d5a56ad33b\") " pod="openstack/barbican-api-86668db68d-p5ppf" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.673216 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-config\") pod \"dnsmasq-dns-764bcc8bff-dcspv\" (UID: \"25638a0f-8164-4d69-91ec-305a67a6ffc8\") " pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.673239 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86141d8f-1f9e-469a-9023-63d5a56ad33b-combined-ca-bundle\") pod \"barbican-api-86668db68d-p5ppf\" (UID: \"86141d8f-1f9e-469a-9023-63d5a56ad33b\") " pod="openstack/barbican-api-86668db68d-p5ppf" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.673268 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-ovsdbserver-sb\") pod 
\"dnsmasq-dns-764bcc8bff-dcspv\" (UID: \"25638a0f-8164-4d69-91ec-305a67a6ffc8\") " pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.673319 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-dns-swift-storage-0\") pod \"dnsmasq-dns-764bcc8bff-dcspv\" (UID: \"25638a0f-8164-4d69-91ec-305a67a6ffc8\") " pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.673339 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86141d8f-1f9e-469a-9023-63d5a56ad33b-config-data-custom\") pod \"barbican-api-86668db68d-p5ppf\" (UID: \"86141d8f-1f9e-469a-9023-63d5a56ad33b\") " pod="openstack/barbican-api-86668db68d-p5ppf" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.673358 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86141d8f-1f9e-469a-9023-63d5a56ad33b-logs\") pod \"barbican-api-86668db68d-p5ppf\" (UID: \"86141d8f-1f9e-469a-9023-63d5a56ad33b\") " pod="openstack/barbican-api-86668db68d-p5ppf" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.673383 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-dns-svc\") pod \"dnsmasq-dns-764bcc8bff-dcspv\" (UID: \"25638a0f-8164-4d69-91ec-305a67a6ffc8\") " pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.673415 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86141d8f-1f9e-469a-9023-63d5a56ad33b-config-data\") pod \"barbican-api-86668db68d-p5ppf\" (UID: \"86141d8f-1f9e-469a-9023-63d5a56ad33b\") " pod="openstack/barbican-api-86668db68d-p5ppf" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.673452 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gt4s\" (UniqueName: \"kubernetes.io/projected/25638a0f-8164-4d69-91ec-305a67a6ffc8-kube-api-access-9gt4s\") pod \"dnsmasq-dns-764bcc8bff-dcspv\" (UID: \"25638a0f-8164-4d69-91ec-305a67a6ffc8\") " pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.673483 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-ovsdbserver-nb\") pod \"dnsmasq-dns-764bcc8bff-dcspv\" (UID: \"25638a0f-8164-4d69-91ec-305a67a6ffc8\") " pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.674285 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-ovsdbserver-nb\") pod \"dnsmasq-dns-764bcc8bff-dcspv\" (UID: \"25638a0f-8164-4d69-91ec-305a67a6ffc8\") " pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.674573 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-dns-swift-storage-0\") pod \"dnsmasq-dns-764bcc8bff-dcspv\" (UID: 
\"25638a0f-8164-4d69-91ec-305a67a6ffc8\") " pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.675284 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-ovsdbserver-sb\") pod \"dnsmasq-dns-764bcc8bff-dcspv\" (UID: \"25638a0f-8164-4d69-91ec-305a67a6ffc8\") " pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.675282 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86141d8f-1f9e-469a-9023-63d5a56ad33b-logs\") pod \"barbican-api-86668db68d-p5ppf\" (UID: \"86141d8f-1f9e-469a-9023-63d5a56ad33b\") " pod="openstack/barbican-api-86668db68d-p5ppf" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.681454 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-dns-svc\") pod \"dnsmasq-dns-764bcc8bff-dcspv\" (UID: \"25638a0f-8164-4d69-91ec-305a67a6ffc8\") " pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.686298 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86141d8f-1f9e-469a-9023-63d5a56ad33b-config-data-custom\") pod \"barbican-api-86668db68d-p5ppf\" (UID: \"86141d8f-1f9e-469a-9023-63d5a56ad33b\") " pod="openstack/barbican-api-86668db68d-p5ppf" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.686628 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86141d8f-1f9e-469a-9023-63d5a56ad33b-combined-ca-bundle\") pod \"barbican-api-86668db68d-p5ppf\" (UID: \"86141d8f-1f9e-469a-9023-63d5a56ad33b\") " pod="openstack/barbican-api-86668db68d-p5ppf" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.690936 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86141d8f-1f9e-469a-9023-63d5a56ad33b-config-data\") pod \"barbican-api-86668db68d-p5ppf\" (UID: \"86141d8f-1f9e-469a-9023-63d5a56ad33b\") " pod="openstack/barbican-api-86668db68d-p5ppf" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.691457 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-config\") pod \"dnsmasq-dns-764bcc8bff-dcspv\" (UID: \"25638a0f-8164-4d69-91ec-305a67a6ffc8\") " pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.703639 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgghz\" (UniqueName: \"kubernetes.io/projected/86141d8f-1f9e-469a-9023-63d5a56ad33b-kube-api-access-rgghz\") pod \"barbican-api-86668db68d-p5ppf\" (UID: \"86141d8f-1f9e-469a-9023-63d5a56ad33b\") " pod="openstack/barbican-api-86668db68d-p5ppf" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.704569 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-86668db68d-p5ppf" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.709421 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gt4s\" (UniqueName: \"kubernetes.io/projected/25638a0f-8164-4d69-91ec-305a67a6ffc8-kube-api-access-9gt4s\") pod \"dnsmasq-dns-764bcc8bff-dcspv\" (UID: \"25638a0f-8164-4d69-91ec-305a67a6ffc8\") " pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.760159 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" Oct 12 05:59:25 crc kubenswrapper[4930]: I1012 05:59:25.913927 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-587996ddf4-fcrwq" Oct 12 05:59:26 crc kubenswrapper[4930]: I1012 05:59:26.190966 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c538bd3d-6ead-4b75-a12e-327b70390f9c" path="/var/lib/kubelet/pods/c538bd3d-6ead-4b75-a12e-327b70390f9c/volumes" Oct 12 05:59:26 crc kubenswrapper[4930]: I1012 05:59:26.318544 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-76d484677c-ptrh6"] Oct 12 05:59:26 crc kubenswrapper[4930]: I1012 05:59:26.458972 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-86668db68d-p5ppf"] Oct 12 05:59:26 crc kubenswrapper[4930]: I1012 05:59:26.665662 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764bcc8bff-dcspv"] Oct 12 05:59:26 crc kubenswrapper[4930]: W1012 05:59:26.696412 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25638a0f_8164_4d69_91ec_305a67a6ffc8.slice/crio-ab5eb63a155d4fbb7bfbefa2809cd7e69eda7ce4231c6eba4e8bdeb217191226 WatchSource:0}: Error finding container ab5eb63a155d4fbb7bfbefa2809cd7e69eda7ce4231c6eba4e8bdeb217191226: Status 404 returned error can't find the container with id ab5eb63a155d4fbb7bfbefa2809cd7e69eda7ce4231c6eba4e8bdeb217191226 Oct 12 05:59:26 crc kubenswrapper[4930]: I1012 05:59:26.818881 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-587996ddf4-fcrwq"] Oct 12 05:59:27 crc kubenswrapper[4930]: I1012 05:59:27.139229 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76d484677c-ptrh6" event={"ID":"e26aa90e-071d-46ff-8fa1-b86f43a70e01","Type":"ContainerStarted","Data":"4340be7cde2b7f0014648ef5cc656fc56e930267afdccc01a84488ecfcf4c204"} Oct 12 05:59:27 crc kubenswrapper[4930]: I1012 05:59:27.140443 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-587996ddf4-fcrwq" event={"ID":"f37233c9-4b67-4e63-949a-24fd340b334b","Type":"ContainerStarted","Data":"1556cdaf4deae6ba5d9cabadd00ada191de0a7444f126617528afeddf1c69cf3"} Oct 12 05:59:27 crc kubenswrapper[4930]: I1012 05:59:27.144755 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" event={"ID":"25638a0f-8164-4d69-91ec-305a67a6ffc8","Type":"ContainerStarted","Data":"ab5eb63a155d4fbb7bfbefa2809cd7e69eda7ce4231c6eba4e8bdeb217191226"} Oct 12 05:59:27 crc kubenswrapper[4930]: I1012 05:59:27.150213 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86668db68d-p5ppf" 
event={"ID":"86141d8f-1f9e-469a-9023-63d5a56ad33b","Type":"ContainerStarted","Data":"b62d22932abf6bc9933c0371e28e61dd3dfe0b8d8d8c7ccd6ff55d33d8b9755a"} Oct 12 05:59:27 crc kubenswrapper[4930]: I1012 05:59:27.150255 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86668db68d-p5ppf" event={"ID":"86141d8f-1f9e-469a-9023-63d5a56ad33b","Type":"ContainerStarted","Data":"ad24ca744ce212ae5d81ee9db0a6b5c7c43462a6afca1f6430c54b5495669545"} Oct 12 05:59:27 crc kubenswrapper[4930]: I1012 05:59:27.394902 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.180022 4930 generic.go:334] "Generic (PLEG): container finished" podID="25638a0f-8164-4d69-91ec-305a67a6ffc8" containerID="14d8eab0cdf5380ec92be5602d643b864f911b7db6f045574464d78d71286431" exitCode=0 Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.180095 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" event={"ID":"25638a0f-8164-4d69-91ec-305a67a6ffc8","Type":"ContainerDied","Data":"14d8eab0cdf5380ec92be5602d643b864f911b7db6f045574464d78d71286431"} Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.231036 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86668db68d-p5ppf" event={"ID":"86141d8f-1f9e-469a-9023-63d5a56ad33b","Type":"ContainerStarted","Data":"950e2f08ba01d47b3d518920328fbb0f364b05d802d234516294f27276958930"} Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.232280 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-86668db68d-p5ppf" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.232320 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-86668db68d-p5ppf" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.314819 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-86668db68d-p5ppf" podStartSLOduration=3.314804685 podStartE2EDuration="3.314804685s" podCreationTimestamp="2025-10-12 05:59:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:59:28.314232531 +0000 UTC m=+1100.856334296" watchObservedRunningTime="2025-10-12 05:59:28.314804685 +0000 UTC m=+1100.856906450" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.683985 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-696d7778c8-zcb9x"] Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.685620 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-696d7778c8-zcb9x" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.688444 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.688651 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.708491 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-696d7778c8-zcb9x"] Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.757047 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p26h9\" (UniqueName: \"kubernetes.io/projected/b054ea5a-466c-432d-aa75-7af68a134c5e-kube-api-access-p26h9\") pod \"barbican-api-696d7778c8-zcb9x\" (UID: \"b054ea5a-466c-432d-aa75-7af68a134c5e\") " pod="openstack/barbican-api-696d7778c8-zcb9x" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.757117 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b054ea5a-466c-432d-aa75-7af68a134c5e-internal-tls-certs\") pod \"barbican-api-696d7778c8-zcb9x\" (UID: \"b054ea5a-466c-432d-aa75-7af68a134c5e\") " pod="openstack/barbican-api-696d7778c8-zcb9x" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.757140 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b054ea5a-466c-432d-aa75-7af68a134c5e-config-data-custom\") pod \"barbican-api-696d7778c8-zcb9x\" (UID: \"b054ea5a-466c-432d-aa75-7af68a134c5e\") " pod="openstack/barbican-api-696d7778c8-zcb9x" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.757184 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b054ea5a-466c-432d-aa75-7af68a134c5e-logs\") pod \"barbican-api-696d7778c8-zcb9x\" (UID: \"b054ea5a-466c-432d-aa75-7af68a134c5e\") " pod="openstack/barbican-api-696d7778c8-zcb9x" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.757206 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b054ea5a-466c-432d-aa75-7af68a134c5e-public-tls-certs\") pod \"barbican-api-696d7778c8-zcb9x\" (UID: \"b054ea5a-466c-432d-aa75-7af68a134c5e\") " pod="openstack/barbican-api-696d7778c8-zcb9x" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.757250 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b054ea5a-466c-432d-aa75-7af68a134c5e-combined-ca-bundle\") pod \"barbican-api-696d7778c8-zcb9x\" (UID: \"b054ea5a-466c-432d-aa75-7af68a134c5e\") " pod="openstack/barbican-api-696d7778c8-zcb9x" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.757270 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b054ea5a-466c-432d-aa75-7af68a134c5e-config-data\") pod \"barbican-api-696d7778c8-zcb9x\" (UID: \"b054ea5a-466c-432d-aa75-7af68a134c5e\") " pod="openstack/barbican-api-696d7778c8-zcb9x" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.859857 4930 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b054ea5a-466c-432d-aa75-7af68a134c5e-logs\") pod \"barbican-api-696d7778c8-zcb9x\" (UID: \"b054ea5a-466c-432d-aa75-7af68a134c5e\") " pod="openstack/barbican-api-696d7778c8-zcb9x" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.860162 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b054ea5a-466c-432d-aa75-7af68a134c5e-public-tls-certs\") pod \"barbican-api-696d7778c8-zcb9x\" (UID: \"b054ea5a-466c-432d-aa75-7af68a134c5e\") " pod="openstack/barbican-api-696d7778c8-zcb9x" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.860222 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b054ea5a-466c-432d-aa75-7af68a134c5e-combined-ca-bundle\") pod \"barbican-api-696d7778c8-zcb9x\" (UID: \"b054ea5a-466c-432d-aa75-7af68a134c5e\") " pod="openstack/barbican-api-696d7778c8-zcb9x" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.860246 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b054ea5a-466c-432d-aa75-7af68a134c5e-config-data\") pod \"barbican-api-696d7778c8-zcb9x\" (UID: \"b054ea5a-466c-432d-aa75-7af68a134c5e\") " pod="openstack/barbican-api-696d7778c8-zcb9x" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.860326 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p26h9\" (UniqueName: \"kubernetes.io/projected/b054ea5a-466c-432d-aa75-7af68a134c5e-kube-api-access-p26h9\") pod \"barbican-api-696d7778c8-zcb9x\" (UID: \"b054ea5a-466c-432d-aa75-7af68a134c5e\") " pod="openstack/barbican-api-696d7778c8-zcb9x" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.860339 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b054ea5a-466c-432d-aa75-7af68a134c5e-logs\") pod \"barbican-api-696d7778c8-zcb9x\" (UID: \"b054ea5a-466c-432d-aa75-7af68a134c5e\") " pod="openstack/barbican-api-696d7778c8-zcb9x" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.860373 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b054ea5a-466c-432d-aa75-7af68a134c5e-internal-tls-certs\") pod \"barbican-api-696d7778c8-zcb9x\" (UID: \"b054ea5a-466c-432d-aa75-7af68a134c5e\") " pod="openstack/barbican-api-696d7778c8-zcb9x" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.860393 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b054ea5a-466c-432d-aa75-7af68a134c5e-config-data-custom\") pod \"barbican-api-696d7778c8-zcb9x\" (UID: \"b054ea5a-466c-432d-aa75-7af68a134c5e\") " pod="openstack/barbican-api-696d7778c8-zcb9x" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.866369 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b054ea5a-466c-432d-aa75-7af68a134c5e-internal-tls-certs\") pod \"barbican-api-696d7778c8-zcb9x\" (UID: \"b054ea5a-466c-432d-aa75-7af68a134c5e\") " pod="openstack/barbican-api-696d7778c8-zcb9x" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.867387 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/b054ea5a-466c-432d-aa75-7af68a134c5e-combined-ca-bundle\") pod \"barbican-api-696d7778c8-zcb9x\" (UID: \"b054ea5a-466c-432d-aa75-7af68a134c5e\") " pod="openstack/barbican-api-696d7778c8-zcb9x" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.872815 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b054ea5a-466c-432d-aa75-7af68a134c5e-config-data-custom\") pod \"barbican-api-696d7778c8-zcb9x\" (UID: \"b054ea5a-466c-432d-aa75-7af68a134c5e\") " pod="openstack/barbican-api-696d7778c8-zcb9x" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.874374 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b054ea5a-466c-432d-aa75-7af68a134c5e-config-data\") pod \"barbican-api-696d7778c8-zcb9x\" (UID: \"b054ea5a-466c-432d-aa75-7af68a134c5e\") " pod="openstack/barbican-api-696d7778c8-zcb9x" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.883593 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b054ea5a-466c-432d-aa75-7af68a134c5e-public-tls-certs\") pod \"barbican-api-696d7778c8-zcb9x\" (UID: \"b054ea5a-466c-432d-aa75-7af68a134c5e\") " pod="openstack/barbican-api-696d7778c8-zcb9x" Oct 12 05:59:28 crc kubenswrapper[4930]: I1012 05:59:28.884333 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p26h9\" (UniqueName: \"kubernetes.io/projected/b054ea5a-466c-432d-aa75-7af68a134c5e-kube-api-access-p26h9\") pod \"barbican-api-696d7778c8-zcb9x\" (UID: \"b054ea5a-466c-432d-aa75-7af68a134c5e\") " pod="openstack/barbican-api-696d7778c8-zcb9x" Oct 12 05:59:29 crc kubenswrapper[4930]: I1012 05:59:29.003518 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-696d7778c8-zcb9x" Oct 12 05:59:29 crc kubenswrapper[4930]: I1012 05:59:29.243660 4930 generic.go:334] "Generic (PLEG): container finished" podID="99ff807c-d810-4ff9-9ed3-3b2d37d3fbed" containerID="5c5db18e20e927f2b3f8fc81b314b0d8b00783743c0b89453be15b7b032c587b" exitCode=0 Oct 12 05:59:29 crc kubenswrapper[4930]: I1012 05:59:29.244000 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bg9r7" event={"ID":"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed","Type":"ContainerDied","Data":"5c5db18e20e927f2b3f8fc81b314b0d8b00783743c0b89453be15b7b032c587b"} Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.230754 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-696d7778c8-zcb9x"] Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.261388 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76d484677c-ptrh6" event={"ID":"e26aa90e-071d-46ff-8fa1-b86f43a70e01","Type":"ContainerStarted","Data":"affb6d40505eb15683cf8cbba424bbfdff5f3a5a284d4beb942503a9f977ffb0"} Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.263466 4930 generic.go:334] "Generic (PLEG): container finished" podID="59f4bae4-1a84-449a-be72-e735294116e6" containerID="2b5d1d70dd62ea91af5c3a935efb2c2f47ca551be7765052523abfd189021dc2" exitCode=1 Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.263506 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"59f4bae4-1a84-449a-be72-e735294116e6","Type":"ContainerDied","Data":"2b5d1d70dd62ea91af5c3a935efb2c2f47ca551be7765052523abfd189021dc2"} Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.264124 4930 scope.go:117] "RemoveContainer" containerID="2b5d1d70dd62ea91af5c3a935efb2c2f47ca551be7765052523abfd189021dc2" Oct 12 05:59:30 crc kubenswrapper[4930]: W1012 05:59:30.265376 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb054ea5a_466c_432d_aa75_7af68a134c5e.slice/crio-70a27768ddb74a43527f5173380cc94fd0866b065856224fd971d493ad3f88ac WatchSource:0}: Error finding container 70a27768ddb74a43527f5173380cc94fd0866b065856224fd971d493ad3f88ac: Status 404 returned error can't find the container with id 70a27768ddb74a43527f5173380cc94fd0866b065856224fd971d493ad3f88ac Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.265643 4930 generic.go:334] "Generic (PLEG): container finished" podID="448db83a-c0af-4680-890f-24b8d8da1088" containerID="1b9aba71eeec6c8e02392ccc8b7f58218c5cc9ebe67585f1ff3bd806865a7174" exitCode=0 Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.265703 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jj2m5" event={"ID":"448db83a-c0af-4680-890f-24b8d8da1088","Type":"ContainerDied","Data":"1b9aba71eeec6c8e02392ccc8b7f58218c5cc9ebe67585f1ff3bd806865a7174"} Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.271956 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-587996ddf4-fcrwq" event={"ID":"f37233c9-4b67-4e63-949a-24fd340b334b","Type":"ContainerStarted","Data":"78ff593abeb31ce4ce164df42fe7e7a630fa56cb67b72bec4a44a0e6d3de4330"} Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.278338 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" 
event={"ID":"25638a0f-8164-4d69-91ec-305a67a6ffc8","Type":"ContainerStarted","Data":"4359e8a38e253c36010491695331d4650cefae8e367913c63242d8c7807d4df9"} Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.278373 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.323973 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" podStartSLOduration=5.323955519 podStartE2EDuration="5.323955519s" podCreationTimestamp="2025-10-12 05:59:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:59:30.312109264 +0000 UTC m=+1102.854211029" watchObservedRunningTime="2025-10-12 05:59:30.323955519 +0000 UTC m=+1102.866057284" Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.651675 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-bg9r7" Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.799394 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-db-sync-config-data\") pod \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\" (UID: \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\") " Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.799803 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-config-data\") pod \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\" (UID: \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\") " Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.799827 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-scripts\") pod \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\" (UID: \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\") " Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.799878 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-combined-ca-bundle\") pod \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\" (UID: \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\") " Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.800153 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-etc-machine-id\") pod \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\" (UID: \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\") " Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.800202 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66mtr\" (UniqueName: \"kubernetes.io/projected/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-kube-api-access-66mtr\") pod \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\" (UID: \"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed\") " Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.804203 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "99ff807c-d810-4ff9-9ed3-3b2d37d3fbed" (UID: "99ff807c-d810-4ff9-9ed3-3b2d37d3fbed"). 
InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.806569 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-scripts" (OuterVolumeSpecName: "scripts") pod "99ff807c-d810-4ff9-9ed3-3b2d37d3fbed" (UID: "99ff807c-d810-4ff9-9ed3-3b2d37d3fbed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.807003 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "99ff807c-d810-4ff9-9ed3-3b2d37d3fbed" (UID: "99ff807c-d810-4ff9-9ed3-3b2d37d3fbed"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.812899 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-kube-api-access-66mtr" (OuterVolumeSpecName: "kube-api-access-66mtr") pod "99ff807c-d810-4ff9-9ed3-3b2d37d3fbed" (UID: "99ff807c-d810-4ff9-9ed3-3b2d37d3fbed"). InnerVolumeSpecName "kube-api-access-66mtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.849371 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99ff807c-d810-4ff9-9ed3-3b2d37d3fbed" (UID: "99ff807c-d810-4ff9-9ed3-3b2d37d3fbed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.887174 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-config-data" (OuterVolumeSpecName: "config-data") pod "99ff807c-d810-4ff9-9ed3-3b2d37d3fbed" (UID: "99ff807c-d810-4ff9-9ed3-3b2d37d3fbed"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.902520 4930 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.902549 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66mtr\" (UniqueName: \"kubernetes.io/projected/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-kube-api-access-66mtr\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.902561 4930 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.902569 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.902578 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:30 crc kubenswrapper[4930]: I1012 05:59:30.902585 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.295459 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bg9r7" event={"ID":"99ff807c-d810-4ff9-9ed3-3b2d37d3fbed","Type":"ContainerDied","Data":"54ff6d0a1a07f334ab917b1b33b541bf3c3f61db974736985b8f6f060e7866c3"} Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.295725 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54ff6d0a1a07f334ab917b1b33b541bf3c3f61db974736985b8f6f060e7866c3" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.295846 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-bg9r7" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.306561 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76d484677c-ptrh6" event={"ID":"e26aa90e-071d-46ff-8fa1-b86f43a70e01","Type":"ContainerStarted","Data":"e97407a2a35592b07fc2593ca73d9d2a2df15d32c85297cb8cd19b388f034e60"} Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.313147 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"59f4bae4-1a84-449a-be72-e735294116e6","Type":"ContainerStarted","Data":"893b8e00eaeb9c339e5a9528038e3102dd99c50f64a2531fff869300585ed13a"} Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.317388 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-696d7778c8-zcb9x" event={"ID":"b054ea5a-466c-432d-aa75-7af68a134c5e","Type":"ContainerStarted","Data":"a5860f2be706ea5613d97c0e29c261b670cc8ed466f0accbf1fe6d72dfc19ec9"} Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.317440 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-696d7778c8-zcb9x" event={"ID":"b054ea5a-466c-432d-aa75-7af68a134c5e","Type":"ContainerStarted","Data":"10af328188ea3b165c9988dda84f0d17763682c756f59c57bff7c75713e348ab"} Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.317452 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-696d7778c8-zcb9x" event={"ID":"b054ea5a-466c-432d-aa75-7af68a134c5e","Type":"ContainerStarted","Data":"70a27768ddb74a43527f5173380cc94fd0866b065856224fd971d493ad3f88ac"} Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.317488 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-696d7778c8-zcb9x" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.317573 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-696d7778c8-zcb9x" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.321559 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-587996ddf4-fcrwq" event={"ID":"f37233c9-4b67-4e63-949a-24fd340b334b","Type":"ContainerStarted","Data":"77e83a73b61a19e2614e3d149ec749400104a7212ff4fb87eda6efd14fa96826"} Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.346588 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-76d484677c-ptrh6" podStartSLOduration=2.965440119 podStartE2EDuration="6.346570007s" podCreationTimestamp="2025-10-12 05:59:25 +0000 UTC" firstStartedPulling="2025-10-12 05:59:26.405565017 +0000 UTC m=+1098.947666782" lastFinishedPulling="2025-10-12 05:59:29.786694905 +0000 UTC m=+1102.328796670" observedRunningTime="2025-10-12 05:59:31.326752624 +0000 UTC m=+1103.868854389" watchObservedRunningTime="2025-10-12 05:59:31.346570007 +0000 UTC m=+1103.888671772" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.390011 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-696d7778c8-zcb9x" podStartSLOduration=3.389990498 podStartE2EDuration="3.389990498s" podCreationTimestamp="2025-10-12 05:59:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:59:31.360610987 +0000 UTC m=+1103.902712752" watchObservedRunningTime="2025-10-12 05:59:31.389990498 +0000 UTC m=+1103.932092263" Oct 12 05:59:31 crc 
kubenswrapper[4930]: I1012 05:59:31.514434 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-587996ddf4-fcrwq" podStartSLOduration=3.595459794 podStartE2EDuration="6.514407696s" podCreationTimestamp="2025-10-12 05:59:25 +0000 UTC" firstStartedPulling="2025-10-12 05:59:26.869496366 +0000 UTC m=+1099.411598131" lastFinishedPulling="2025-10-12 05:59:29.788444278 +0000 UTC m=+1102.330546033" observedRunningTime="2025-10-12 05:59:31.401829663 +0000 UTC m=+1103.943931428" watchObservedRunningTime="2025-10-12 05:59:31.514407696 +0000 UTC m=+1104.056509451" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.526661 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 12 05:59:31 crc kubenswrapper[4930]: E1012 05:59:31.527109 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ff807c-d810-4ff9-9ed3-3b2d37d3fbed" containerName="cinder-db-sync" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.527127 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ff807c-d810-4ff9-9ed3-3b2d37d3fbed" containerName="cinder-db-sync" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.527299 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ff807c-d810-4ff9-9ed3-3b2d37d3fbed" containerName="cinder-db-sync" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.530102 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.534430 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.534574 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.534699 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fcjrp" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.535206 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.566824 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.614911 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-config-data\") pod \"cinder-scheduler-0\" (UID: \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.614963 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.614991 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:31 crc 
kubenswrapper[4930]: I1012 05:59:31.615017 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.615052 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-scripts\") pod \"cinder-scheduler-0\" (UID: \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.615081 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqs9b\" (UniqueName: \"kubernetes.io/projected/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-kube-api-access-xqs9b\") pod \"cinder-scheduler-0\" (UID: \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.616639 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764bcc8bff-dcspv"] Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.651846 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b8db6db6f-7hlvd"] Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.653662 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b8db6db6f-7hlvd" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.676297 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b8db6db6f-7hlvd"] Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.717665 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-config-data\") pod \"cinder-scheduler-0\" (UID: \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.717704 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.717723 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.717721 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.717756 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.717785 4930 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-scripts\") pod \"cinder-scheduler-0\" (UID: \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.717804 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqs9b\" (UniqueName: \"kubernetes.io/projected/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-kube-api-access-xqs9b\") pod \"cinder-scheduler-0\" (UID: \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.718072 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.719367 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.724793 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.732469 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.732823 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-config-data\") pod \"cinder-scheduler-0\" (UID: \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.745319 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-scripts\") pod \"cinder-scheduler-0\" (UID: \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.747074 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.763390 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqs9b\" (UniqueName: \"kubernetes.io/projected/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-kube-api-access-xqs9b\") pod \"cinder-scheduler-0\" (UID: \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.768013 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.821943 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " pod="openstack/cinder-api-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.822007 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-logs\") pod \"cinder-api-0\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " pod="openstack/cinder-api-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.822024 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-scripts\") pod \"cinder-api-0\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " pod="openstack/cinder-api-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.822074 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd7qw\" (UniqueName: \"kubernetes.io/projected/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-kube-api-access-fd7qw\") pod \"dnsmasq-dns-6b8db6db6f-7hlvd\" (UID: \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\") " pod="openstack/dnsmasq-dns-6b8db6db6f-7hlvd" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.822103 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-dns-svc\") pod \"dnsmasq-dns-6b8db6db6f-7hlvd\" (UID: \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\") " pod="openstack/dnsmasq-dns-6b8db6db6f-7hlvd" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.822122 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-ovsdbserver-sb\") pod \"dnsmasq-dns-6b8db6db6f-7hlvd\" (UID: \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\") " pod="openstack/dnsmasq-dns-6b8db6db6f-7hlvd" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.822147 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-config\") pod \"dnsmasq-dns-6b8db6db6f-7hlvd\" (UID: \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\") " pod="openstack/dnsmasq-dns-6b8db6db6f-7hlvd" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.822162 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-config-data\") pod \"cinder-api-0\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " pod="openstack/cinder-api-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.822176 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " pod="openstack/cinder-api-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.822245 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-config-data-custom\") pod 
\"cinder-api-0\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " pod="openstack/cinder-api-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.822264 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-dns-swift-storage-0\") pod \"dnsmasq-dns-6b8db6db6f-7hlvd\" (UID: \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\") " pod="openstack/dnsmasq-dns-6b8db6db6f-7hlvd" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.822281 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-ovsdbserver-nb\") pod \"dnsmasq-dns-6b8db6db6f-7hlvd\" (UID: \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\") " pod="openstack/dnsmasq-dns-6b8db6db6f-7hlvd" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.822304 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6gdv\" (UniqueName: \"kubernetes.io/projected/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-kube-api-access-t6gdv\") pod \"cinder-api-0\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " pod="openstack/cinder-api-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.890212 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.925268 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-logs\") pod \"cinder-api-0\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " pod="openstack/cinder-api-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.925308 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-scripts\") pod \"cinder-api-0\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " pod="openstack/cinder-api-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.925362 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd7qw\" (UniqueName: \"kubernetes.io/projected/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-kube-api-access-fd7qw\") pod \"dnsmasq-dns-6b8db6db6f-7hlvd\" (UID: \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\") " pod="openstack/dnsmasq-dns-6b8db6db6f-7hlvd" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.925392 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-dns-svc\") pod \"dnsmasq-dns-6b8db6db6f-7hlvd\" (UID: \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\") " pod="openstack/dnsmasq-dns-6b8db6db6f-7hlvd" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.925422 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-ovsdbserver-sb\") pod \"dnsmasq-dns-6b8db6db6f-7hlvd\" (UID: \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\") " pod="openstack/dnsmasq-dns-6b8db6db6f-7hlvd" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.925442 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-config\") pod \"dnsmasq-dns-6b8db6db6f-7hlvd\" (UID: \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\") " pod="openstack/dnsmasq-dns-6b8db6db6f-7hlvd" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.925458 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-config-data\") pod \"cinder-api-0\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " pod="openstack/cinder-api-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.925475 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " pod="openstack/cinder-api-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.925547 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-config-data-custom\") pod \"cinder-api-0\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " pod="openstack/cinder-api-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.925564 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-dns-swift-storage-0\") pod \"dnsmasq-dns-6b8db6db6f-7hlvd\" (UID: \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\") " pod="openstack/dnsmasq-dns-6b8db6db6f-7hlvd" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.925580 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-ovsdbserver-nb\") pod \"dnsmasq-dns-6b8db6db6f-7hlvd\" (UID: \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\") " pod="openstack/dnsmasq-dns-6b8db6db6f-7hlvd" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.925600 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6gdv\" (UniqueName: \"kubernetes.io/projected/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-kube-api-access-t6gdv\") pod \"cinder-api-0\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " pod="openstack/cinder-api-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.925621 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " pod="openstack/cinder-api-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.926383 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-logs\") pod \"cinder-api-0\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " pod="openstack/cinder-api-0" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.926584 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-config\") pod \"dnsmasq-dns-6b8db6db6f-7hlvd\" (UID: \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\") " pod="openstack/dnsmasq-dns-6b8db6db6f-7hlvd" Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.927193 4930 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-dns-svc\") pod \"dnsmasq-dns-6b8db6db6f-7hlvd\" (UID: \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\") " pod="openstack/dnsmasq-dns-6b8db6db6f-7hlvd"
Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.927712 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-ovsdbserver-sb\") pod \"dnsmasq-dns-6b8db6db6f-7hlvd\" (UID: \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\") " pod="openstack/dnsmasq-dns-6b8db6db6f-7hlvd"
Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.930887 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-ovsdbserver-nb\") pod \"dnsmasq-dns-6b8db6db6f-7hlvd\" (UID: \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\") " pod="openstack/dnsmasq-dns-6b8db6db6f-7hlvd"
Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.931401 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-dns-swift-storage-0\") pod \"dnsmasq-dns-6b8db6db6f-7hlvd\" (UID: \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\") " pod="openstack/dnsmasq-dns-6b8db6db6f-7hlvd"
Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.931686 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " pod="openstack/cinder-api-0"
Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.935468 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-config-data-custom\") pod \"cinder-api-0\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " pod="openstack/cinder-api-0"
Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.938065 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " pod="openstack/cinder-api-0"
Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.943366 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-scripts\") pod \"cinder-api-0\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " pod="openstack/cinder-api-0"
Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.945914 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-config-data\") pod \"cinder-api-0\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " pod="openstack/cinder-api-0"
Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.949507 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6gdv\" (UniqueName: \"kubernetes.io/projected/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-kube-api-access-t6gdv\") pod \"cinder-api-0\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " pod="openstack/cinder-api-0"
Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.955389 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd7qw\" (UniqueName: \"kubernetes.io/projected/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-kube-api-access-fd7qw\") pod \"dnsmasq-dns-6b8db6db6f-7hlvd\" (UID: \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\") " pod="openstack/dnsmasq-dns-6b8db6db6f-7hlvd"
Oct 12 05:59:31 crc kubenswrapper[4930]: I1012 05:59:31.994304 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b8db6db6f-7hlvd"
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.161819 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"]
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.162025 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="8f64aa33-3396-4700-bb0f-57b16e39e368" containerName="watcher-api-log" containerID="cri-o://2d144f341e028cb128d6fc363b8ab8768ff57f1a0e734f0e3a9ab7fead6f1aae" gracePeriod=30
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.162321 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="8f64aa33-3396-4700-bb0f-57b16e39e368" containerName="watcher-api" containerID="cri-o://9b4e5324382ab8226f2902432ecafac57ec1761f1c6be848d7be65b1937d48bb" gracePeriod=30
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.169642 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.275223 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-jj2m5"
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.336585 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/448db83a-c0af-4680-890f-24b8d8da1088-config-data\") pod \"448db83a-c0af-4680-890f-24b8d8da1088\" (UID: \"448db83a-c0af-4680-890f-24b8d8da1088\") "
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.336928 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr9gg\" (UniqueName: \"kubernetes.io/projected/448db83a-c0af-4680-890f-24b8d8da1088-kube-api-access-lr9gg\") pod \"448db83a-c0af-4680-890f-24b8d8da1088\" (UID: \"448db83a-c0af-4680-890f-24b8d8da1088\") "
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.337036 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/448db83a-c0af-4680-890f-24b8d8da1088-combined-ca-bundle\") pod \"448db83a-c0af-4680-890f-24b8d8da1088\" (UID: \"448db83a-c0af-4680-890f-24b8d8da1088\") "
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.337084 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/448db83a-c0af-4680-890f-24b8d8da1088-db-sync-config-data\") pod \"448db83a-c0af-4680-890f-24b8d8da1088\" (UID: \"448db83a-c0af-4680-890f-24b8d8da1088\") "
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.347221 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/448db83a-c0af-4680-890f-24b8d8da1088-kube-api-access-lr9gg" (OuterVolumeSpecName: "kube-api-access-lr9gg") pod "448db83a-c0af-4680-890f-24b8d8da1088" (UID: "448db83a-c0af-4680-890f-24b8d8da1088"). InnerVolumeSpecName "kube-api-access-lr9gg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.354948 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/448db83a-c0af-4680-890f-24b8d8da1088-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "448db83a-c0af-4680-890f-24b8d8da1088" (UID: "448db83a-c0af-4680-890f-24b8d8da1088"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.370124 4930 generic.go:334] "Generic (PLEG): container finished" podID="8f64aa33-3396-4700-bb0f-57b16e39e368" containerID="2d144f341e028cb128d6fc363b8ab8768ff57f1a0e734f0e3a9ab7fead6f1aae" exitCode=143
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.370192 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"8f64aa33-3396-4700-bb0f-57b16e39e368","Type":"ContainerDied","Data":"2d144f341e028cb128d6fc363b8ab8768ff57f1a0e734f0e3a9ab7fead6f1aae"}
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.372809 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-jj2m5"
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.372992 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jj2m5" event={"ID":"448db83a-c0af-4680-890f-24b8d8da1088","Type":"ContainerDied","Data":"9a267634c47516a8610d7b803a05e7cf1334339d9b2bd497dc7a1bd3ba6c4736"}
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.373023 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a267634c47516a8610d7b803a05e7cf1334339d9b2bd497dc7a1bd3ba6c4736"
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.373141 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" podUID="25638a0f-8164-4d69-91ec-305a67a6ffc8" containerName="dnsmasq-dns" containerID="cri-o://4359e8a38e253c36010491695331d4650cefae8e367913c63242d8c7807d4df9" gracePeriod=10
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.383926 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/448db83a-c0af-4680-890f-24b8d8da1088-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "448db83a-c0af-4680-890f-24b8d8da1088" (UID: "448db83a-c0af-4680-890f-24b8d8da1088"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.393044 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0"
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.395606 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/448db83a-c0af-4680-890f-24b8d8da1088-config-data" (OuterVolumeSpecName: "config-data") pod "448db83a-c0af-4680-890f-24b8d8da1088" (UID: "448db83a-c0af-4680-890f-24b8d8da1088"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.439550 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/448db83a-c0af-4680-890f-24b8d8da1088-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.439580 4930 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/448db83a-c0af-4680-890f-24b8d8da1088-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.439589 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/448db83a-c0af-4680-890f-24b8d8da1088-config-data\") on node \"crc\" DevicePath \"\""
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.439601 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr9gg\" (UniqueName: \"kubernetes.io/projected/448db83a-c0af-4680-890f-24b8d8da1088-kube-api-access-lr9gg\") on node \"crc\" DevicePath \"\""
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.463642 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0"
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.646656 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b8db6db6f-7hlvd"]
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.790818 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b8db6db6f-7hlvd"]
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.893998 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75958fc765-rbf85"]
Oct 12 05:59:32 crc kubenswrapper[4930]: E1012 05:59:32.894957 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="448db83a-c0af-4680-890f-24b8d8da1088" containerName="glance-db-sync"
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.894975 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="448db83a-c0af-4680-890f-24b8d8da1088" containerName="glance-db-sync"
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.895360 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="448db83a-c0af-4680-890f-24b8d8da1088" containerName="glance-db-sync"
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.897010 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75958fc765-rbf85"
Oct 12 05:59:32 crc kubenswrapper[4930]: I1012 05:59:32.932092 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75958fc765-rbf85"]
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.019839 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd9sh\" (UniqueName: \"kubernetes.io/projected/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-kube-api-access-xd9sh\") pod \"dnsmasq-dns-75958fc765-rbf85\" (UID: \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\") " pod="openstack/dnsmasq-dns-75958fc765-rbf85"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.020377 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-ovsdbserver-sb\") pod \"dnsmasq-dns-75958fc765-rbf85\" (UID: \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\") " pod="openstack/dnsmasq-dns-75958fc765-rbf85"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.020438 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-config\") pod \"dnsmasq-dns-75958fc765-rbf85\" (UID: \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\") " pod="openstack/dnsmasq-dns-75958fc765-rbf85"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.020521 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-ovsdbserver-nb\") pod \"dnsmasq-dns-75958fc765-rbf85\" (UID: \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\") " pod="openstack/dnsmasq-dns-75958fc765-rbf85"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.020555 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-dns-swift-storage-0\") pod \"dnsmasq-dns-75958fc765-rbf85\" (UID: \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\") " pod="openstack/dnsmasq-dns-75958fc765-rbf85"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.020600 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-dns-svc\") pod \"dnsmasq-dns-75958fc765-rbf85\" (UID: \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\") " pod="openstack/dnsmasq-dns-75958fc765-rbf85"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.071759 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.123827 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-ovsdbserver-sb\") pod \"dnsmasq-dns-75958fc765-rbf85\" (UID: \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\") " pod="openstack/dnsmasq-dns-75958fc765-rbf85"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.123915 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-config\") pod \"dnsmasq-dns-75958fc765-rbf85\" (UID: \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\") " pod="openstack/dnsmasq-dns-75958fc765-rbf85"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.123985 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-ovsdbserver-nb\") pod \"dnsmasq-dns-75958fc765-rbf85\" (UID: \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\") " pod="openstack/dnsmasq-dns-75958fc765-rbf85"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.124012 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-dns-swift-storage-0\") pod \"dnsmasq-dns-75958fc765-rbf85\" (UID: \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\") " pod="openstack/dnsmasq-dns-75958fc765-rbf85"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.124057 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-dns-svc\") pod \"dnsmasq-dns-75958fc765-rbf85\" (UID: \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\") " pod="openstack/dnsmasq-dns-75958fc765-rbf85"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.124081 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd9sh\" (UniqueName: \"kubernetes.io/projected/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-kube-api-access-xd9sh\") pod \"dnsmasq-dns-75958fc765-rbf85\" (UID: \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\") " pod="openstack/dnsmasq-dns-75958fc765-rbf85"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.125118 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-ovsdbserver-sb\") pod \"dnsmasq-dns-75958fc765-rbf85\" (UID: \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\") " pod="openstack/dnsmasq-dns-75958fc765-rbf85"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.125635 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-config\") pod \"dnsmasq-dns-75958fc765-rbf85\" (UID: \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\") " pod="openstack/dnsmasq-dns-75958fc765-rbf85"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.126576 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-dns-swift-storage-0\") pod \"dnsmasq-dns-75958fc765-rbf85\" (UID: \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\") " pod="openstack/dnsmasq-dns-75958fc765-rbf85"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.127164 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-ovsdbserver-nb\") pod \"dnsmasq-dns-75958fc765-rbf85\" (UID: \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\") " pod="openstack/dnsmasq-dns-75958fc765-rbf85"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.127409 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-dns-svc\") pod \"dnsmasq-dns-75958fc765-rbf85\" (UID: \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\") " pod="openstack/dnsmasq-dns-75958fc765-rbf85"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.136294 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-578d784664-rp79z"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.197786 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd9sh\" (UniqueName: \"kubernetes.io/projected/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-kube-api-access-xd9sh\") pod \"dnsmasq-dns-75958fc765-rbf85\" (UID: \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\") " pod="openstack/dnsmasq-dns-75958fc765-rbf85"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.315873 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 12 05:59:33 crc kubenswrapper[4930]: W1012 05:59:33.340475 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9354ea26_d24c_49ef_b564_b58a6f3e4f3f.slice/crio-83268b87516f300737ac7ca3b2e22a11166c24f847642b838202ca242ebb05b1 WatchSource:0}: Error finding container 83268b87516f300737ac7ca3b2e22a11166c24f847642b838202ca242ebb05b1: Status 404 returned error can't find the container with id 83268b87516f300737ac7ca3b2e22a11166c24f847642b838202ca242ebb05b1
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.402372 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b8db6db6f-7hlvd" event={"ID":"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129","Type":"ContainerStarted","Data":"1e3e922731f26e2e87af0540c0556e109d22c2c71c86bfc8b28130bc0300c153"}
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.411066 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764bcc8bff-dcspv"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.414673 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9354ea26-d24c-49ef-b564-b58a6f3e4f3f","Type":"ContainerStarted","Data":"83268b87516f300737ac7ca3b2e22a11166c24f847642b838202ca242ebb05b1"}
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.419679 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75958fc765-rbf85"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.449988 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-ovsdbserver-nb\") pod \"25638a0f-8164-4d69-91ec-305a67a6ffc8\" (UID: \"25638a0f-8164-4d69-91ec-305a67a6ffc8\") "
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.450045 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-ovsdbserver-sb\") pod \"25638a0f-8164-4d69-91ec-305a67a6ffc8\" (UID: \"25638a0f-8164-4d69-91ec-305a67a6ffc8\") "
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.450226 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-dns-swift-storage-0\") pod \"25638a0f-8164-4d69-91ec-305a67a6ffc8\" (UID: \"25638a0f-8164-4d69-91ec-305a67a6ffc8\") "
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.450271 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-config\") pod \"25638a0f-8164-4d69-91ec-305a67a6ffc8\" (UID: \"25638a0f-8164-4d69-91ec-305a67a6ffc8\") "
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.450338 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gt4s\" (UniqueName: \"kubernetes.io/projected/25638a0f-8164-4d69-91ec-305a67a6ffc8-kube-api-access-9gt4s\") pod \"25638a0f-8164-4d69-91ec-305a67a6ffc8\" (UID: \"25638a0f-8164-4d69-91ec-305a67a6ffc8\") "
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.450400 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-dns-svc\") pod \"25638a0f-8164-4d69-91ec-305a67a6ffc8\" (UID: \"25638a0f-8164-4d69-91ec-305a67a6ffc8\") "
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.450978 4930 generic.go:334] "Generic (PLEG): container finished" podID="25638a0f-8164-4d69-91ec-305a67a6ffc8" containerID="4359e8a38e253c36010491695331d4650cefae8e367913c63242d8c7807d4df9" exitCode=0
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.451061 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" event={"ID":"25638a0f-8164-4d69-91ec-305a67a6ffc8","Type":"ContainerDied","Data":"4359e8a38e253c36010491695331d4650cefae8e367913c63242d8c7807d4df9"}
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.451089 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764bcc8bff-dcspv" event={"ID":"25638a0f-8164-4d69-91ec-305a67a6ffc8","Type":"ContainerDied","Data":"ab5eb63a155d4fbb7bfbefa2809cd7e69eda7ce4231c6eba4e8bdeb217191226"}
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.451106 4930 scope.go:117] "RemoveContainer" containerID="4359e8a38e253c36010491695331d4650cefae8e367913c63242d8c7807d4df9"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.451240 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764bcc8bff-dcspv"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.468568 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"df83ee2b-e484-4a2a-a619-4e9aadeb19a1","Type":"ContainerStarted","Data":"266ddf63427777ee248c4be7892a17fc87f86426387f54bfd5a0a14a00153de4"}
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.475502 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25638a0f-8164-4d69-91ec-305a67a6ffc8-kube-api-access-9gt4s" (OuterVolumeSpecName: "kube-api-access-9gt4s") pod "25638a0f-8164-4d69-91ec-305a67a6ffc8" (UID: "25638a0f-8164-4d69-91ec-305a67a6ffc8"). InnerVolumeSpecName "kube-api-access-9gt4s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.534909 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.552675 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gt4s\" (UniqueName: \"kubernetes.io/projected/25638a0f-8164-4d69-91ec-305a67a6ffc8-kube-api-access-9gt4s\") on node \"crc\" DevicePath \"\""
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.581369 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "25638a0f-8164-4d69-91ec-305a67a6ffc8" (UID: "25638a0f-8164-4d69-91ec-305a67a6ffc8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.663071 4930 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.686638 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.695575 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "25638a0f-8164-4d69-91ec-305a67a6ffc8" (UID: "25638a0f-8164-4d69-91ec-305a67a6ffc8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.756244 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-config" (OuterVolumeSpecName: "config") pod "25638a0f-8164-4d69-91ec-305a67a6ffc8" (UID: "25638a0f-8164-4d69-91ec-305a67a6ffc8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.763279 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "25638a0f-8164-4d69-91ec-305a67a6ffc8" (UID: "25638a0f-8164-4d69-91ec-305a67a6ffc8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.764098 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "25638a0f-8164-4d69-91ec-305a67a6ffc8" (UID: "25638a0f-8164-4d69-91ec-305a67a6ffc8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.764627 4930 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.764648 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-config\") on node \"crc\" DevicePath \"\""
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.764657 4930 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.764666 4930 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25638a0f-8164-4d69-91ec-305a67a6ffc8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.768071 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.798256 4930 scope.go:117] "RemoveContainer" containerID="14d8eab0cdf5380ec92be5602d643b864f911b7db6f045574464d78d71286431"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.828190 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 12 05:59:33 crc kubenswrapper[4930]: E1012 05:59:33.828761 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25638a0f-8164-4d69-91ec-305a67a6ffc8" containerName="init"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.828778 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="25638a0f-8164-4d69-91ec-305a67a6ffc8" containerName="init"
Oct 12 05:59:33 crc kubenswrapper[4930]: E1012 05:59:33.828796 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25638a0f-8164-4d69-91ec-305a67a6ffc8" containerName="dnsmasq-dns"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.828802 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="25638a0f-8164-4d69-91ec-305a67a6ffc8" containerName="dnsmasq-dns"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.829010 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="25638a0f-8164-4d69-91ec-305a67a6ffc8" containerName="dnsmasq-dns"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.830361 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.844838 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.846378 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6jg95"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.846416 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.851831 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.979831 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.979881 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77trt\" (UniqueName: \"kubernetes.io/projected/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-kube-api-access-77trt\") pod \"glance-default-external-api-0\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.979917 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.979963 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-config-data\") pod \"glance-default-external-api-0\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.980008 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-scripts\") pod \"glance-default-external-api-0\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.980274 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:33 crc kubenswrapper[4930]: I1012 05:59:33.980417 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-logs\") pod \"glance-default-external-api-0\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.015573 4930 scope.go:117] "RemoveContainer" containerID="4359e8a38e253c36010491695331d4650cefae8e367913c63242d8c7807d4df9"
Oct 12 05:59:34 crc kubenswrapper[4930]: E1012 05:59:34.023551 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4359e8a38e253c36010491695331d4650cefae8e367913c63242d8c7807d4df9\": container with ID starting with 4359e8a38e253c36010491695331d4650cefae8e367913c63242d8c7807d4df9 not found: ID does not exist" containerID="4359e8a38e253c36010491695331d4650cefae8e367913c63242d8c7807d4df9"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.023597 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4359e8a38e253c36010491695331d4650cefae8e367913c63242d8c7807d4df9"} err="failed to get container status \"4359e8a38e253c36010491695331d4650cefae8e367913c63242d8c7807d4df9\": rpc error: code = NotFound desc = could not find container \"4359e8a38e253c36010491695331d4650cefae8e367913c63242d8c7807d4df9\": container with ID starting with 4359e8a38e253c36010491695331d4650cefae8e367913c63242d8c7807d4df9 not found: ID does not exist"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.023624 4930 scope.go:117] "RemoveContainer" containerID="14d8eab0cdf5380ec92be5602d643b864f911b7db6f045574464d78d71286431"
Oct 12 05:59:34 crc kubenswrapper[4930]: E1012 05:59:34.026993 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14d8eab0cdf5380ec92be5602d643b864f911b7db6f045574464d78d71286431\": container with ID starting with 14d8eab0cdf5380ec92be5602d643b864f911b7db6f045574464d78d71286431 not found: ID does not exist" containerID="14d8eab0cdf5380ec92be5602d643b864f911b7db6f045574464d78d71286431"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.027031 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14d8eab0cdf5380ec92be5602d643b864f911b7db6f045574464d78d71286431"} err="failed to get container status \"14d8eab0cdf5380ec92be5602d643b864f911b7db6f045574464d78d71286431\": rpc error: code = NotFound desc = could not find container \"14d8eab0cdf5380ec92be5602d643b864f911b7db6f045574464d78d71286431\": container with ID starting with 14d8eab0cdf5380ec92be5602d643b864f911b7db6f045574464d78d71286431 not found: ID does not exist"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.081859 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77trt\" (UniqueName: \"kubernetes.io/projected/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-kube-api-access-77trt\") pod \"glance-default-external-api-0\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.081910 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.081934 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-config-data\") pod \"glance-default-external-api-0\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.081965 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-scripts\") pod \"glance-default-external-api-0\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.082024 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.082067 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-logs\") pod \"glance-default-external-api-0\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.082101 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.085559 4930 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.092658 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.092927 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-logs\") pod \"glance-default-external-api-0\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.097745 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.108887 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-config-data\") pod \"glance-default-external-api-0\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.131586 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-scripts\") pod \"glance-default-external-api-0\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.150591 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77trt\" (UniqueName: \"kubernetes.io/projected/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-kube-api-access-77trt\") pod \"glance-default-external-api-0\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.190450 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.241429 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.251030 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.251141 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.267137 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764bcc8bff-dcspv"]
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.269177 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.283518 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764bcc8bff-dcspv"]
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.406700 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/603009e7-da31-49c2-bb29-baad64c52187-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.406788 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.406824 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/603009e7-da31-49c2-bb29-baad64c52187-scripts\") pod \"glance-default-internal-api-0\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.406853 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/603009e7-da31-49c2-bb29-baad64c52187-config-data\") pod \"glance-default-internal-api-0\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.406878 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/603009e7-da31-49c2-bb29-baad64c52187-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.406953 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cptwf\" (UniqueName: \"kubernetes.io/projected/603009e7-da31-49c2-bb29-baad64c52187-kube-api-access-cptwf\") pod \"glance-default-internal-api-0\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.406982 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/603009e7-da31-49c2-bb29-baad64c52187-logs\") pod \"glance-default-internal-api-0\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.479039 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.487600 4930 generic.go:334] "Generic (PLEG): container finished" podID="7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129" containerID="d7461c7d344584fcfb6c787a72918bb576b0b6e05af930d2fdbb6007c45bf1ab" exitCode=0
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.487756 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b8db6db6f-7hlvd" event={"ID":"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129","Type":"ContainerDied","Data":"d7461c7d344584fcfb6c787a72918bb576b0b6e05af930d2fdbb6007c45bf1ab"}
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.488558 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.508961 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cptwf\" (UniqueName: \"kubernetes.io/projected/603009e7-da31-49c2-bb29-baad64c52187-kube-api-access-cptwf\") pod \"glance-default-internal-api-0\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.509021 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/603009e7-da31-49c2-bb29-baad64c52187-logs\") pod \"glance-default-internal-api-0\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.509045 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/603009e7-da31-49c2-bb29-baad64c52187-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.509089 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.509129 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/603009e7-da31-49c2-bb29-baad64c52187-scripts\") pod \"glance-default-internal-api-0\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.509160 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/603009e7-da31-49c2-bb29-baad64c52187-config-data\") pod \"glance-default-internal-api-0\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.509186 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/603009e7-da31-49c2-bb29-baad64c52187-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.510460 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/603009e7-da31-49c2-bb29-baad64c52187-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.511277 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/603009e7-da31-49c2-bb29-baad64c52187-logs\") pod \"glance-default-internal-api-0\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.517018 4930 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.522635 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/603009e7-da31-49c2-bb29-baad64c52187-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.522836 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/603009e7-da31-49c2-bb29-baad64c52187-scripts\") pod \"glance-default-internal-api-0\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.523779 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/603009e7-da31-49c2-bb29-baad64c52187-config-data\") pod \"glance-default-internal-api-0\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.537022 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cptwf\" (UniqueName: \"kubernetes.io/projected/603009e7-da31-49c2-bb29-baad64c52187-kube-api-access-cptwf\") pod \"glance-default-internal-api-0\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.545248 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="8f64aa33-3396-4700-bb0f-57b16e39e368" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.164:9322/\": dial tcp 10.217.0.164:9322: connect: connection refused"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.545273 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="8f64aa33-3396-4700-bb0f-57b16e39e368" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9322/\": dial tcp 10.217.0.164:9322: connect: connection refused"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.578983 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.605543 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.612432 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.703439 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75958fc765-rbf85"]
Oct 12 05:59:34 crc kubenswrapper[4930]: I1012 05:59:34.900140 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b8db6db6f-7hlvd"
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.031725 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-ovsdbserver-nb\") pod \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\" (UID: \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\") "
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.032114 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-ovsdbserver-sb\") pod \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\" (UID: \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\") "
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.032149 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-config\") pod \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\" (UID: \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\") "
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.032179 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-dns-swift-storage-0\") pod \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\" (UID: \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\") "
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.032233 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-dns-svc\") pod \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\" (UID: \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\") "
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.032368 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd7qw\" (UniqueName: \"kubernetes.io/projected/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-kube-api-access-fd7qw\") pod \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\" (UID: \"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129\") "
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.061956 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-kube-api-access-fd7qw" (OuterVolumeSpecName: "kube-api-access-fd7qw") pod "7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129" (UID: "7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129"). InnerVolumeSpecName "kube-api-access-fd7qw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.065342 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129" (UID: "7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.127689 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129" (UID: "7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.130575 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129" (UID: "7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.134444 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-config" (OuterVolumeSpecName: "config") pod "7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129" (UID: "7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.137308 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-config\") on node \"crc\" DevicePath \"\""
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.137791 4930 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.137849 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd7qw\" (UniqueName: \"kubernetes.io/projected/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-kube-api-access-fd7qw\") on node \"crc\" DevicePath \"\""
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.137859 4930 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.137868 4930 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.200855 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129" (UID: "7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.248386 4930 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.288668 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.508331 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75958fc765-rbf85" event={"ID":"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3","Type":"ContainerStarted","Data":"721ba2e10bbe10d03f20cda1a6030e409e263e5fcd40f16789525d06be336d17"}
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.517382 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"df83ee2b-e484-4a2a-a619-4e9aadeb19a1","Type":"ContainerStarted","Data":"b4c8b0882b0dbe2c9092bf447bef37dbcc054a544fa3c0d20011e40b2a894d55"}
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.545938 4930 generic.go:334] "Generic (PLEG): container finished" podID="8f64aa33-3396-4700-bb0f-57b16e39e368" containerID="9b4e5324382ab8226f2902432ecafac57ec1761f1c6be848d7be65b1937d48bb" exitCode=0
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.545998 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"8f64aa33-3396-4700-bb0f-57b16e39e368","Type":"ContainerDied","Data":"9b4e5324382ab8226f2902432ecafac57ec1761f1c6be848d7be65b1937d48bb"}
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.550293 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b8db6db6f-7hlvd" event={"ID":"7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129","Type":"ContainerDied","Data":"1e3e922731f26e2e87af0540c0556e109d22c2c71c86bfc8b28130bc0300c153"}
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.550355 4930 scope.go:117] "RemoveContainer" containerID="d7461c7d344584fcfb6c787a72918bb576b0b6e05af930d2fdbb6007c45bf1ab"
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.550535 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b8db6db6f-7hlvd"
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.558125 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9354ea26-d24c-49ef-b564-b58a6f3e4f3f","Type":"ContainerStarted","Data":"ed8bf4918693de7f7434f05fe2be6d14f443570721d49aecde6d90044bbea9b7"}
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.563023 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6844c9655c-rvdcz"
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.687928 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b8db6db6f-7hlvd"]
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.720794 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b8db6db6f-7hlvd"]
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.724507 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-578d784664-rp79z"]
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.725256 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-578d784664-rp79z" podUID="c9000baf-8bd7-4d76-a2c8-b98ad280728d" containerName="neutron-httpd" containerID="cri-o://64524e145955403b7c027d759d117e90588d66231377c4ce4125c8a86458ba4b" gracePeriod=30
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.731178 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-578d784664-rp79z" podUID="c9000baf-8bd7-4d76-a2c8-b98ad280728d" containerName="neutron-api" containerID="cri-o://ed5390836965969e0f55b579f3d611d101a852bbae842bf2a578de46b26a78e6" gracePeriod=30
Oct 12 05:59:35 crc kubenswrapper[4930]: I1012 05:59:35.850959 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 12 05:59:36 crc kubenswrapper[4930]: I1012 05:59:36.167966 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25638a0f-8164-4d69-91ec-305a67a6ffc8" path="/var/lib/kubelet/pods/25638a0f-8164-4d69-91ec-305a67a6ffc8/volumes"
Oct 12 05:59:36 crc kubenswrapper[4930]: I1012 05:59:36.168535 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129" path="/var/lib/kubelet/pods/7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129/volumes"
Oct 12 05:59:36 crc kubenswrapper[4930]: I1012 05:59:36.195939 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6778cd8bb8-9zhz5" podUID="30049dfb-04f2-455c-a949-08bd6ff892d0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 12 05:59:36 crc kubenswrapper[4930]: I1012 05:59:36.318913 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6d76466876-jf9t8" podUID="a97771f5-bcbe-42d8-bdd8-41b43f8899a0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.162:8443/dashboard/auth/login/?next=/dashboard/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 12 05:59:36 crc kubenswrapper[4930]: I1012 05:59:36.579635 4930 generic.go:334] "Generic (PLEG): container finished" podID="59f4bae4-1a84-449a-be72-e735294116e6" containerID="893b8e00eaeb9c339e5a9528038e3102dd99c50f64a2531fff869300585ed13a" exitCode=1
Oct 12 05:59:36 crc kubenswrapper[4930]: I1012 05:59:36.579718 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"59f4bae4-1a84-449a-be72-e735294116e6","Type":"ContainerDied","Data":"893b8e00eaeb9c339e5a9528038e3102dd99c50f64a2531fff869300585ed13a"}
Oct 12 05:59:36 crc kubenswrapper[4930]: I1012 05:59:36.580297 4930 scope.go:117] "RemoveContainer" containerID="893b8e00eaeb9c339e5a9528038e3102dd99c50f64a2531fff869300585ed13a"
Oct 12 05:59:36 crc kubenswrapper[4930]: E1012 05:59:36.580562 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(59f4bae4-1a84-449a-be72-e735294116e6)\"" pod="openstack/watcher-decision-engine-0" podUID="59f4bae4-1a84-449a-be72-e735294116e6"
Oct 12 05:59:36 crc kubenswrapper[4930]: I1012 05:59:36.591880 4930 generic.go:334] "Generic (PLEG): container finished" podID="c9000baf-8bd7-4d76-a2c8-b98ad280728d" containerID="64524e145955403b7c027d759d117e90588d66231377c4ce4125c8a86458ba4b" exitCode=0
Oct 12 05:59:36 crc kubenswrapper[4930]: I1012 05:59:36.591925 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-578d784664-rp79z" event={"ID":"c9000baf-8bd7-4d76-a2c8-b98ad280728d","Type":"ContainerDied","Data":"64524e145955403b7c027d759d117e90588d66231377c4ce4125c8a86458ba4b"}
Oct 12 05:59:37 crc kubenswrapper[4930]: I1012 05:59:37.442061 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-86668db68d-p5ppf"
Oct 12 05:59:37 crc kubenswrapper[4930]: I1012 05:59:37.603307 4930 scope.go:117] "RemoveContainer" containerID="893b8e00eaeb9c339e5a9528038e3102dd99c50f64a2531fff869300585ed13a"
Oct 12 05:59:37 crc kubenswrapper[4930]: E1012 05:59:37.603695 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(59f4bae4-1a84-449a-be72-e735294116e6)\"" pod="openstack/watcher-decision-engine-0" podUID="59f4bae4-1a84-449a-be72-e735294116e6"
Oct 12 05:59:38 crc kubenswrapper[4930]: I1012 05:59:38.313249 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-86668db68d-p5ppf"
Oct 12 05:59:38 crc kubenswrapper[4930]: I1012 05:59:38.458290 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 12 05:59:38 crc kubenswrapper[4930]: I1012 05:59:38.623569 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 12 05:59:39 crc kubenswrapper[4930]: I1012 05:59:39.661786 4930 generic.go:334] "Generic (PLEG): container finished" podID="9dd57cc3-6793-42f9-b938-620f968192c3" containerID="8392226d6c0e7f43f3b4908ab41afde31479dee244db52b006703321a58fafe6" exitCode=137
Oct 12 05:59:39 crc kubenswrapper[4930]: I1012 05:59:39.661949 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86b8d48789-7rd2m" event={"ID":"9dd57cc3-6793-42f9-b938-620f968192c3","Type":"ContainerDied","Data":"8392226d6c0e7f43f3b4908ab41afde31479dee244db52b006703321a58fafe6"}
Oct 12 05:59:39 crc kubenswrapper[4930]: I1012 05:59:39.936496 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Oct 12 05:59:40 crc kubenswrapper[4930]: I1012 05:59:40.636574 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-696d7778c8-zcb9x"
Oct 12 05:59:40 crc kubenswrapper[4930]: I1012 05:59:40.670053 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-696d7778c8-zcb9x"
Oct 12 05:59:40 crc kubenswrapper[4930]: I1012 05:59:40.748765 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-86668db68d-p5ppf"]
Oct 12 05:59:40 crc kubenswrapper[4930]: I1012 05:59:40.749001 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-86668db68d-p5ppf" podUID="86141d8f-1f9e-469a-9023-63d5a56ad33b" containerName="barbican-api-log" containerID="cri-o://b62d22932abf6bc9933c0371e28e61dd3dfe0b8d8d8c7ccd6ff55d33d8b9755a" gracePeriod=30
Oct 12 05:59:40 crc kubenswrapper[4930]: I1012 05:59:40.749149 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-86668db68d-p5ppf" podUID="86141d8f-1f9e-469a-9023-63d5a56ad33b" containerName="barbican-api" containerID="cri-o://950e2f08ba01d47b3d518920328fbb0f364b05d802d234516294f27276958930" gracePeriod=30
Oct 12 05:59:40 crc kubenswrapper[4930]: I1012 05:59:40.760197 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-86668db68d-p5ppf" podUID="86141d8f-1f9e-469a-9023-63d5a56ad33b" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.175:9311/healthcheck\": EOF"
Oct 12 05:59:40 crc kubenswrapper[4930]: I1012 05:59:40.763203 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-86668db68d-p5ppf" podUID="86141d8f-1f9e-469a-9023-63d5a56ad33b" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.175:9311/healthcheck\": EOF"
Oct 12 05:59:41 crc kubenswrapper[4930]: I1012 05:59:41.688105 4930 generic.go:334] "Generic (PLEG): container finished" podID="86141d8f-1f9e-469a-9023-63d5a56ad33b" containerID="b62d22932abf6bc9933c0371e28e61dd3dfe0b8d8d8c7ccd6ff55d33d8b9755a" exitCode=143
Oct 12 05:59:41 crc kubenswrapper[4930]: I1012 05:59:41.688246 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86668db68d-p5ppf" event={"ID":"86141d8f-1f9e-469a-9023-63d5a56ad33b","Type":"ContainerDied","Data":"b62d22932abf6bc9933c0371e28e61dd3dfe0b8d8d8c7ccd6ff55d33d8b9755a"}
Oct 12 05:59:42 crc kubenswrapper[4930]: I1012 05:59:42.716571 4930 generic.go:334] "Generic (PLEG): container finished" podID="b69d2329-3396-4e67-9880-940dacef7e56" containerID="d3b1d7f06ed0fc2481170337c30d12182ca4b0c017afe2dc33827d41b0c934d2" exitCode=137
Oct 12 05:59:42 crc kubenswrapper[4930]: I1012 05:59:42.716902 4930 generic.go:334] "Generic (PLEG): container finished" podID="b69d2329-3396-4e67-9880-940dacef7e56" containerID="47c711d16000c5d14e73813bcff6b60f98928393a79b11066202e3ecdf688e9e" exitCode=137
Oct 12 05:59:42 crc kubenswrapper[4930]: I1012 05:59:42.716781 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84fd96956f-5sqc8" event={"ID":"b69d2329-3396-4e67-9880-940dacef7e56","Type":"ContainerDied","Data":"d3b1d7f06ed0fc2481170337c30d12182ca4b0c017afe2dc33827d41b0c934d2"}
Oct 12 05:59:42 crc kubenswrapper[4930]: I1012 05:59:42.716974 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84fd96956f-5sqc8" event={"ID":"b69d2329-3396-4e67-9880-940dacef7e56","Type":"ContainerDied","Data":"47c711d16000c5d14e73813bcff6b60f98928393a79b11066202e3ecdf688e9e"}
Oct 12 05:59:42 crc
kubenswrapper[4930]: I1012 05:59:42.720239 4930 generic.go:334] "Generic (PLEG): container finished" podID="86141d8f-1f9e-469a-9023-63d5a56ad33b" containerID="950e2f08ba01d47b3d518920328fbb0f364b05d802d234516294f27276958930" exitCode=0 Oct 12 05:59:42 crc kubenswrapper[4930]: I1012 05:59:42.720285 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86668db68d-p5ppf" event={"ID":"86141d8f-1f9e-469a-9023-63d5a56ad33b","Type":"ContainerDied","Data":"950e2f08ba01d47b3d518920328fbb0f364b05d802d234516294f27276958930"} Oct 12 05:59:42 crc kubenswrapper[4930]: I1012 05:59:42.722524 4930 generic.go:334] "Generic (PLEG): container finished" podID="c9000baf-8bd7-4d76-a2c8-b98ad280728d" containerID="ed5390836965969e0f55b579f3d611d101a852bbae842bf2a578de46b26a78e6" exitCode=0 Oct 12 05:59:42 crc kubenswrapper[4930]: I1012 05:59:42.722564 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-578d784664-rp79z" event={"ID":"c9000baf-8bd7-4d76-a2c8-b98ad280728d","Type":"ContainerDied","Data":"ed5390836965969e0f55b579f3d611d101a852bbae842bf2a578de46b26a78e6"} Oct 12 05:59:42 crc kubenswrapper[4930]: I1012 05:59:42.724646 4930 generic.go:334] "Generic (PLEG): container finished" podID="f549aa26-b902-4497-838b-6b80e635897c" containerID="7383ee320f6575bed0c7fd6bd7510da07b91c2256f049f463e141453b7afb156" exitCode=137 Oct 12 05:59:42 crc kubenswrapper[4930]: I1012 05:59:42.724666 4930 generic.go:334] "Generic (PLEG): container finished" podID="f549aa26-b902-4497-838b-6b80e635897c" containerID="8e39d8ecb42b78b16591060ef87ea5c3018f7d42f7b006b9f645b2a67c37e569" exitCode=137 Oct 12 05:59:42 crc kubenswrapper[4930]: I1012 05:59:42.724680 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6df4dd4f95-cwhwt" event={"ID":"f549aa26-b902-4497-838b-6b80e635897c","Type":"ContainerDied","Data":"7383ee320f6575bed0c7fd6bd7510da07b91c2256f049f463e141453b7afb156"} Oct 12 05:59:42 crc kubenswrapper[4930]: I1012 05:59:42.724694 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6df4dd4f95-cwhwt" event={"ID":"f549aa26-b902-4497-838b-6b80e635897c","Type":"ContainerDied","Data":"8e39d8ecb42b78b16591060ef87ea5c3018f7d42f7b006b9f645b2a67c37e569"} Oct 12 05:59:43 crc kubenswrapper[4930]: W1012 05:59:43.179029 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a78b1b6_f6b5_4357_ab16_db5f1de38a60.slice/crio-13c6e1230dd18e3318d7800e2722860a0bfea15424eb72ab348cd3284a183ef2 WatchSource:0}: Error finding container 13c6e1230dd18e3318d7800e2722860a0bfea15424eb72ab348cd3284a183ef2: Status 404 returned error can't find the container with id 13c6e1230dd18e3318d7800e2722860a0bfea15424eb72ab348cd3284a183ef2 Oct 12 05:59:43 crc kubenswrapper[4930]: W1012 05:59:43.192998 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod603009e7_da31_49c2_bb29_baad64c52187.slice/crio-92eec786802858f170a8dc65a07aba1caf94f77c4d67ebcff59af9ed230313da WatchSource:0}: Error finding container 92eec786802858f170a8dc65a07aba1caf94f77c4d67ebcff59af9ed230313da: Status 404 returned error can't find the container with id 92eec786802858f170a8dc65a07aba1caf94f77c4d67ebcff59af9ed230313da Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.266673 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 
05:59:43.406717 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.464968 4930 scope.go:117] "RemoveContainer" containerID="2b5d1d70dd62ea91af5c3a935efb2c2f47ca551be7765052523abfd189021dc2" Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.505601 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.657783 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f64aa33-3396-4700-bb0f-57b16e39e368-logs\") pod \"8f64aa33-3396-4700-bb0f-57b16e39e368\" (UID: \"8f64aa33-3396-4700-bb0f-57b16e39e368\") " Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.658169 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8f64aa33-3396-4700-bb0f-57b16e39e368-custom-prometheus-ca\") pod \"8f64aa33-3396-4700-bb0f-57b16e39e368\" (UID: \"8f64aa33-3396-4700-bb0f-57b16e39e368\") " Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.658200 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f64aa33-3396-4700-bb0f-57b16e39e368-combined-ca-bundle\") pod \"8f64aa33-3396-4700-bb0f-57b16e39e368\" (UID: \"8f64aa33-3396-4700-bb0f-57b16e39e368\") " Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.658266 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f64aa33-3396-4700-bb0f-57b16e39e368-config-data\") pod \"8f64aa33-3396-4700-bb0f-57b16e39e368\" (UID: \"8f64aa33-3396-4700-bb0f-57b16e39e368\") " Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.658357 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgkvv\" (UniqueName: \"kubernetes.io/projected/8f64aa33-3396-4700-bb0f-57b16e39e368-kube-api-access-lgkvv\") pod \"8f64aa33-3396-4700-bb0f-57b16e39e368\" (UID: \"8f64aa33-3396-4700-bb0f-57b16e39e368\") " Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.658769 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f64aa33-3396-4700-bb0f-57b16e39e368-logs" (OuterVolumeSpecName: "logs") pod "8f64aa33-3396-4700-bb0f-57b16e39e368" (UID: "8f64aa33-3396-4700-bb0f-57b16e39e368"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.659174 4930 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f64aa33-3396-4700-bb0f-57b16e39e368-logs\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.673908 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f64aa33-3396-4700-bb0f-57b16e39e368-kube-api-access-lgkvv" (OuterVolumeSpecName: "kube-api-access-lgkvv") pod "8f64aa33-3396-4700-bb0f-57b16e39e368" (UID: "8f64aa33-3396-4700-bb0f-57b16e39e368"). InnerVolumeSpecName "kube-api-access-lgkvv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.685530 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.686192 4930 scope.go:117] "RemoveContainer" containerID="893b8e00eaeb9c339e5a9528038e3102dd99c50f64a2531fff869300585ed13a" Oct 12 05:59:43 crc kubenswrapper[4930]: E1012 05:59:43.686485 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(59f4bae4-1a84-449a-be72-e735294116e6)\"" pod="openstack/watcher-decision-engine-0" podUID="59f4bae4-1a84-449a-be72-e735294116e6" Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.721767 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f64aa33-3396-4700-bb0f-57b16e39e368-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "8f64aa33-3396-4700-bb0f-57b16e39e368" (UID: "8f64aa33-3396-4700-bb0f-57b16e39e368"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.724816 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f64aa33-3396-4700-bb0f-57b16e39e368-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f64aa33-3396-4700-bb0f-57b16e39e368" (UID: "8f64aa33-3396-4700-bb0f-57b16e39e368"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.766031 4930 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8f64aa33-3396-4700-bb0f-57b16e39e368-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.766064 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f64aa33-3396-4700-bb0f-57b16e39e368-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.766074 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgkvv\" (UniqueName: \"kubernetes.io/projected/8f64aa33-3396-4700-bb0f-57b16e39e368-kube-api-access-lgkvv\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.768111 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"8f64aa33-3396-4700-bb0f-57b16e39e368","Type":"ContainerDied","Data":"8a152e29d5ae29833f14b5cc8de4880c5e8cda4535d42bb3040c5018fe681ec2"} Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.768200 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.776168 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"603009e7-da31-49c2-bb29-baad64c52187","Type":"ContainerStarted","Data":"92eec786802858f170a8dc65a07aba1caf94f77c4d67ebcff59af9ed230313da"} Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.779161 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f64aa33-3396-4700-bb0f-57b16e39e368-config-data" (OuterVolumeSpecName: "config-data") pod "8f64aa33-3396-4700-bb0f-57b16e39e368" (UID: "8f64aa33-3396-4700-bb0f-57b16e39e368"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.785958 4930 generic.go:334] "Generic (PLEG): container finished" podID="0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3" containerID="e9bc605ff3a8ee85dab30fa3131dfbafb181071c13d955a23d4dfcb804046adc" exitCode=0 Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.786017 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75958fc765-rbf85" event={"ID":"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3","Type":"ContainerDied","Data":"e9bc605ff3a8ee85dab30fa3131dfbafb181071c13d955a23d4dfcb804046adc"} Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.800000 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6a78b1b6-f6b5-4357-ab16-db5f1de38a60","Type":"ContainerStarted","Data":"13c6e1230dd18e3318d7800e2722860a0bfea15424eb72ab348cd3284a183ef2"} Oct 12 05:59:43 crc kubenswrapper[4930]: I1012 05:59:43.867936 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f64aa33-3396-4700-bb0f-57b16e39e368-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.115368 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.124214 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.130006 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Oct 12 05:59:44 crc kubenswrapper[4930]: E1012 05:59:44.130386 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129" containerName="init" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.130397 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129" containerName="init" Oct 12 05:59:44 crc kubenswrapper[4930]: E1012 05:59:44.130429 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f64aa33-3396-4700-bb0f-57b16e39e368" containerName="watcher-api-log" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.130435 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f64aa33-3396-4700-bb0f-57b16e39e368" containerName="watcher-api-log" Oct 12 05:59:44 crc kubenswrapper[4930]: E1012 05:59:44.130447 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f64aa33-3396-4700-bb0f-57b16e39e368" containerName="watcher-api" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.130452 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f64aa33-3396-4700-bb0f-57b16e39e368" containerName="watcher-api" Oct 12 
05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.130612 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f64aa33-3396-4700-bb0f-57b16e39e368" containerName="watcher-api" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.130643 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f64aa33-3396-4700-bb0f-57b16e39e368" containerName="watcher-api-log" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.130660 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e5c6e7f-ac1c-4d3f-b9da-8d1a2dd58129" containerName="init" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.134181 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.140724 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.141041 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.141152 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.164947 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f64aa33-3396-4700-bb0f-57b16e39e368" path="/var/lib/kubelet/pods/8f64aa33-3396-4700-bb0f-57b16e39e368/volumes" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.165753 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.288957 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/eb68a1f2-d5d6-4fea-b29a-bc253bfc919d-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"eb68a1f2-d5d6-4fea-b29a-bc253bfc919d\") " pod="openstack/watcher-api-0" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.289008 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb68a1f2-d5d6-4fea-b29a-bc253bfc919d-logs\") pod \"watcher-api-0\" (UID: \"eb68a1f2-d5d6-4fea-b29a-bc253bfc919d\") " pod="openstack/watcher-api-0" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.289053 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb68a1f2-d5d6-4fea-b29a-bc253bfc919d-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"eb68a1f2-d5d6-4fea-b29a-bc253bfc919d\") " pod="openstack/watcher-api-0" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.289088 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb68a1f2-d5d6-4fea-b29a-bc253bfc919d-config-data\") pod \"watcher-api-0\" (UID: \"eb68a1f2-d5d6-4fea-b29a-bc253bfc919d\") " pod="openstack/watcher-api-0" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.289104 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddmbm\" (UniqueName: \"kubernetes.io/projected/eb68a1f2-d5d6-4fea-b29a-bc253bfc919d-kube-api-access-ddmbm\") pod \"watcher-api-0\" (UID: \"eb68a1f2-d5d6-4fea-b29a-bc253bfc919d\") " pod="openstack/watcher-api-0" Oct 
12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.289186 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb68a1f2-d5d6-4fea-b29a-bc253bfc919d-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"eb68a1f2-d5d6-4fea-b29a-bc253bfc919d\") " pod="openstack/watcher-api-0" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.289211 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb68a1f2-d5d6-4fea-b29a-bc253bfc919d-public-tls-certs\") pod \"watcher-api-0\" (UID: \"eb68a1f2-d5d6-4fea-b29a-bc253bfc919d\") " pod="openstack/watcher-api-0" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.390805 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb68a1f2-d5d6-4fea-b29a-bc253bfc919d-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"eb68a1f2-d5d6-4fea-b29a-bc253bfc919d\") " pod="openstack/watcher-api-0" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.390886 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb68a1f2-d5d6-4fea-b29a-bc253bfc919d-public-tls-certs\") pod \"watcher-api-0\" (UID: \"eb68a1f2-d5d6-4fea-b29a-bc253bfc919d\") " pod="openstack/watcher-api-0" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.390964 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/eb68a1f2-d5d6-4fea-b29a-bc253bfc919d-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"eb68a1f2-d5d6-4fea-b29a-bc253bfc919d\") " pod="openstack/watcher-api-0" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.391004 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb68a1f2-d5d6-4fea-b29a-bc253bfc919d-logs\") pod \"watcher-api-0\" (UID: \"eb68a1f2-d5d6-4fea-b29a-bc253bfc919d\") " pod="openstack/watcher-api-0" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.391045 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb68a1f2-d5d6-4fea-b29a-bc253bfc919d-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"eb68a1f2-d5d6-4fea-b29a-bc253bfc919d\") " pod="openstack/watcher-api-0" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.391094 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb68a1f2-d5d6-4fea-b29a-bc253bfc919d-config-data\") pod \"watcher-api-0\" (UID: \"eb68a1f2-d5d6-4fea-b29a-bc253bfc919d\") " pod="openstack/watcher-api-0" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.391120 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddmbm\" (UniqueName: \"kubernetes.io/projected/eb68a1f2-d5d6-4fea-b29a-bc253bfc919d-kube-api-access-ddmbm\") pod \"watcher-api-0\" (UID: \"eb68a1f2-d5d6-4fea-b29a-bc253bfc919d\") " pod="openstack/watcher-api-0" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.392007 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb68a1f2-d5d6-4fea-b29a-bc253bfc919d-logs\") pod \"watcher-api-0\" (UID: 
\"eb68a1f2-d5d6-4fea-b29a-bc253bfc919d\") " pod="openstack/watcher-api-0" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.397243 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/eb68a1f2-d5d6-4fea-b29a-bc253bfc919d-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"eb68a1f2-d5d6-4fea-b29a-bc253bfc919d\") " pod="openstack/watcher-api-0" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.397550 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb68a1f2-d5d6-4fea-b29a-bc253bfc919d-public-tls-certs\") pod \"watcher-api-0\" (UID: \"eb68a1f2-d5d6-4fea-b29a-bc253bfc919d\") " pod="openstack/watcher-api-0" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.399973 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb68a1f2-d5d6-4fea-b29a-bc253bfc919d-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"eb68a1f2-d5d6-4fea-b29a-bc253bfc919d\") " pod="openstack/watcher-api-0" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.401495 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb68a1f2-d5d6-4fea-b29a-bc253bfc919d-config-data\") pod \"watcher-api-0\" (UID: \"eb68a1f2-d5d6-4fea-b29a-bc253bfc919d\") " pod="openstack/watcher-api-0" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.413509 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb68a1f2-d5d6-4fea-b29a-bc253bfc919d-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"eb68a1f2-d5d6-4fea-b29a-bc253bfc919d\") " pod="openstack/watcher-api-0" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.413910 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddmbm\" (UniqueName: \"kubernetes.io/projected/eb68a1f2-d5d6-4fea-b29a-bc253bfc919d-kube-api-access-ddmbm\") pod \"watcher-api-0\" (UID: \"eb68a1f2-d5d6-4fea-b29a-bc253bfc919d\") " pod="openstack/watcher-api-0" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.523346 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.544045 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="8f64aa33-3396-4700-bb0f-57b16e39e368" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.164:9322/\": dial tcp 10.217.0.164:9322: i/o timeout (Client.Timeout exceeded while awaiting headers)" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.549637 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="8f64aa33-3396-4700-bb0f-57b16e39e368" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.658918 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-86668db68d-p5ppf" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.660464 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-578d784664-rp79z" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.669803 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86b8d48789-7rd2m" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.681632 4930 scope.go:117] "RemoveContainer" containerID="9b4e5324382ab8226f2902432ecafac57ec1761f1c6be848d7be65b1937d48bb" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.693546 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6df4dd4f95-cwhwt" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.722833 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84fd96956f-5sqc8" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.803061 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9dd57cc3-6793-42f9-b938-620f968192c3-horizon-secret-key\") pod \"9dd57cc3-6793-42f9-b938-620f968192c3\" (UID: \"9dd57cc3-6793-42f9-b938-620f968192c3\") " Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.803089 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgghz\" (UniqueName: \"kubernetes.io/projected/86141d8f-1f9e-469a-9023-63d5a56ad33b-kube-api-access-rgghz\") pod \"86141d8f-1f9e-469a-9023-63d5a56ad33b\" (UID: \"86141d8f-1f9e-469a-9023-63d5a56ad33b\") " Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.803107 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86141d8f-1f9e-469a-9023-63d5a56ad33b-config-data-custom\") pod \"86141d8f-1f9e-469a-9023-63d5a56ad33b\" (UID: \"86141d8f-1f9e-469a-9023-63d5a56ad33b\") " Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.803126 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86141d8f-1f9e-469a-9023-63d5a56ad33b-combined-ca-bundle\") pod \"86141d8f-1f9e-469a-9023-63d5a56ad33b\" (UID: \"86141d8f-1f9e-469a-9023-63d5a56ad33b\") " Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.803144 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9000baf-8bd7-4d76-a2c8-b98ad280728d-combined-ca-bundle\") pod \"c9000baf-8bd7-4d76-a2c8-b98ad280728d\" (UID: \"c9000baf-8bd7-4d76-a2c8-b98ad280728d\") " Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.803183 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dd57cc3-6793-42f9-b938-620f968192c3-logs\") pod \"9dd57cc3-6793-42f9-b938-620f968192c3\" (UID: \"9dd57cc3-6793-42f9-b938-620f968192c3\") " Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.803201 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9dd57cc3-6793-42f9-b938-620f968192c3-scripts\") pod \"9dd57cc3-6793-42f9-b938-620f968192c3\" (UID: \"9dd57cc3-6793-42f9-b938-620f968192c3\") " Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.803245 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9000baf-8bd7-4d76-a2c8-b98ad280728d-ovndb-tls-certs\") pod 
\"c9000baf-8bd7-4d76-a2c8-b98ad280728d\" (UID: \"c9000baf-8bd7-4d76-a2c8-b98ad280728d\") " Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.803288 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86141d8f-1f9e-469a-9023-63d5a56ad33b-logs\") pod \"86141d8f-1f9e-469a-9023-63d5a56ad33b\" (UID: \"86141d8f-1f9e-469a-9023-63d5a56ad33b\") " Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.803345 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx59v\" (UniqueName: \"kubernetes.io/projected/9dd57cc3-6793-42f9-b938-620f968192c3-kube-api-access-rx59v\") pod \"9dd57cc3-6793-42f9-b938-620f968192c3\" (UID: \"9dd57cc3-6793-42f9-b938-620f968192c3\") " Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.803366 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f549aa26-b902-4497-838b-6b80e635897c-logs\") pod \"f549aa26-b902-4497-838b-6b80e635897c\" (UID: \"f549aa26-b902-4497-838b-6b80e635897c\") " Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.803381 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f549aa26-b902-4497-838b-6b80e635897c-horizon-secret-key\") pod \"f549aa26-b902-4497-838b-6b80e635897c\" (UID: \"f549aa26-b902-4497-838b-6b80e635897c\") " Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.803401 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9dd57cc3-6793-42f9-b938-620f968192c3-config-data\") pod \"9dd57cc3-6793-42f9-b938-620f968192c3\" (UID: \"9dd57cc3-6793-42f9-b938-620f968192c3\") " Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.803438 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f549aa26-b902-4497-838b-6b80e635897c-config-data\") pod \"f549aa26-b902-4497-838b-6b80e635897c\" (UID: \"f549aa26-b902-4497-838b-6b80e635897c\") " Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.803502 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqp9m\" (UniqueName: \"kubernetes.io/projected/c9000baf-8bd7-4d76-a2c8-b98ad280728d-kube-api-access-lqp9m\") pod \"c9000baf-8bd7-4d76-a2c8-b98ad280728d\" (UID: \"c9000baf-8bd7-4d76-a2c8-b98ad280728d\") " Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.803531 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c9000baf-8bd7-4d76-a2c8-b98ad280728d-httpd-config\") pod \"c9000baf-8bd7-4d76-a2c8-b98ad280728d\" (UID: \"c9000baf-8bd7-4d76-a2c8-b98ad280728d\") " Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.803557 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6sgc\" (UniqueName: \"kubernetes.io/projected/f549aa26-b902-4497-838b-6b80e635897c-kube-api-access-z6sgc\") pod \"f549aa26-b902-4497-838b-6b80e635897c\" (UID: \"f549aa26-b902-4497-838b-6b80e635897c\") " Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.803579 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86141d8f-1f9e-469a-9023-63d5a56ad33b-config-data\") pod 
\"86141d8f-1f9e-469a-9023-63d5a56ad33b\" (UID: \"86141d8f-1f9e-469a-9023-63d5a56ad33b\") " Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.803611 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f549aa26-b902-4497-838b-6b80e635897c-scripts\") pod \"f549aa26-b902-4497-838b-6b80e635897c\" (UID: \"f549aa26-b902-4497-838b-6b80e635897c\") " Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.803642 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c9000baf-8bd7-4d76-a2c8-b98ad280728d-config\") pod \"c9000baf-8bd7-4d76-a2c8-b98ad280728d\" (UID: \"c9000baf-8bd7-4d76-a2c8-b98ad280728d\") " Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.804704 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86141d8f-1f9e-469a-9023-63d5a56ad33b-logs" (OuterVolumeSpecName: "logs") pod "86141d8f-1f9e-469a-9023-63d5a56ad33b" (UID: "86141d8f-1f9e-469a-9023-63d5a56ad33b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.806890 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dd57cc3-6793-42f9-b938-620f968192c3-logs" (OuterVolumeSpecName: "logs") pod "9dd57cc3-6793-42f9-b938-620f968192c3" (UID: "9dd57cc3-6793-42f9-b938-620f968192c3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.834069 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f549aa26-b902-4497-838b-6b80e635897c-logs" (OuterVolumeSpecName: "logs") pod "f549aa26-b902-4497-838b-6b80e635897c" (UID: "f549aa26-b902-4497-838b-6b80e635897c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.846133 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f549aa26-b902-4497-838b-6b80e635897c-kube-api-access-z6sgc" (OuterVolumeSpecName: "kube-api-access-z6sgc") pod "f549aa26-b902-4497-838b-6b80e635897c" (UID: "f549aa26-b902-4497-838b-6b80e635897c"). InnerVolumeSpecName "kube-api-access-z6sgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.846249 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dd57cc3-6793-42f9-b938-620f968192c3-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "9dd57cc3-6793-42f9-b938-620f968192c3" (UID: "9dd57cc3-6793-42f9-b938-620f968192c3"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.851046 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f549aa26-b902-4497-838b-6b80e635897c-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f549aa26-b902-4497-838b-6b80e635897c" (UID: "f549aa26-b902-4497-838b-6b80e635897c"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.878321 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86141d8f-1f9e-469a-9023-63d5a56ad33b-kube-api-access-rgghz" (OuterVolumeSpecName: "kube-api-access-rgghz") pod "86141d8f-1f9e-469a-9023-63d5a56ad33b" (UID: "86141d8f-1f9e-469a-9023-63d5a56ad33b"). InnerVolumeSpecName "kube-api-access-rgghz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.888993 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86141d8f-1f9e-469a-9023-63d5a56ad33b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "86141d8f-1f9e-469a-9023-63d5a56ad33b" (UID: "86141d8f-1f9e-469a-9023-63d5a56ad33b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.889508 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dd57cc3-6793-42f9-b938-620f968192c3-kube-api-access-rx59v" (OuterVolumeSpecName: "kube-api-access-rx59v") pod "9dd57cc3-6793-42f9-b938-620f968192c3" (UID: "9dd57cc3-6793-42f9-b938-620f968192c3"). InnerVolumeSpecName "kube-api-access-rx59v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.890894 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9000baf-8bd7-4d76-a2c8-b98ad280728d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "c9000baf-8bd7-4d76-a2c8-b98ad280728d" (UID: "c9000baf-8bd7-4d76-a2c8-b98ad280728d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.891341 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9000baf-8bd7-4d76-a2c8-b98ad280728d-kube-api-access-lqp9m" (OuterVolumeSpecName: "kube-api-access-lqp9m") pod "c9000baf-8bd7-4d76-a2c8-b98ad280728d" (UID: "c9000baf-8bd7-4d76-a2c8-b98ad280728d"). InnerVolumeSpecName "kube-api-access-lqp9m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.905635 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b69d2329-3396-4e67-9880-940dacef7e56-horizon-secret-key\") pod \"b69d2329-3396-4e67-9880-940dacef7e56\" (UID: \"b69d2329-3396-4e67-9880-940dacef7e56\") " Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.905686 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b69d2329-3396-4e67-9880-940dacef7e56-config-data\") pod \"b69d2329-3396-4e67-9880-940dacef7e56\" (UID: \"b69d2329-3396-4e67-9880-940dacef7e56\") " Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.905846 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b69d2329-3396-4e67-9880-940dacef7e56-scripts\") pod \"b69d2329-3396-4e67-9880-940dacef7e56\" (UID: \"b69d2329-3396-4e67-9880-940dacef7e56\") " Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.905897 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b69d2329-3396-4e67-9880-940dacef7e56-logs\") pod \"b69d2329-3396-4e67-9880-940dacef7e56\" (UID: \"b69d2329-3396-4e67-9880-940dacef7e56\") " Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.905922 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk6hg\" (UniqueName: \"kubernetes.io/projected/b69d2329-3396-4e67-9880-940dacef7e56-kube-api-access-tk6hg\") pod \"b69d2329-3396-4e67-9880-940dacef7e56\" (UID: \"b69d2329-3396-4e67-9880-940dacef7e56\") " Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.906300 4930 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9dd57cc3-6793-42f9-b938-620f968192c3-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.906318 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgghz\" (UniqueName: \"kubernetes.io/projected/86141d8f-1f9e-469a-9023-63d5a56ad33b-kube-api-access-rgghz\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.906328 4930 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86141d8f-1f9e-469a-9023-63d5a56ad33b-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.906336 4930 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dd57cc3-6793-42f9-b938-620f968192c3-logs\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.906344 4930 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86141d8f-1f9e-469a-9023-63d5a56ad33b-logs\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.906353 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx59v\" (UniqueName: \"kubernetes.io/projected/9dd57cc3-6793-42f9-b938-620f968192c3-kube-api-access-rx59v\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.906361 4930 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f549aa26-b902-4497-838b-6b80e635897c-logs\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.906368 4930 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f549aa26-b902-4497-838b-6b80e635897c-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.906376 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqp9m\" (UniqueName: \"kubernetes.io/projected/c9000baf-8bd7-4d76-a2c8-b98ad280728d-kube-api-access-lqp9m\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.906384 4930 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c9000baf-8bd7-4d76-a2c8-b98ad280728d-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.906391 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6sgc\" (UniqueName: \"kubernetes.io/projected/f549aa26-b902-4497-838b-6b80e635897c-kube-api-access-z6sgc\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.918143 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b69d2329-3396-4e67-9880-940dacef7e56-logs" (OuterVolumeSpecName: "logs") pod "b69d2329-3396-4e67-9880-940dacef7e56" (UID: "b69d2329-3396-4e67-9880-940dacef7e56"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.930304 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dd57cc3-6793-42f9-b938-620f968192c3-scripts" (OuterVolumeSpecName: "scripts") pod "9dd57cc3-6793-42f9-b938-620f968192c3" (UID: "9dd57cc3-6793-42f9-b938-620f968192c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.932994 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b69d2329-3396-4e67-9880-940dacef7e56-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b69d2329-3396-4e67-9880-940dacef7e56" (UID: "b69d2329-3396-4e67-9880-940dacef7e56"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.957192 4930 scope.go:117] "RemoveContainer" containerID="2d144f341e028cb128d6fc363b8ab8768ff57f1a0e734f0e3a9ab7fead6f1aae" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.957511 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f549aa26-b902-4497-838b-6b80e635897c-config-data" (OuterVolumeSpecName: "config-data") pod "f549aa26-b902-4497-838b-6b80e635897c" (UID: "f549aa26-b902-4497-838b-6b80e635897c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.981553 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b69d2329-3396-4e67-9880-940dacef7e56-kube-api-access-tk6hg" (OuterVolumeSpecName: "kube-api-access-tk6hg") pod "b69d2329-3396-4e67-9880-940dacef7e56" (UID: "b69d2329-3396-4e67-9880-940dacef7e56"). InnerVolumeSpecName "kube-api-access-tk6hg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:59:44 crc kubenswrapper[4930]: I1012 05:59:44.993423 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dd57cc3-6793-42f9-b938-620f968192c3-config-data" (OuterVolumeSpecName: "config-data") pod "9dd57cc3-6793-42f9-b938-620f968192c3" (UID: "9dd57cc3-6793-42f9-b938-620f968192c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.010331 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84fd96956f-5sqc8" event={"ID":"b69d2329-3396-4e67-9880-940dacef7e56","Type":"ContainerDied","Data":"ef3edbe0788a2bbd3ac45ef204a0fa7190a8bcd0575a57e60101bc87b93f2ebc"} Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.010432 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84fd96956f-5sqc8" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.012022 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9dd57cc3-6793-42f9-b938-620f968192c3-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.012034 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f549aa26-b902-4497-838b-6b80e635897c-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.012043 4930 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b69d2329-3396-4e67-9880-940dacef7e56-logs\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.012051 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk6hg\" (UniqueName: \"kubernetes.io/projected/b69d2329-3396-4e67-9880-940dacef7e56-kube-api-access-tk6hg\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.012059 4930 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b69d2329-3396-4e67-9880-940dacef7e56-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.012068 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9dd57cc3-6793-42f9-b938-620f968192c3-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.012631 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86141d8f-1f9e-469a-9023-63d5a56ad33b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86141d8f-1f9e-469a-9023-63d5a56ad33b" (UID: "86141d8f-1f9e-469a-9023-63d5a56ad33b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.083814 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86668db68d-p5ppf" event={"ID":"86141d8f-1f9e-469a-9023-63d5a56ad33b","Type":"ContainerDied","Data":"ad24ca744ce212ae5d81ee9db0a6b5c7c43462a6afca1f6430c54b5495669545"} Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.083847 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-86668db68d-p5ppf" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.094676 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b69d2329-3396-4e67-9880-940dacef7e56-config-data" (OuterVolumeSpecName: "config-data") pod "b69d2329-3396-4e67-9880-940dacef7e56" (UID: "b69d2329-3396-4e67-9880-940dacef7e56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.097458 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b69d2329-3396-4e67-9880-940dacef7e56-scripts" (OuterVolumeSpecName: "scripts") pod "b69d2329-3396-4e67-9880-940dacef7e56" (UID: "b69d2329-3396-4e67-9880-940dacef7e56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.105478 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-578d784664-rp79z" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.105487 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-578d784664-rp79z" event={"ID":"c9000baf-8bd7-4d76-a2c8-b98ad280728d","Type":"ContainerDied","Data":"4cdb28b7d3b50e5ce551554a7ab24221224103e0f0e792b646717bca1024b489"} Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.122884 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86b8d48789-7rd2m" event={"ID":"9dd57cc3-6793-42f9-b938-620f968192c3","Type":"ContainerDied","Data":"0c62560e93c7a21b11e0e81a3eb1c9724b0fa387c2eb476a96c580480c31c257"} Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.123007 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86b8d48789-7rd2m" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.131107 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b69d2329-3396-4e67-9880-940dacef7e56-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.131139 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86141d8f-1f9e-469a-9023-63d5a56ad33b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.131148 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b69d2329-3396-4e67-9880-940dacef7e56-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.138293 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6df4dd4f95-cwhwt" event={"ID":"f549aa26-b902-4497-838b-6b80e635897c","Type":"ContainerDied","Data":"f8af6f942697e7e8f483d761214ec737acd6905fd522d134004bde7c580d7459"} Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.138362 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6df4dd4f95-cwhwt" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.150529 4930 scope.go:117] "RemoveContainer" containerID="d3b1d7f06ed0fc2481170337c30d12182ca4b0c017afe2dc33827d41b0c934d2" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.150897 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f549aa26-b902-4497-838b-6b80e635897c-scripts" (OuterVolumeSpecName: "scripts") pod "f549aa26-b902-4497-838b-6b80e635897c" (UID: "f549aa26-b902-4497-838b-6b80e635897c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.217512 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9000baf-8bd7-4d76-a2c8-b98ad280728d-config" (OuterVolumeSpecName: "config") pod "c9000baf-8bd7-4d76-a2c8-b98ad280728d" (UID: "c9000baf-8bd7-4d76-a2c8-b98ad280728d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.220024 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9000baf-8bd7-4d76-a2c8-b98ad280728d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "c9000baf-8bd7-4d76-a2c8-b98ad280728d" (UID: "c9000baf-8bd7-4d76-a2c8-b98ad280728d"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.223070 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86141d8f-1f9e-469a-9023-63d5a56ad33b-config-data" (OuterVolumeSpecName: "config-data") pod "86141d8f-1f9e-469a-9023-63d5a56ad33b" (UID: "86141d8f-1f9e-469a-9023-63d5a56ad33b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.223131 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86b8d48789-7rd2m"] Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.225811 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9000baf-8bd7-4d76-a2c8-b98ad280728d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9000baf-8bd7-4d76-a2c8-b98ad280728d" (UID: "c9000baf-8bd7-4d76-a2c8-b98ad280728d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.233296 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86141d8f-1f9e-469a-9023-63d5a56ad33b-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.233325 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f549aa26-b902-4497-838b-6b80e635897c-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.233334 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c9000baf-8bd7-4d76-a2c8-b98ad280728d-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.233343 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9000baf-8bd7-4d76-a2c8-b98ad280728d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.233352 4930 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9000baf-8bd7-4d76-a2c8-b98ad280728d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.237302 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-86b8d48789-7rd2m"] Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.287543 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 12 05:59:45 crc kubenswrapper[4930]: W1012 05:59:45.297868 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb68a1f2_d5d6_4fea_b29a_bc253bfc919d.slice/crio-0ed2ba16a79d0ab483202841694bb4dac51191b625184831cb376b7ffb52dac0 WatchSource:0}: Error finding container 0ed2ba16a79d0ab483202841694bb4dac51191b625184831cb376b7ffb52dac0: Status 404 returned error can't find the container with id 0ed2ba16a79d0ab483202841694bb4dac51191b625184831cb376b7ffb52dac0 Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.322602 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6d76466876-jf9t8" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.394875 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6778cd8bb8-9zhz5"] Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.395079 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6778cd8bb8-9zhz5" podUID="30049dfb-04f2-455c-a949-08bd6ff892d0" containerName="horizon-log" containerID="cri-o://ac8297145e46be91604457a14764e05292e58540ab5b16ca150f8ac1bf5273b1" gracePeriod=30 Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.395422 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6778cd8bb8-9zhz5" podUID="30049dfb-04f2-455c-a949-08bd6ff892d0" containerName="horizon" containerID="cri-o://61889e483f0c5377a242095541e1e60b08683d4d632d130c12343e4cefb25513" gracePeriod=30 Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.402699 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-84fd96956f-5sqc8"] Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.414218 4930 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/horizon-6778cd8bb8-9zhz5" podUID="30049dfb-04f2-455c-a949-08bd6ff892d0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.419249 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-84fd96956f-5sqc8"] Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.464250 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-86668db68d-p5ppf"] Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.478910 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-86668db68d-p5ppf"] Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.525174 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-578d784664-rp79z"] Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.537014 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-578d784664-rp79z"] Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.548750 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6df4dd4f95-cwhwt"] Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.561767 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6df4dd4f95-cwhwt"] Oct 12 05:59:45 crc kubenswrapper[4930]: E1012 05:59:45.622593 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="7b25d9dd-3bce-4f8d-9219-1d6ce75878ed" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.661291 4930 scope.go:117] "RemoveContainer" containerID="47c711d16000c5d14e73813bcff6b60f98928393a79b11066202e3ecdf688e9e" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.750459 4930 scope.go:117] "RemoveContainer" containerID="950e2f08ba01d47b3d518920328fbb0f364b05d802d234516294f27276958930" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.781307 4930 scope.go:117] "RemoveContainer" containerID="b62d22932abf6bc9933c0371e28e61dd3dfe0b8d8d8c7ccd6ff55d33d8b9755a" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.813275 4930 scope.go:117] "RemoveContainer" containerID="64524e145955403b7c027d759d117e90588d66231377c4ce4125c8a86458ba4b" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.837174 4930 scope.go:117] "RemoveContainer" containerID="ed5390836965969e0f55b579f3d611d101a852bbae842bf2a578de46b26a78e6" Oct 12 05:59:45 crc kubenswrapper[4930]: I1012 05:59:45.876680 4930 scope.go:117] "RemoveContainer" containerID="8392226d6c0e7f43f3b4908ab41afde31479dee244db52b006703321a58fafe6" Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.150143 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86141d8f-1f9e-469a-9023-63d5a56ad33b" path="/var/lib/kubelet/pods/86141d8f-1f9e-469a-9023-63d5a56ad33b/volumes" Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.153924 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dd57cc3-6793-42f9-b938-620f968192c3" path="/var/lib/kubelet/pods/9dd57cc3-6793-42f9-b938-620f968192c3/volumes" Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.156273 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b69d2329-3396-4e67-9880-940dacef7e56" path="/var/lib/kubelet/pods/b69d2329-3396-4e67-9880-940dacef7e56/volumes" Oct 12 05:59:46 crc 
kubenswrapper[4930]: I1012 05:59:46.157052 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9000baf-8bd7-4d76-a2c8-b98ad280728d" path="/var/lib/kubelet/pods/c9000baf-8bd7-4d76-a2c8-b98ad280728d/volumes" Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.158250 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f549aa26-b902-4497-838b-6b80e635897c" path="/var/lib/kubelet/pods/f549aa26-b902-4497-838b-6b80e635897c/volumes" Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.171705 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9354ea26-d24c-49ef-b564-b58a6f3e4f3f","Type":"ContainerStarted","Data":"28bbb5fa395f28f4e7ae80c42e89be51dda90d3198143558c12ff31e7215c216"} Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.171907 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9354ea26-d24c-49ef-b564-b58a6f3e4f3f" containerName="cinder-api-log" containerID="cri-o://ed8bf4918693de7f7434f05fe2be6d14f443570721d49aecde6d90044bbea9b7" gracePeriod=30 Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.171994 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9354ea26-d24c-49ef-b564-b58a6f3e4f3f" containerName="cinder-api" containerID="cri-o://28bbb5fa395f28f4e7ae80c42e89be51dda90d3198143558c12ff31e7215c216" gracePeriod=30 Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.172016 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.184955 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6a78b1b6-f6b5-4357-ab16-db5f1de38a60","Type":"ContainerStarted","Data":"3b37c1600f39b9db1bf134fc8659ad96237e2fdaafceca30b1fa4108cc4656dc"} Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.186513 4930 scope.go:117] "RemoveContainer" containerID="7383ee320f6575bed0c7fd6bd7510da07b91c2256f049f463e141453b7afb156" Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.198919 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed","Type":"ContainerStarted","Data":"9611075a0346ebe159fbb360e141dcbb57042d642d566c6fd1855c99d9d2b8e7"} Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.199526 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b25d9dd-3bce-4f8d-9219-1d6ce75878ed" containerName="ceilometer-notification-agent" containerID="cri-o://42eb9ff39af53c03159df9698802c263efb99e17ab668023f32c7e55890df52f" gracePeriod=30 Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.199861 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.200019 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=15.199988871 podStartE2EDuration="15.199988871s" podCreationTimestamp="2025-10-12 05:59:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:59:46.18950669 +0000 UTC m=+1118.731608455" watchObservedRunningTime="2025-10-12 05:59:46.199988871 +0000 UTC m=+1118.742090636" Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 
05:59:46.200293 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b25d9dd-3bce-4f8d-9219-1d6ce75878ed" containerName="proxy-httpd" containerID="cri-o://9611075a0346ebe159fbb360e141dcbb57042d642d566c6fd1855c99d9d2b8e7" gracePeriod=30 Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.200412 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b25d9dd-3bce-4f8d-9219-1d6ce75878ed" containerName="sg-core" containerID="cri-o://3cb02172b934592377e139a2782717713272e8fb6014aad096c3e8ffee26c3a3" gracePeriod=30 Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.217314 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75958fc765-rbf85" event={"ID":"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3","Type":"ContainerStarted","Data":"2f1f3c78bf7472b42b43be9c8798a04a99ab6f1b52b82784f82597baf1acf3c0"} Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.217779 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75958fc765-rbf85" Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.222052 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"df83ee2b-e484-4a2a-a619-4e9aadeb19a1","Type":"ContainerStarted","Data":"d0f8c31aa00150932c27bc723fef563ffb08cd71345032e256c141692453941f"} Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.257067 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"603009e7-da31-49c2-bb29-baad64c52187","Type":"ContainerStarted","Data":"0e64ee6cdf6980869dee57a9ad832e0d620e799cfee5c893051f33b28d44b5a3"} Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.260793 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75958fc765-rbf85" podStartSLOduration=14.260774305 podStartE2EDuration="14.260774305s" podCreationTimestamp="2025-10-12 05:59:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:59:46.2513317 +0000 UTC m=+1118.793433465" watchObservedRunningTime="2025-10-12 05:59:46.260774305 +0000 UTC m=+1118.802876070" Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.261892 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"eb68a1f2-d5d6-4fea-b29a-bc253bfc919d","Type":"ContainerStarted","Data":"d16e6d9a4254e1d28f9ed68010afaa2e42d65e73b4c849bc4d3ef53bdf8150a1"} Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.261927 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"eb68a1f2-d5d6-4fea-b29a-bc253bfc919d","Type":"ContainerStarted","Data":"0ed2ba16a79d0ab483202841694bb4dac51191b625184831cb376b7ffb52dac0"} Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.277880 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=14.327210361 podStartE2EDuration="15.27786373s" podCreationTimestamp="2025-10-12 05:59:31 +0000 UTC" firstStartedPulling="2025-10-12 05:59:33.072893628 +0000 UTC m=+1105.614995393" lastFinishedPulling="2025-10-12 05:59:34.023546997 +0000 UTC m=+1106.565648762" observedRunningTime="2025-10-12 05:59:46.270308482 +0000 UTC m=+1118.812410247" watchObservedRunningTime="2025-10-12 05:59:46.27786373 +0000 UTC m=+1118.819965495" Oct 12 05:59:46 crc 
kubenswrapper[4930]: I1012 05:59:46.539005 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6778cd8bb8-9zhz5" podUID="30049dfb-04f2-455c-a949-08bd6ff892d0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:40106->10.217.0.161:8443: read: connection reset by peer" Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.591265 4930 scope.go:117] "RemoveContainer" containerID="8e39d8ecb42b78b16591060ef87ea5c3018f7d42f7b006b9f645b2a67c37e569" Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.890807 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 12 05:59:46 crc kubenswrapper[4930]: I1012 05:59:46.892612 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="df83ee2b-e484-4a2a-a619-4e9aadeb19a1" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.177:8080/\": dial tcp 10.217.0.177:8080: connect: connection refused" Oct 12 05:59:47 crc kubenswrapper[4930]: I1012 05:59:47.274907 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6a78b1b6-f6b5-4357-ab16-db5f1de38a60","Type":"ContainerStarted","Data":"95a3d01d1e347fd8fea516b0f1338e51809127aff70212e4294fc2b4147f1202"} Oct 12 05:59:47 crc kubenswrapper[4930]: I1012 05:59:47.275048 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6a78b1b6-f6b5-4357-ab16-db5f1de38a60" containerName="glance-log" containerID="cri-o://3b37c1600f39b9db1bf134fc8659ad96237e2fdaafceca30b1fa4108cc4656dc" gracePeriod=30 Oct 12 05:59:47 crc kubenswrapper[4930]: I1012 05:59:47.275074 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6a78b1b6-f6b5-4357-ab16-db5f1de38a60" containerName="glance-httpd" containerID="cri-o://95a3d01d1e347fd8fea516b0f1338e51809127aff70212e4294fc2b4147f1202" gracePeriod=30 Oct 12 05:59:47 crc kubenswrapper[4930]: I1012 05:59:47.280267 4930 generic.go:334] "Generic (PLEG): container finished" podID="7b25d9dd-3bce-4f8d-9219-1d6ce75878ed" containerID="9611075a0346ebe159fbb360e141dcbb57042d642d566c6fd1855c99d9d2b8e7" exitCode=0 Oct 12 05:59:47 crc kubenswrapper[4930]: I1012 05:59:47.280359 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed","Type":"ContainerDied","Data":"9611075a0346ebe159fbb360e141dcbb57042d642d566c6fd1855c99d9d2b8e7"} Oct 12 05:59:47 crc kubenswrapper[4930]: I1012 05:59:47.280434 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed","Type":"ContainerDied","Data":"3cb02172b934592377e139a2782717713272e8fb6014aad096c3e8ffee26c3a3"} Oct 12 05:59:47 crc kubenswrapper[4930]: I1012 05:59:47.280370 4930 generic.go:334] "Generic (PLEG): container finished" podID="7b25d9dd-3bce-4f8d-9219-1d6ce75878ed" containerID="3cb02172b934592377e139a2782717713272e8fb6014aad096c3e8ffee26c3a3" exitCode=2 Oct 12 05:59:47 crc kubenswrapper[4930]: I1012 05:59:47.283553 4930 generic.go:334] "Generic (PLEG): container finished" podID="9354ea26-d24c-49ef-b564-b58a6f3e4f3f" containerID="ed8bf4918693de7f7434f05fe2be6d14f443570721d49aecde6d90044bbea9b7" exitCode=143 Oct 12 05:59:47 crc kubenswrapper[4930]: I1012 
05:59:47.283696 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9354ea26-d24c-49ef-b564-b58a6f3e4f3f","Type":"ContainerDied","Data":"ed8bf4918693de7f7434f05fe2be6d14f443570721d49aecde6d90044bbea9b7"} Oct 12 05:59:47 crc kubenswrapper[4930]: I1012 05:59:47.285600 4930 generic.go:334] "Generic (PLEG): container finished" podID="30049dfb-04f2-455c-a949-08bd6ff892d0" containerID="61889e483f0c5377a242095541e1e60b08683d4d632d130c12343e4cefb25513" exitCode=0 Oct 12 05:59:47 crc kubenswrapper[4930]: I1012 05:59:47.285684 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6778cd8bb8-9zhz5" event={"ID":"30049dfb-04f2-455c-a949-08bd6ff892d0","Type":"ContainerDied","Data":"61889e483f0c5377a242095541e1e60b08683d4d632d130c12343e4cefb25513"} Oct 12 05:59:47 crc kubenswrapper[4930]: I1012 05:59:47.287360 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"603009e7-da31-49c2-bb29-baad64c52187","Type":"ContainerStarted","Data":"c5377275cddfafcac70ce2d221dafc693acea843259da8f5dbf4f00e77ef590b"} Oct 12 05:59:47 crc kubenswrapper[4930]: I1012 05:59:47.287531 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="603009e7-da31-49c2-bb29-baad64c52187" containerName="glance-log" containerID="cri-o://0e64ee6cdf6980869dee57a9ad832e0d620e799cfee5c893051f33b28d44b5a3" gracePeriod=30 Oct 12 05:59:47 crc kubenswrapper[4930]: I1012 05:59:47.287788 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="603009e7-da31-49c2-bb29-baad64c52187" containerName="glance-httpd" containerID="cri-o://c5377275cddfafcac70ce2d221dafc693acea843259da8f5dbf4f00e77ef590b" gracePeriod=30 Oct 12 05:59:47 crc kubenswrapper[4930]: I1012 05:59:47.293678 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"eb68a1f2-d5d6-4fea-b29a-bc253bfc919d","Type":"ContainerStarted","Data":"77e6c0bd5a779e8ecf63a3b526228f2391ecf48fc0446e188ac9706f0a5ef97b"} Oct 12 05:59:47 crc kubenswrapper[4930]: I1012 05:59:47.294942 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 12 05:59:47 crc kubenswrapper[4930]: I1012 05:59:47.297214 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=15.297200038 podStartE2EDuration="15.297200038s" podCreationTimestamp="2025-10-12 05:59:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:59:47.295425924 +0000 UTC m=+1119.837527689" watchObservedRunningTime="2025-10-12 05:59:47.297200038 +0000 UTC m=+1119.839301813" Oct 12 05:59:47 crc kubenswrapper[4930]: I1012 05:59:47.331953 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=3.331923803 podStartE2EDuration="3.331923803s" podCreationTimestamp="2025-10-12 05:59:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:59:47.326418366 +0000 UTC m=+1119.868520151" watchObservedRunningTime="2025-10-12 05:59:47.331923803 +0000 UTC m=+1119.874025578" Oct 12 05:59:47 crc kubenswrapper[4930]: I1012 05:59:47.374887 4930 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=14.374867322 podStartE2EDuration="14.374867322s" podCreationTimestamp="2025-10-12 05:59:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:59:47.368459913 +0000 UTC m=+1119.910561678" watchObservedRunningTime="2025-10-12 05:59:47.374867322 +0000 UTC m=+1119.916969087" Oct 12 05:59:47 crc kubenswrapper[4930]: E1012 05:59:47.545352 4930 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a78b1b6_f6b5_4357_ab16_db5f1de38a60.slice/crio-3b37c1600f39b9db1bf134fc8659ad96237e2fdaafceca30b1fa4108cc4656dc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod603009e7_da31_49c2_bb29_baad64c52187.slice/crio-0e64ee6cdf6980869dee57a9ad832e0d620e799cfee5c893051f33b28d44b5a3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a78b1b6_f6b5_4357_ab16_db5f1de38a60.slice/crio-conmon-3b37c1600f39b9db1bf134fc8659ad96237e2fdaafceca30b1fa4108cc4656dc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod603009e7_da31_49c2_bb29_baad64c52187.slice/crio-c5377275cddfafcac70ce2d221dafc693acea843259da8f5dbf4f00e77ef590b.scope\": RecentStats: unable to find data in memory cache]" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.146037 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.229884 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.313382 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-httpd-run\") pod \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.313724 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/603009e7-da31-49c2-bb29-baad64c52187-combined-ca-bundle\") pod \"603009e7-da31-49c2-bb29-baad64c52187\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.313830 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77trt\" (UniqueName: \"kubernetes.io/projected/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-kube-api-access-77trt\") pod \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.313864 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-combined-ca-bundle\") pod \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.313880 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-config-data\") pod \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.313940 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cptwf\" (UniqueName: \"kubernetes.io/projected/603009e7-da31-49c2-bb29-baad64c52187-kube-api-access-cptwf\") pod \"603009e7-da31-49c2-bb29-baad64c52187\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.313970 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/603009e7-da31-49c2-bb29-baad64c52187-logs\") pod \"603009e7-da31-49c2-bb29-baad64c52187\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.313992 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/603009e7-da31-49c2-bb29-baad64c52187-config-data\") pod \"603009e7-da31-49c2-bb29-baad64c52187\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.314037 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"603009e7-da31-49c2-bb29-baad64c52187\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.314078 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/603009e7-da31-49c2-bb29-baad64c52187-scripts\") pod \"603009e7-da31-49c2-bb29-baad64c52187\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " Oct 12 
05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.314099 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-scripts\") pod \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.314125 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.314161 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-logs\") pod \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\" (UID: \"6a78b1b6-f6b5-4357-ab16-db5f1de38a60\") " Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.314190 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/603009e7-da31-49c2-bb29-baad64c52187-httpd-run\") pod \"603009e7-da31-49c2-bb29-baad64c52187\" (UID: \"603009e7-da31-49c2-bb29-baad64c52187\") " Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.314763 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/603009e7-da31-49c2-bb29-baad64c52187-logs" (OuterVolumeSpecName: "logs") pod "603009e7-da31-49c2-bb29-baad64c52187" (UID: "603009e7-da31-49c2-bb29-baad64c52187"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.316003 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/603009e7-da31-49c2-bb29-baad64c52187-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "603009e7-da31-49c2-bb29-baad64c52187" (UID: "603009e7-da31-49c2-bb29-baad64c52187"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.316353 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6a78b1b6-f6b5-4357-ab16-db5f1de38a60" (UID: "6a78b1b6-f6b5-4357-ab16-db5f1de38a60"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.320820 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-kube-api-access-77trt" (OuterVolumeSpecName: "kube-api-access-77trt") pod "6a78b1b6-f6b5-4357-ab16-db5f1de38a60" (UID: "6a78b1b6-f6b5-4357-ab16-db5f1de38a60"). InnerVolumeSpecName "kube-api-access-77trt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.322667 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-logs" (OuterVolumeSpecName: "logs") pod "6a78b1b6-f6b5-4357-ab16-db5f1de38a60" (UID: "6a78b1b6-f6b5-4357-ab16-db5f1de38a60"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.326769 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/603009e7-da31-49c2-bb29-baad64c52187-kube-api-access-cptwf" (OuterVolumeSpecName: "kube-api-access-cptwf") pod "603009e7-da31-49c2-bb29-baad64c52187" (UID: "603009e7-da31-49c2-bb29-baad64c52187"). InnerVolumeSpecName "kube-api-access-cptwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.328406 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "6a78b1b6-f6b5-4357-ab16-db5f1de38a60" (UID: "6a78b1b6-f6b5-4357-ab16-db5f1de38a60"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.328491 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/603009e7-da31-49c2-bb29-baad64c52187-scripts" (OuterVolumeSpecName: "scripts") pod "603009e7-da31-49c2-bb29-baad64c52187" (UID: "603009e7-da31-49c2-bb29-baad64c52187"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.332557 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-scripts" (OuterVolumeSpecName: "scripts") pod "6a78b1b6-f6b5-4357-ab16-db5f1de38a60" (UID: "6a78b1b6-f6b5-4357-ab16-db5f1de38a60"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.335929 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "603009e7-da31-49c2-bb29-baad64c52187" (UID: "603009e7-da31-49c2-bb29-baad64c52187"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.340813 4930 generic.go:334] "Generic (PLEG): container finished" podID="6a78b1b6-f6b5-4357-ab16-db5f1de38a60" containerID="95a3d01d1e347fd8fea516b0f1338e51809127aff70212e4294fc2b4147f1202" exitCode=0 Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.340839 4930 generic.go:334] "Generic (PLEG): container finished" podID="6a78b1b6-f6b5-4357-ab16-db5f1de38a60" containerID="3b37c1600f39b9db1bf134fc8659ad96237e2fdaafceca30b1fa4108cc4656dc" exitCode=143 Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.340879 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6a78b1b6-f6b5-4357-ab16-db5f1de38a60","Type":"ContainerDied","Data":"95a3d01d1e347fd8fea516b0f1338e51809127aff70212e4294fc2b4147f1202"} Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.340905 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6a78b1b6-f6b5-4357-ab16-db5f1de38a60","Type":"ContainerDied","Data":"3b37c1600f39b9db1bf134fc8659ad96237e2fdaafceca30b1fa4108cc4656dc"} Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.340914 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6a78b1b6-f6b5-4357-ab16-db5f1de38a60","Type":"ContainerDied","Data":"13c6e1230dd18e3318d7800e2722860a0bfea15424eb72ab348cd3284a183ef2"} Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.340930 4930 scope.go:117] "RemoveContainer" containerID="95a3d01d1e347fd8fea516b0f1338e51809127aff70212e4294fc2b4147f1202" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.341041 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.345419 4930 generic.go:334] "Generic (PLEG): container finished" podID="603009e7-da31-49c2-bb29-baad64c52187" containerID="c5377275cddfafcac70ce2d221dafc693acea843259da8f5dbf4f00e77ef590b" exitCode=0 Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.345461 4930 generic.go:334] "Generic (PLEG): container finished" podID="603009e7-da31-49c2-bb29-baad64c52187" containerID="0e64ee6cdf6980869dee57a9ad832e0d620e799cfee5c893051f33b28d44b5a3" exitCode=143 Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.346590 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.347486 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"603009e7-da31-49c2-bb29-baad64c52187","Type":"ContainerDied","Data":"c5377275cddfafcac70ce2d221dafc693acea843259da8f5dbf4f00e77ef590b"} Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.347526 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"603009e7-da31-49c2-bb29-baad64c52187","Type":"ContainerDied","Data":"0e64ee6cdf6980869dee57a9ad832e0d620e799cfee5c893051f33b28d44b5a3"} Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.347570 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"603009e7-da31-49c2-bb29-baad64c52187","Type":"ContainerDied","Data":"92eec786802858f170a8dc65a07aba1caf94f77c4d67ebcff59af9ed230313da"} Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.366157 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a78b1b6-f6b5-4357-ab16-db5f1de38a60" (UID: "6a78b1b6-f6b5-4357-ab16-db5f1de38a60"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.366331 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/603009e7-da31-49c2-bb29-baad64c52187-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "603009e7-da31-49c2-bb29-baad64c52187" (UID: "603009e7-da31-49c2-bb29-baad64c52187"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.381235 4930 scope.go:117] "RemoveContainer" containerID="3b37c1600f39b9db1bf134fc8659ad96237e2fdaafceca30b1fa4108cc4656dc" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.396069 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-config-data" (OuterVolumeSpecName: "config-data") pod "6a78b1b6-f6b5-4357-ab16-db5f1de38a60" (UID: "6a78b1b6-f6b5-4357-ab16-db5f1de38a60"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.417848 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/603009e7-da31-49c2-bb29-baad64c52187-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.417975 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77trt\" (UniqueName: \"kubernetes.io/projected/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-kube-api-access-77trt\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.418013 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.418031 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.418048 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cptwf\" (UniqueName: \"kubernetes.io/projected/603009e7-da31-49c2-bb29-baad64c52187-kube-api-access-cptwf\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.418063 4930 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/603009e7-da31-49c2-bb29-baad64c52187-logs\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.418108 4930 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.418127 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/603009e7-da31-49c2-bb29-baad64c52187-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.418144 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.418164 4930 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.418178 4930 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-logs\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.418190 4930 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/603009e7-da31-49c2-bb29-baad64c52187-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.418202 4930 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a78b1b6-f6b5-4357-ab16-db5f1de38a60-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.431576 4930 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/603009e7-da31-49c2-bb29-baad64c52187-config-data" (OuterVolumeSpecName: "config-data") pod "603009e7-da31-49c2-bb29-baad64c52187" (UID: "603009e7-da31-49c2-bb29-baad64c52187"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.442856 4930 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.460521 4930 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.517190 4930 scope.go:117] "RemoveContainer" containerID="95a3d01d1e347fd8fea516b0f1338e51809127aff70212e4294fc2b4147f1202" Oct 12 05:59:48 crc kubenswrapper[4930]: E1012 05:59:48.517595 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95a3d01d1e347fd8fea516b0f1338e51809127aff70212e4294fc2b4147f1202\": container with ID starting with 95a3d01d1e347fd8fea516b0f1338e51809127aff70212e4294fc2b4147f1202 not found: ID does not exist" containerID="95a3d01d1e347fd8fea516b0f1338e51809127aff70212e4294fc2b4147f1202" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.517625 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95a3d01d1e347fd8fea516b0f1338e51809127aff70212e4294fc2b4147f1202"} err="failed to get container status \"95a3d01d1e347fd8fea516b0f1338e51809127aff70212e4294fc2b4147f1202\": rpc error: code = NotFound desc = could not find container \"95a3d01d1e347fd8fea516b0f1338e51809127aff70212e4294fc2b4147f1202\": container with ID starting with 95a3d01d1e347fd8fea516b0f1338e51809127aff70212e4294fc2b4147f1202 not found: ID does not exist" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.517647 4930 scope.go:117] "RemoveContainer" containerID="3b37c1600f39b9db1bf134fc8659ad96237e2fdaafceca30b1fa4108cc4656dc" Oct 12 05:59:48 crc kubenswrapper[4930]: E1012 05:59:48.518165 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b37c1600f39b9db1bf134fc8659ad96237e2fdaafceca30b1fa4108cc4656dc\": container with ID starting with 3b37c1600f39b9db1bf134fc8659ad96237e2fdaafceca30b1fa4108cc4656dc not found: ID does not exist" containerID="3b37c1600f39b9db1bf134fc8659ad96237e2fdaafceca30b1fa4108cc4656dc" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.518189 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b37c1600f39b9db1bf134fc8659ad96237e2fdaafceca30b1fa4108cc4656dc"} err="failed to get container status \"3b37c1600f39b9db1bf134fc8659ad96237e2fdaafceca30b1fa4108cc4656dc\": rpc error: code = NotFound desc = could not find container \"3b37c1600f39b9db1bf134fc8659ad96237e2fdaafceca30b1fa4108cc4656dc\": container with ID starting with 3b37c1600f39b9db1bf134fc8659ad96237e2fdaafceca30b1fa4108cc4656dc not found: ID does not exist" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.518207 4930 scope.go:117] "RemoveContainer" containerID="95a3d01d1e347fd8fea516b0f1338e51809127aff70212e4294fc2b4147f1202" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.518534 4930 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95a3d01d1e347fd8fea516b0f1338e51809127aff70212e4294fc2b4147f1202"} err="failed to get container status \"95a3d01d1e347fd8fea516b0f1338e51809127aff70212e4294fc2b4147f1202\": rpc error: code = NotFound desc = could not find container \"95a3d01d1e347fd8fea516b0f1338e51809127aff70212e4294fc2b4147f1202\": container with ID starting with 95a3d01d1e347fd8fea516b0f1338e51809127aff70212e4294fc2b4147f1202 not found: ID does not exist" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.518553 4930 scope.go:117] "RemoveContainer" containerID="3b37c1600f39b9db1bf134fc8659ad96237e2fdaafceca30b1fa4108cc4656dc" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.518864 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b37c1600f39b9db1bf134fc8659ad96237e2fdaafceca30b1fa4108cc4656dc"} err="failed to get container status \"3b37c1600f39b9db1bf134fc8659ad96237e2fdaafceca30b1fa4108cc4656dc\": rpc error: code = NotFound desc = could not find container \"3b37c1600f39b9db1bf134fc8659ad96237e2fdaafceca30b1fa4108cc4656dc\": container with ID starting with 3b37c1600f39b9db1bf134fc8659ad96237e2fdaafceca30b1fa4108cc4656dc not found: ID does not exist" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.518886 4930 scope.go:117] "RemoveContainer" containerID="c5377275cddfafcac70ce2d221dafc693acea843259da8f5dbf4f00e77ef590b" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.519460 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/603009e7-da31-49c2-bb29-baad64c52187-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.519480 4930 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.519490 4930 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.542400 4930 scope.go:117] "RemoveContainer" containerID="0e64ee6cdf6980869dee57a9ad832e0d620e799cfee5c893051f33b28d44b5a3" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.566442 4930 scope.go:117] "RemoveContainer" containerID="c5377275cddfafcac70ce2d221dafc693acea843259da8f5dbf4f00e77ef590b" Oct 12 05:59:48 crc kubenswrapper[4930]: E1012 05:59:48.567148 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5377275cddfafcac70ce2d221dafc693acea843259da8f5dbf4f00e77ef590b\": container with ID starting with c5377275cddfafcac70ce2d221dafc693acea843259da8f5dbf4f00e77ef590b not found: ID does not exist" containerID="c5377275cddfafcac70ce2d221dafc693acea843259da8f5dbf4f00e77ef590b" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.567193 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5377275cddfafcac70ce2d221dafc693acea843259da8f5dbf4f00e77ef590b"} err="failed to get container status \"c5377275cddfafcac70ce2d221dafc693acea843259da8f5dbf4f00e77ef590b\": rpc error: code = NotFound desc = could not find container \"c5377275cddfafcac70ce2d221dafc693acea843259da8f5dbf4f00e77ef590b\": container with ID starting with 
c5377275cddfafcac70ce2d221dafc693acea843259da8f5dbf4f00e77ef590b not found: ID does not exist" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.567219 4930 scope.go:117] "RemoveContainer" containerID="0e64ee6cdf6980869dee57a9ad832e0d620e799cfee5c893051f33b28d44b5a3" Oct 12 05:59:48 crc kubenswrapper[4930]: E1012 05:59:48.567601 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e64ee6cdf6980869dee57a9ad832e0d620e799cfee5c893051f33b28d44b5a3\": container with ID starting with 0e64ee6cdf6980869dee57a9ad832e0d620e799cfee5c893051f33b28d44b5a3 not found: ID does not exist" containerID="0e64ee6cdf6980869dee57a9ad832e0d620e799cfee5c893051f33b28d44b5a3" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.567716 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e64ee6cdf6980869dee57a9ad832e0d620e799cfee5c893051f33b28d44b5a3"} err="failed to get container status \"0e64ee6cdf6980869dee57a9ad832e0d620e799cfee5c893051f33b28d44b5a3\": rpc error: code = NotFound desc = could not find container \"0e64ee6cdf6980869dee57a9ad832e0d620e799cfee5c893051f33b28d44b5a3\": container with ID starting with 0e64ee6cdf6980869dee57a9ad832e0d620e799cfee5c893051f33b28d44b5a3 not found: ID does not exist" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.567781 4930 scope.go:117] "RemoveContainer" containerID="c5377275cddfafcac70ce2d221dafc693acea843259da8f5dbf4f00e77ef590b" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.568158 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5377275cddfafcac70ce2d221dafc693acea843259da8f5dbf4f00e77ef590b"} err="failed to get container status \"c5377275cddfafcac70ce2d221dafc693acea843259da8f5dbf4f00e77ef590b\": rpc error: code = NotFound desc = could not find container \"c5377275cddfafcac70ce2d221dafc693acea843259da8f5dbf4f00e77ef590b\": container with ID starting with c5377275cddfafcac70ce2d221dafc693acea843259da8f5dbf4f00e77ef590b not found: ID does not exist" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.568184 4930 scope.go:117] "RemoveContainer" containerID="0e64ee6cdf6980869dee57a9ad832e0d620e799cfee5c893051f33b28d44b5a3" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.568441 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e64ee6cdf6980869dee57a9ad832e0d620e799cfee5c893051f33b28d44b5a3"} err="failed to get container status \"0e64ee6cdf6980869dee57a9ad832e0d620e799cfee5c893051f33b28d44b5a3\": rpc error: code = NotFound desc = could not find container \"0e64ee6cdf6980869dee57a9ad832e0d620e799cfee5c893051f33b28d44b5a3\": container with ID starting with 0e64ee6cdf6980869dee57a9ad832e0d620e799cfee5c893051f33b28d44b5a3 not found: ID does not exist" Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.691304 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.713756 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.739353 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 05:59:48 crc kubenswrapper[4930]: E1012 05:59:48.739765 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a78b1b6-f6b5-4357-ab16-db5f1de38a60" 
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.739782 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a78b1b6-f6b5-4357-ab16-db5f1de38a60" containerName="glance-httpd"
Oct 12 05:59:48 crc kubenswrapper[4930]: E1012 05:59:48.739796 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dd57cc3-6793-42f9-b938-620f968192c3" containerName="horizon"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.739803 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dd57cc3-6793-42f9-b938-620f968192c3" containerName="horizon"
Oct 12 05:59:48 crc kubenswrapper[4930]: E1012 05:59:48.739815 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f549aa26-b902-4497-838b-6b80e635897c" containerName="horizon-log"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.739821 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="f549aa26-b902-4497-838b-6b80e635897c" containerName="horizon-log"
Oct 12 05:59:48 crc kubenswrapper[4930]: E1012 05:59:48.739832 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b69d2329-3396-4e67-9880-940dacef7e56" containerName="horizon"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.739837 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="b69d2329-3396-4e67-9880-940dacef7e56" containerName="horizon"
Oct 12 05:59:48 crc kubenswrapper[4930]: E1012 05:59:48.739848 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86141d8f-1f9e-469a-9023-63d5a56ad33b" containerName="barbican-api-log"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.739854 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="86141d8f-1f9e-469a-9023-63d5a56ad33b" containerName="barbican-api-log"
Oct 12 05:59:48 crc kubenswrapper[4930]: E1012 05:59:48.739864 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a78b1b6-f6b5-4357-ab16-db5f1de38a60" containerName="glance-log"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.739869 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a78b1b6-f6b5-4357-ab16-db5f1de38a60" containerName="glance-log"
Oct 12 05:59:48 crc kubenswrapper[4930]: E1012 05:59:48.739879 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9000baf-8bd7-4d76-a2c8-b98ad280728d" containerName="neutron-api"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.739886 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9000baf-8bd7-4d76-a2c8-b98ad280728d" containerName="neutron-api"
Oct 12 05:59:48 crc kubenswrapper[4930]: E1012 05:59:48.739895 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b69d2329-3396-4e67-9880-940dacef7e56" containerName="horizon-log"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.739901 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="b69d2329-3396-4e67-9880-940dacef7e56" containerName="horizon-log"
Oct 12 05:59:48 crc kubenswrapper[4930]: E1012 05:59:48.739916 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f549aa26-b902-4497-838b-6b80e635897c" containerName="horizon"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.739922 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="f549aa26-b902-4497-838b-6b80e635897c" containerName="horizon"
Oct 12 05:59:48 crc kubenswrapper[4930]: E1012 05:59:48.739930 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86141d8f-1f9e-469a-9023-63d5a56ad33b" containerName="barbican-api"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.739937 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="86141d8f-1f9e-469a-9023-63d5a56ad33b" containerName="barbican-api"
Oct 12 05:59:48 crc kubenswrapper[4930]: E1012 05:59:48.739946 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9000baf-8bd7-4d76-a2c8-b98ad280728d" containerName="neutron-httpd"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.739952 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9000baf-8bd7-4d76-a2c8-b98ad280728d" containerName="neutron-httpd"
Oct 12 05:59:48 crc kubenswrapper[4930]: E1012 05:59:48.739967 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="603009e7-da31-49c2-bb29-baad64c52187" containerName="glance-httpd"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.739973 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="603009e7-da31-49c2-bb29-baad64c52187" containerName="glance-httpd"
Oct 12 05:59:48 crc kubenswrapper[4930]: E1012 05:59:48.739982 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="603009e7-da31-49c2-bb29-baad64c52187" containerName="glance-log"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.739988 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="603009e7-da31-49c2-bb29-baad64c52187" containerName="glance-log"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.740151 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9000baf-8bd7-4d76-a2c8-b98ad280728d" containerName="neutron-api"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.740167 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a78b1b6-f6b5-4357-ab16-db5f1de38a60" containerName="glance-log"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.740177 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9000baf-8bd7-4d76-a2c8-b98ad280728d" containerName="neutron-httpd"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.740189 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dd57cc3-6793-42f9-b938-620f968192c3" containerName="horizon"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.740200 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="603009e7-da31-49c2-bb29-baad64c52187" containerName="glance-httpd"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.740209 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="86141d8f-1f9e-469a-9023-63d5a56ad33b" containerName="barbican-api"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.740219 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="86141d8f-1f9e-469a-9023-63d5a56ad33b" containerName="barbican-api-log"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.740231 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="b69d2329-3396-4e67-9880-940dacef7e56" containerName="horizon-log"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.740239 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="603009e7-da31-49c2-bb29-baad64c52187" containerName="glance-log"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.740247 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a78b1b6-f6b5-4357-ab16-db5f1de38a60" containerName="glance-httpd"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.740254 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="f549aa26-b902-4497-838b-6b80e635897c" containerName="horizon"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.740264 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="f549aa26-b902-4497-838b-6b80e635897c" containerName="horizon-log"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.740276 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="b69d2329-3396-4e67-9880-940dacef7e56" containerName="horizon"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.741219 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.747956 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.752933 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.753141 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.753447 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6jg95"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.753709 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.760858 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.768160 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.776095 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.779164 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
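
The long run of cpu_manager / memory_manager "RemoveStaleState" pairs above is the kubelet's resource managers dropping per-container CPU and memory assignments for pods that no longer exist, each keyed by (podUID, containerName). A minimal sketch of that bookkeeping, assuming a simple in-memory map (the types are illustrative, not the kubelet's real state structures):

```go
package main

import "fmt"

// key mirrors how the log identifies an assignment: podUID + containerName.
type key struct{ podUID, containerName string }

// removeStaleState drops assignments whose pod is no longer active.
func removeStaleState(assignments map[key]string, active map[string]bool) {
	for k := range assignments {
		if !active[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n",
				k.podUID, k.containerName)
			delete(assignments, k)
		}
	}
}

func main() {
	assignments := map[key]string{
		{"6a78b1b6", "glance-httpd"}: "cpuset 0-1",
		{"9dd57cc3", "horizon"}:      "cpuset 2",
	}
	// None of the old pods are active anymore, so both entries go.
	removeStaleState(assignments, map[string]bool{})
	fmt.Println("remaining:", len(assignments))
}
```
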
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.782889 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.783126 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.785977 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.827912 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d297fbfc-df5c-4365-a668-d04f87df845d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.827949 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d297fbfc-df5c-4365-a668-d04f87df845d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.827971 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d297fbfc-df5c-4365-a668-d04f87df845d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.828009 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.828337 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d297fbfc-df5c-4365-a668-d04f87df845d-logs\") pod \"glance-default-external-api-0\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.828455 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d297fbfc-df5c-4365-a668-d04f87df845d-scripts\") pod \"glance-default-external-api-0\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.828680 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l2fw\" (UniqueName: \"kubernetes.io/projected/d297fbfc-df5c-4365-a668-d04f87df845d-kube-api-access-6l2fw\") pod \"glance-default-external-api-0\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.828708 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d297fbfc-df5c-4365-a668-d04f87df845d-config-data\") pod \"glance-default-external-api-0\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.930153 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b83c8650-6361-42bb-8112-0449e6722a3f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.930446 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b83c8650-6361-42bb-8112-0449e6722a3f-logs\") pod \"glance-default-internal-api-0\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.930511 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l2fw\" (UniqueName: \"kubernetes.io/projected/d297fbfc-df5c-4365-a668-d04f87df845d-kube-api-access-6l2fw\") pod \"glance-default-external-api-0\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.930548 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d297fbfc-df5c-4365-a668-d04f87df845d-config-data\") pod \"glance-default-external-api-0\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.930577 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b83c8650-6361-42bb-8112-0449e6722a3f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.930804 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b83c8650-6361-42bb-8112-0449e6722a3f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.931068 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crxzw\" (UniqueName: \"kubernetes.io/projected/b83c8650-6361-42bb-8112-0449e6722a3f-kube-api-access-crxzw\") pod \"glance-default-internal-api-0\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.931132 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d297fbfc-df5c-4365-a668-d04f87df845d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.931171 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d297fbfc-df5c-4365-a668-d04f87df845d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.931209 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b83c8650-6361-42bb-8112-0449e6722a3f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.931231 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d297fbfc-df5c-4365-a668-d04f87df845d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.931263 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.931283 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.931357 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d297fbfc-df5c-4365-a668-d04f87df845d-logs\") pod \"glance-default-external-api-0\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.931427 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b83c8650-6361-42bb-8112-0449e6722a3f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.931502 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d297fbfc-df5c-4365-a668-d04f87df845d-scripts\") pod \"glance-default-external-api-0\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.931601 4930 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.931839 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d297fbfc-df5c-4365-a668-d04f87df845d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.932588 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d297fbfc-df5c-4365-a668-d04f87df845d-logs\") pod \"glance-default-external-api-0\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.934998 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d297fbfc-df5c-4365-a668-d04f87df845d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.935269 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d297fbfc-df5c-4365-a668-d04f87df845d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.935868 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d297fbfc-df5c-4365-a668-d04f87df845d-config-data\") pod \"glance-default-external-api-0\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.944025 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d297fbfc-df5c-4365-a668-d04f87df845d-scripts\") pod \"glance-default-external-api-0\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.946243 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l2fw\" (UniqueName: \"kubernetes.io/projected/d297fbfc-df5c-4365-a668-d04f87df845d-kube-api-access-6l2fw\") pod \"glance-default-external-api-0\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:48 crc kubenswrapper[4930]: I1012 05:59:48.972569 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " pod="openstack/glance-default-external-api-0"
Oct 12 05:59:49 crc kubenswrapper[4930]: I1012 05:59:49.033460 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crxzw\" (UniqueName: \"kubernetes.io/projected/b83c8650-6361-42bb-8112-0449e6722a3f-kube-api-access-crxzw\") pod \"glance-default-internal-api-0\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:49 crc kubenswrapper[4930]: I1012 05:59:49.033513 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b83c8650-6361-42bb-8112-0449e6722a3f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:49 crc kubenswrapper[4930]: I1012 05:59:49.033558 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:49 crc kubenswrapper[4930]: I1012 05:59:49.033607 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b83c8650-6361-42bb-8112-0449e6722a3f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:49 crc kubenswrapper[4930]: I1012 05:59:49.033644 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b83c8650-6361-42bb-8112-0449e6722a3f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:49 crc kubenswrapper[4930]: I1012 05:59:49.033691 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b83c8650-6361-42bb-8112-0449e6722a3f-logs\") pod \"glance-default-internal-api-0\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:49 crc kubenswrapper[4930]: I1012 05:59:49.033760 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b83c8650-6361-42bb-8112-0449e6722a3f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:49 crc kubenswrapper[4930]: I1012 05:59:49.033803 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b83c8650-6361-42bb-8112-0449e6722a3f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:49 crc kubenswrapper[4930]: I1012 05:59:49.035187 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b83c8650-6361-42bb-8112-0449e6722a3f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:49 crc kubenswrapper[4930]: I1012 05:59:49.035253 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b83c8650-6361-42bb-8112-0449e6722a3f-logs\") pod \"glance-default-internal-api-0\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:49 crc kubenswrapper[4930]: I1012 05:59:49.036040 4930 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:49 crc kubenswrapper[4930]: I1012 05:59:49.038056 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b83c8650-6361-42bb-8112-0449e6722a3f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:49 crc kubenswrapper[4930]: I1012 05:59:49.039151 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b83c8650-6361-42bb-8112-0449e6722a3f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:49 crc kubenswrapper[4930]: I1012 05:59:49.047086 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b83c8650-6361-42bb-8112-0449e6722a3f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:49 crc kubenswrapper[4930]: I1012 05:59:49.047473 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b83c8650-6361-42bb-8112-0449e6722a3f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:49 crc kubenswrapper[4930]: I1012 05:59:49.055681 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crxzw\" (UniqueName: \"kubernetes.io/projected/b83c8650-6361-42bb-8112-0449e6722a3f-kube-api-access-crxzw\") pod \"glance-default-internal-api-0\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:49 crc kubenswrapper[4930]: I1012 05:59:49.083945 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 12 05:59:49 crc kubenswrapper[4930]: I1012 05:59:49.098118 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " pod="openstack/glance-default-internal-api-0"
Oct 12 05:59:49 crc kubenswrapper[4930]: I1012 05:59:49.111921 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
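
The volume lines above follow a fixed per-volume progression: VerifyControllerAttachedVolume confirms the volume is attached, MountVolume.MountDevice prepares the device-level mount (only the local-storage volumes report a device mount path such as /mnt/openstack/pv02), and MountVolume.SetUp makes it visible inside the pod. A compact Go sketch of that state machine, under the assumption of a simple enum (illustrative only, not the reconciler's real types):

```go
package main

import "fmt"

type volumeState int

const (
	attached      volumeState = iota // VerifyControllerAttachedVolume succeeded
	deviceMounted                    // MountVolume.MountDevice succeeded (device-backed volumes)
	setUp                            // MountVolume.SetUp succeeded; visible inside the pod
)

type volume struct {
	name      string
	hasDevice bool // local/block volumes need the MountDevice step
	state     volumeState
}

// reconcile advances each volume one step toward setUp, mirroring the
// ordering visible in the log entries above.
func reconcile(vols []*volume) {
	for _, v := range vols {
		switch {
		case v.state == attached && v.hasDevice:
			v.state = deviceMounted
			fmt.Printf("MountVolume.MountDevice succeeded for volume %q\n", v.name)
		case v.state == attached || v.state == deviceMounted:
			v.state = setUp
			fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", v.name)
		}
	}
}

func main() {
	vols := []*volume{
		{name: "local-storage02-crc", hasDevice: true},
		{name: "config-data"},
	}
	reconcile(vols) // device mount for the local volume, SetUp for the secret
	reconcile(vols) // a second pass finishes the local volume
}
```
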
Oct 12 05:59:49 crc kubenswrapper[4930]: I1012 05:59:49.358913 4930 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 12 05:59:49 crc kubenswrapper[4930]: I1012 05:59:49.523941 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Oct 12 05:59:49 crc kubenswrapper[4930]: I1012 05:59:49.699039 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 12 05:59:49 crc kubenswrapper[4930]: I1012 05:59:49.792927 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 12 05:59:49 crc kubenswrapper[4930]: I1012 05:59:49.878545 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Oct 12 05:59:50 crc kubenswrapper[4930]: I1012 05:59:50.152258 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="603009e7-da31-49c2-bb29-baad64c52187" path="/var/lib/kubelet/pods/603009e7-da31-49c2-bb29-baad64c52187/volumes"
Oct 12 05:59:50 crc kubenswrapper[4930]: I1012 05:59:50.154205 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a78b1b6-f6b5-4357-ab16-db5f1de38a60" path="/var/lib/kubelet/pods/6a78b1b6-f6b5-4357-ab16-db5f1de38a60/volumes"
Oct 12 05:59:50 crc kubenswrapper[4930]: I1012 05:59:50.399655 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d297fbfc-df5c-4365-a668-d04f87df845d","Type":"ContainerStarted","Data":"fe7ce099e3cf7c6ea4db68a65afee617377f91c2b135876a6ca6d32218d66f12"}
Oct 12 05:59:50 crc kubenswrapper[4930]: I1012 05:59:50.401824 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b83c8650-6361-42bb-8112-0449e6722a3f","Type":"ContainerStarted","Data":"21e3dd3f6fc225d827017c9cf32eac22fac4ce103a374731010c0c257c796bb9"}
Oct 12 05:59:50 crc kubenswrapper[4930]: I1012 05:59:50.714699 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-544d94f45b-79l8m"
Oct 12 05:59:50 crc kubenswrapper[4930]: I1012 05:59:50.790225 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-544d94f45b-79l8m"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.189562 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6778cd8bb8-9zhz5" podUID="30049dfb-04f2-455c-a949-08bd6ff892d0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.161:8443: connect: connection refused"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.277518 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.392321 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-combined-ca-bundle\") pod \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") "
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.392392 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-config-data\") pod \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") "
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.392497 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-scripts\") pod \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") "
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.392523 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-run-httpd\") pod \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") "
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.392651 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsmjv\" (UniqueName: \"kubernetes.io/projected/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-kube-api-access-fsmjv\") pod \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") "
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.392676 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-sg-core-conf-yaml\") pod \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") "
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.392699 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-log-httpd\") pod \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\" (UID: \"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed\") "
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.393517 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7b25d9dd-3bce-4f8d-9219-1d6ce75878ed" (UID: "7b25d9dd-3bce-4f8d-9219-1d6ce75878ed"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.394312 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7b25d9dd-3bce-4f8d-9219-1d6ce75878ed" (UID: "7b25d9dd-3bce-4f8d-9219-1d6ce75878ed"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.398948 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-scripts" (OuterVolumeSpecName: "scripts") pod "7b25d9dd-3bce-4f8d-9219-1d6ce75878ed" (UID: "7b25d9dd-3bce-4f8d-9219-1d6ce75878ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.399427 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-kube-api-access-fsmjv" (OuterVolumeSpecName: "kube-api-access-fsmjv") pod "7b25d9dd-3bce-4f8d-9219-1d6ce75878ed" (UID: "7b25d9dd-3bce-4f8d-9219-1d6ce75878ed"). InnerVolumeSpecName "kube-api-access-fsmjv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.413954 4930 generic.go:334] "Generic (PLEG): container finished" podID="7b25d9dd-3bce-4f8d-9219-1d6ce75878ed" containerID="42eb9ff39af53c03159df9698802c263efb99e17ab668023f32c7e55890df52f" exitCode=0
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.414010 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.414103 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed","Type":"ContainerDied","Data":"42eb9ff39af53c03159df9698802c263efb99e17ab668023f32c7e55890df52f"}
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.414172 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b25d9dd-3bce-4f8d-9219-1d6ce75878ed","Type":"ContainerDied","Data":"98f743e3fd40ba0726cdaec6f7dbaf40edbce7e61986a664ab469afdd890d255"}
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.414194 4930 scope.go:117] "RemoveContainer" containerID="9611075a0346ebe159fbb360e141dcbb57042d642d566c6fd1855c99d9d2b8e7"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.422978 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d297fbfc-df5c-4365-a668-d04f87df845d","Type":"ContainerStarted","Data":"4597264a5a1f2045f2dfd2c65470cdc4a45efec1958c0f9c123bc342b19d8b98"}
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.423010 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d297fbfc-df5c-4365-a668-d04f87df845d","Type":"ContainerStarted","Data":"a607f1cebf14a923275c8ceab7752588186b342d8877d40c3c9276d90f221c5e"}
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.424947 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7b25d9dd-3bce-4f8d-9219-1d6ce75878ed" (UID: "7b25d9dd-3bce-4f8d-9219-1d6ce75878ed"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.428296 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b83c8650-6361-42bb-8112-0449e6722a3f","Type":"ContainerStarted","Data":"ae9ee27bdd88c79f2fa0d15b36520ba143be88fc092cd54c9e17f9cf8674b5e8"}
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.446192 4930 scope.go:117] "RemoveContainer" containerID="3cb02172b934592377e139a2782717713272e8fb6014aad096c3e8ffee26c3a3"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.453541 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.453524571 podStartE2EDuration="3.453524571s" podCreationTimestamp="2025-10-12 05:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:59:51.451010928 +0000 UTC m=+1123.993112693" watchObservedRunningTime="2025-10-12 05:59:51.453524571 +0000 UTC m=+1123.995626336"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.475801 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b25d9dd-3bce-4f8d-9219-1d6ce75878ed" (UID: "7b25d9dd-3bce-4f8d-9219-1d6ce75878ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.490891 4930 scope.go:117] "RemoveContainer" containerID="42eb9ff39af53c03159df9698802c263efb99e17ab668023f32c7e55890df52f"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.493631 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.493618719 podStartE2EDuration="3.493618719s" podCreationTimestamp="2025-10-12 05:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:59:51.478132263 +0000 UTC m=+1124.020234028" watchObservedRunningTime="2025-10-12 05:59:51.493618719 +0000 UTC m=+1124.035720474"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.495320 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.495339 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-scripts\") on node \"crc\" DevicePath \"\""
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.495348 4930 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.495356 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsmjv\" (UniqueName: \"kubernetes.io/projected/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-kube-api-access-fsmjv\") on node \"crc\" DevicePath \"\""
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.495367 4930 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.495375 4930 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.510238 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-config-data" (OuterVolumeSpecName: "config-data") pod "7b25d9dd-3bce-4f8d-9219-1d6ce75878ed" (UID: "7b25d9dd-3bce-4f8d-9219-1d6ce75878ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.516559 4930 scope.go:117] "RemoveContainer" containerID="9611075a0346ebe159fbb360e141dcbb57042d642d566c6fd1855c99d9d2b8e7"
Oct 12 05:59:51 crc kubenswrapper[4930]: E1012 05:59:51.517131 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9611075a0346ebe159fbb360e141dcbb57042d642d566c6fd1855c99d9d2b8e7\": container with ID starting with 9611075a0346ebe159fbb360e141dcbb57042d642d566c6fd1855c99d9d2b8e7 not found: ID does not exist" containerID="9611075a0346ebe159fbb360e141dcbb57042d642d566c6fd1855c99d9d2b8e7"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.517158 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9611075a0346ebe159fbb360e141dcbb57042d642d566c6fd1855c99d9d2b8e7"} err="failed to get container status \"9611075a0346ebe159fbb360e141dcbb57042d642d566c6fd1855c99d9d2b8e7\": rpc error: code = NotFound desc = could not find container \"9611075a0346ebe159fbb360e141dcbb57042d642d566c6fd1855c99d9d2b8e7\": container with ID starting with 9611075a0346ebe159fbb360e141dcbb57042d642d566c6fd1855c99d9d2b8e7 not found: ID does not exist"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.517175 4930 scope.go:117] "RemoveContainer" containerID="3cb02172b934592377e139a2782717713272e8fb6014aad096c3e8ffee26c3a3"
Oct 12 05:59:51 crc kubenswrapper[4930]: E1012 05:59:51.517382 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cb02172b934592377e139a2782717713272e8fb6014aad096c3e8ffee26c3a3\": container with ID starting with 3cb02172b934592377e139a2782717713272e8fb6014aad096c3e8ffee26c3a3 not found: ID does not exist" containerID="3cb02172b934592377e139a2782717713272e8fb6014aad096c3e8ffee26c3a3"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.517401 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cb02172b934592377e139a2782717713272e8fb6014aad096c3e8ffee26c3a3"} err="failed to get container status \"3cb02172b934592377e139a2782717713272e8fb6014aad096c3e8ffee26c3a3\": rpc error: code = NotFound desc = could not find container \"3cb02172b934592377e139a2782717713272e8fb6014aad096c3e8ffee26c3a3\": container with ID starting with 3cb02172b934592377e139a2782717713272e8fb6014aad096c3e8ffee26c3a3 not found: ID does not exist"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.517413 4930 scope.go:117] "RemoveContainer" containerID="42eb9ff39af53c03159df9698802c263efb99e17ab668023f32c7e55890df52f"
Oct 12 05:59:51 crc kubenswrapper[4930]: E1012 05:59:51.517567 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42eb9ff39af53c03159df9698802c263efb99e17ab668023f32c7e55890df52f\": container with ID starting with 42eb9ff39af53c03159df9698802c263efb99e17ab668023f32c7e55890df52f not found: ID does not exist" containerID="42eb9ff39af53c03159df9698802c263efb99e17ab668023f32c7e55890df52f"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.517586 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42eb9ff39af53c03159df9698802c263efb99e17ab668023f32c7e55890df52f"} err="failed to get container status \"42eb9ff39af53c03159df9698802c263efb99e17ab668023f32c7e55890df52f\": rpc error: code = NotFound desc = could not find container \"42eb9ff39af53c03159df9698802c263efb99e17ab668023f32c7e55890df52f\": container with ID starting with 42eb9ff39af53c03159df9698802c263efb99e17ab668023f32c7e55890df52f not found: ID does not exist"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.596788 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed-config-data\") on node \"crc\" DevicePath \"\""
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.783656 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.795989 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.807072 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 12 05:59:51 crc kubenswrapper[4930]: E1012 05:59:51.807553 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b25d9dd-3bce-4f8d-9219-1d6ce75878ed" containerName="sg-core"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.807567 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b25d9dd-3bce-4f8d-9219-1d6ce75878ed" containerName="sg-core"
Oct 12 05:59:51 crc kubenswrapper[4930]: E1012 05:59:51.807583 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b25d9dd-3bce-4f8d-9219-1d6ce75878ed" containerName="proxy-httpd"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.807591 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b25d9dd-3bce-4f8d-9219-1d6ce75878ed" containerName="proxy-httpd"
Oct 12 05:59:51 crc kubenswrapper[4930]: E1012 05:59:51.807623 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b25d9dd-3bce-4f8d-9219-1d6ce75878ed" containerName="ceilometer-notification-agent"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.807633 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b25d9dd-3bce-4f8d-9219-1d6ce75878ed" containerName="ceilometer-notification-agent"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.807871 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b25d9dd-3bce-4f8d-9219-1d6ce75878ed" containerName="ceilometer-notification-agent"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.807895 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b25d9dd-3bce-4f8d-9219-1d6ce75878ed" containerName="proxy-httpd"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.807920 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b25d9dd-3bce-4f8d-9219-1d6ce75878ed" containerName="sg-core"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.818975 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
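
The "SyncLoop (PLEG)" entries above are the pod lifecycle event generator turning differences between runtime snapshots into ContainerStarted/ContainerDied events that drive pod sync. A toy Go sketch of that relist-and-diff idea (the snapshot shape and the shortened container IDs are invented for illustration):

```go
package main

import "fmt"

// diffEvents compares two snapshots of a pod's running container IDs and
// emits PLEG-style lifecycle events, mirroring the relist-and-diff idea.
func diffEvents(prev, curr map[string]bool) []string {
	var events []string
	for id := range curr {
		if !prev[id] {
			events = append(events, "ContainerStarted "+id)
		}
	}
	for id := range prev {
		if !curr[id] {
			events = append(events, "ContainerDied "+id)
		}
	}
	return events
}

func main() {
	// Invented snapshot data: one container died, another started.
	before := map[string]bool{"42eb9ff3": true}
	after := map[string]bool{"6460600d": true}
	for _, e := range diffEvents(before, after) {
		fmt.Println("SyncLoop (PLEG): event:", e)
	}
}
```
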
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.820962 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.821095 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.822176 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.903607 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " pod="openstack/ceilometer-0"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.903661 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-log-httpd\") pod \"ceilometer-0\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " pod="openstack/ceilometer-0"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.903704 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-config-data\") pod \"ceilometer-0\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " pod="openstack/ceilometer-0"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.903752 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " pod="openstack/ceilometer-0"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.903934 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-run-httpd\") pod \"ceilometer-0\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " pod="openstack/ceilometer-0"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.904031 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-scripts\") pod \"ceilometer-0\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " pod="openstack/ceilometer-0"
Oct 12 05:59:51 crc kubenswrapper[4930]: I1012 05:59:51.904114 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nrjc\" (UniqueName: \"kubernetes.io/projected/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-kube-api-access-9nrjc\") pod \"ceilometer-0\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " pod="openstack/ceilometer-0"
Oct 12 05:59:52 crc kubenswrapper[4930]: I1012 05:59:52.005625 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-scripts\") pod \"ceilometer-0\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " pod="openstack/ceilometer-0"
Oct 12 05:59:52 crc kubenswrapper[4930]: I1012 05:59:52.005685 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nrjc\" (UniqueName: \"kubernetes.io/projected/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-kube-api-access-9nrjc\") pod \"ceilometer-0\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " pod="openstack/ceilometer-0"
Oct 12 05:59:52 crc kubenswrapper[4930]: I1012 05:59:52.005772 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " pod="openstack/ceilometer-0"
Oct 12 05:59:52 crc kubenswrapper[4930]: I1012 05:59:52.005793 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-log-httpd\") pod \"ceilometer-0\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " pod="openstack/ceilometer-0"
Oct 12 05:59:52 crc kubenswrapper[4930]: I1012 05:59:52.005834 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-config-data\") pod \"ceilometer-0\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " pod="openstack/ceilometer-0"
Oct 12 05:59:52 crc kubenswrapper[4930]: I1012 05:59:52.005865 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " pod="openstack/ceilometer-0"
Oct 12 05:59:52 crc kubenswrapper[4930]: I1012 05:59:52.006086 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-run-httpd\") pod \"ceilometer-0\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " pod="openstack/ceilometer-0"
Oct 12 05:59:52 crc kubenswrapper[4930]: I1012 05:59:52.006653 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-run-httpd\") pod \"ceilometer-0\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " pod="openstack/ceilometer-0"
Oct 12 05:59:52 crc kubenswrapper[4930]: I1012 05:59:52.007261 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-log-httpd\") pod \"ceilometer-0\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " pod="openstack/ceilometer-0"
Oct 12 05:59:52 crc kubenswrapper[4930]: I1012 05:59:52.010462 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " pod="openstack/ceilometer-0"
Oct 12 05:59:52 crc kubenswrapper[4930]: I1012 05:59:52.012293 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-config-data\") pod \"ceilometer-0\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " pod="openstack/ceilometer-0"
Oct 12 05:59:52 crc kubenswrapper[4930]: I1012 05:59:52.015176 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-scripts\") pod \"ceilometer-0\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " pod="openstack/ceilometer-0"
Oct 12 05:59:52 crc kubenswrapper[4930]: I1012 05:59:52.018306 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Oct 12 05:59:52 crc kubenswrapper[4930]: I1012 05:59:52.022644 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " pod="openstack/ceilometer-0"
Oct 12 05:59:52 crc kubenswrapper[4930]: I1012 05:59:52.029331 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nrjc\" (UniqueName: \"kubernetes.io/projected/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-kube-api-access-9nrjc\") pod \"ceilometer-0\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " pod="openstack/ceilometer-0"
Oct 12 05:59:52 crc kubenswrapper[4930]: I1012 05:59:52.077018 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 12 05:59:52 crc kubenswrapper[4930]: I1012 05:59:52.134650 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 12 05:59:52 crc kubenswrapper[4930]: I1012 05:59:52.154701 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b25d9dd-3bce-4f8d-9219-1d6ce75878ed" path="/var/lib/kubelet/pods/7b25d9dd-3bce-4f8d-9219-1d6ce75878ed/volumes"
Oct 12 05:59:52 crc kubenswrapper[4930]: I1012 05:59:52.440810 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b83c8650-6361-42bb-8112-0449e6722a3f","Type":"ContainerStarted","Data":"15e775931c35b2e9b09753f036a874b4d26c758652a7511acfa8b52dcdbd6e9d"}
Oct 12 05:59:52 crc kubenswrapper[4930]: I1012 05:59:52.442551 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="df83ee2b-e484-4a2a-a619-4e9aadeb19a1" containerName="cinder-scheduler" containerID="cri-o://b4c8b0882b0dbe2c9092bf447bef37dbcc054a544fa3c0d20011e40b2a894d55" gracePeriod=30
Oct 12 05:59:52 crc kubenswrapper[4930]: I1012 05:59:52.443438 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="df83ee2b-e484-4a2a-a619-4e9aadeb19a1" containerName="probe" containerID="cri-o://d0f8c31aa00150932c27bc723fef563ffb08cd71345032e256c141692453941f" gracePeriod=30
Oct 12 05:59:52 crc kubenswrapper[4930]: I1012 05:59:52.584024 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 12 05:59:52 crc kubenswrapper[4930]: W1012 05:59:52.585068 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2b3aab6_b6ca_44a6_92e9_50e80109fb0c.slice/crio-a0320fd981abb5646b2520e45339c6fe9e7c6f2627aaa11ef7a8bdd43eb2534c WatchSource:0}: Error finding container a0320fd981abb5646b2520e45339c6fe9e7c6f2627aaa11ef7a8bdd43eb2534c: Status 404 returned error can't find the container with id a0320fd981abb5646b2520e45339c6fe9e7c6f2627aaa11ef7a8bdd43eb2534c
Oct 12 05:59:53 crc kubenswrapper[4930]: I1012 05:59:53.421654 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75958fc765-rbf85"
Oct 12 05:59:53 crc kubenswrapper[4930]: I1012 05:59:53.461188 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c","Type":"ContainerStarted","Data":"6460600dc2240bf70a42d8ce5d6ad8f89e6a182400f65215964547911db1e3c1"}
Oct 12 05:59:53 crc kubenswrapper[4930]: I1012 05:59:53.461246 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c","Type":"ContainerStarted","Data":"a0320fd981abb5646b2520e45339c6fe9e7c6f2627aaa11ef7a8bdd43eb2534c"}
Oct 12 05:59:53 crc kubenswrapper[4930]: I1012 05:59:53.465072 4930 generic.go:334] "Generic (PLEG): container finished" podID="df83ee2b-e484-4a2a-a619-4e9aadeb19a1" containerID="d0f8c31aa00150932c27bc723fef563ffb08cd71345032e256c141692453941f" exitCode=0
Oct 12 05:59:53 crc kubenswrapper[4930]: I1012 05:59:53.465304 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"df83ee2b-e484-4a2a-a619-4e9aadeb19a1","Type":"ContainerDied","Data":"d0f8c31aa00150932c27bc723fef563ffb08cd71345032e256c141692453941f"}
Oct 12 05:59:53 crc kubenswrapper[4930]: I1012 05:59:53.481287 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7648c6b969-m7fgp"]
Oct 12 05:59:53 crc kubenswrapper[4930]: I1012 05:59:53.487617 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" podUID="39903c9a-2fa6-4fb3-8868-b2c055bdd11c" containerName="dnsmasq-dns" containerID="cri-o://108bb256a26485ad47d064e75e6eebb87735e2755bc7af4a1c379f1e6d10355b" gracePeriod=10
Oct 12 05:59:53 crc kubenswrapper[4930]: I1012 05:59:53.685842 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Oct 12 05:59:53 crc kubenswrapper[4930]: I1012 05:59:53.685889 4930 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Oct 12 05:59:53 crc kubenswrapper[4930]: I1012 05:59:53.686589 4930 scope.go:117] "RemoveContainer" containerID="893b8e00eaeb9c339e5a9528038e3102dd99c50f64a2531fff869300585ed13a"
Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.183326 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7648c6b969-m7fgp"
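
The "Killing container with a grace period" entries above show the standard termination contract: after a stop signal, the runtime gives the container gracePeriod seconds to exit (30 for cinder-scheduler, 10 for dnsmasq-dns) before it is killed outright. A stripped-down Go sketch of that pattern (the process handling is simulated; the real kubelet goes through the CRI, and the names here are illustrative):

```go
package main

import (
	"fmt"
	"time"
)

// killWithGracePeriod signals the container to stop, waits up to gracePeriod
// for it to exit, then forces termination, as in the log entries above.
func killWithGracePeriod(name string, stopped <-chan struct{}, gracePeriod time.Duration) {
	fmt.Printf("Killing container with a grace period containerName=%q gracePeriod=%v\n",
		name, gracePeriod)
	select {
	case <-stopped:
		fmt.Println(name, "exited within the grace period")
	case <-time.After(gracePeriod):
		fmt.Println(name, "did not exit; force-killing")
	}
}

func main() {
	stopped := make(chan struct{})
	go func() { // simulate the container exiting quickly after the stop signal
		time.Sleep(10 * time.Millisecond)
		close(stopped)
	}()
	killWithGracePeriod("cinder-scheduler", stopped, 30*time.Second)
}
```
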
Need to start a new one" pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.263416 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-config\") pod \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\" (UID: \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\") " Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.263517 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-dns-svc\") pod \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\" (UID: \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\") " Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.263563 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-dns-swift-storage-0\") pod \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\" (UID: \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\") " Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.263633 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-ovsdbserver-nb\") pod \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\" (UID: \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\") " Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.263708 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-ovsdbserver-sb\") pod \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\" (UID: \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\") " Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.263801 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6wkn\" (UniqueName: \"kubernetes.io/projected/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-kube-api-access-s6wkn\") pod \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\" (UID: \"39903c9a-2fa6-4fb3-8868-b2c055bdd11c\") " Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.268394 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.282876 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-kube-api-access-s6wkn" (OuterVolumeSpecName: "kube-api-access-s6wkn") pod "39903c9a-2fa6-4fb3-8868-b2c055bdd11c" (UID: "39903c9a-2fa6-4fb3-8868-b2c055bdd11c"). InnerVolumeSpecName "kube-api-access-s6wkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.330646 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "39903c9a-2fa6-4fb3-8868-b2c055bdd11c" (UID: "39903c9a-2fa6-4fb3-8868-b2c055bdd11c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.332214 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-config" (OuterVolumeSpecName: "config") pod "39903c9a-2fa6-4fb3-8868-b2c055bdd11c" (UID: "39903c9a-2fa6-4fb3-8868-b2c055bdd11c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.365993 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-scripts\") pod \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\" (UID: \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\") " Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.366117 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-etc-machine-id\") pod \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\" (UID: \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\") " Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.366216 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqs9b\" (UniqueName: \"kubernetes.io/projected/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-kube-api-access-xqs9b\") pod \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\" (UID: \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\") " Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.366285 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-config-data-custom\") pod \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\" (UID: \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\") " Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.366312 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-combined-ca-bundle\") pod \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\" (UID: \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\") " Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.366338 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-config-data\") pod \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\" (UID: \"df83ee2b-e484-4a2a-a619-4e9aadeb19a1\") " Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.366876 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-config\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.366891 4930 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.366903 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6wkn\" (UniqueName: \"kubernetes.io/projected/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-kube-api-access-s6wkn\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.367455 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "df83ee2b-e484-4a2a-a619-4e9aadeb19a1" (UID: "df83ee2b-e484-4a2a-a619-4e9aadeb19a1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.373888 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-kube-api-access-xqs9b" (OuterVolumeSpecName: "kube-api-access-xqs9b") pod "df83ee2b-e484-4a2a-a619-4e9aadeb19a1" (UID: "df83ee2b-e484-4a2a-a619-4e9aadeb19a1"). InnerVolumeSpecName "kube-api-access-xqs9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.376468 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "df83ee2b-e484-4a2a-a619-4e9aadeb19a1" (UID: "df83ee2b-e484-4a2a-a619-4e9aadeb19a1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.377420 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-scripts" (OuterVolumeSpecName: "scripts") pod "df83ee2b-e484-4a2a-a619-4e9aadeb19a1" (UID: "df83ee2b-e484-4a2a-a619-4e9aadeb19a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.385704 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "39903c9a-2fa6-4fb3-8868-b2c055bdd11c" (UID: "39903c9a-2fa6-4fb3-8868-b2c055bdd11c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.389306 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "39903c9a-2fa6-4fb3-8868-b2c055bdd11c" (UID: "39903c9a-2fa6-4fb3-8868-b2c055bdd11c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.404747 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "39903c9a-2fa6-4fb3-8868-b2c055bdd11c" (UID: "39903c9a-2fa6-4fb3-8868-b2c055bdd11c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.463479 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df83ee2b-e484-4a2a-a619-4e9aadeb19a1" (UID: "df83ee2b-e484-4a2a-a619-4e9aadeb19a1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.468571 4930 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.468596 4930 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.468605 4930 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.468615 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqs9b\" (UniqueName: \"kubernetes.io/projected/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-kube-api-access-xqs9b\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.468624 4930 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39903c9a-2fa6-4fb3-8868-b2c055bdd11c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.468632 4930 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.468639 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.468647 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.480217 4930 generic.go:334] "Generic (PLEG): container finished" podID="39903c9a-2fa6-4fb3-8868-b2c055bdd11c" containerID="108bb256a26485ad47d064e75e6eebb87735e2755bc7af4a1c379f1e6d10355b" exitCode=0 Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.480283 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.480304 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" event={"ID":"39903c9a-2fa6-4fb3-8868-b2c055bdd11c","Type":"ContainerDied","Data":"108bb256a26485ad47d064e75e6eebb87735e2755bc7af4a1c379f1e6d10355b"} Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.480333 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7648c6b969-m7fgp" event={"ID":"39903c9a-2fa6-4fb3-8868-b2c055bdd11c","Type":"ContainerDied","Data":"64db9d9334415a54f565981fde31abf2adda6d6feef6076aa66d8db067199175"} Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.480350 4930 scope.go:117] "RemoveContainer" containerID="108bb256a26485ad47d064e75e6eebb87735e2755bc7af4a1c379f1e6d10355b" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.488717 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"59f4bae4-1a84-449a-be72-e735294116e6","Type":"ContainerStarted","Data":"08f49b3d51ac716588f5237987a07b5659675e827f1272074de03ba6918727cd"} Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.502222 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c","Type":"ContainerStarted","Data":"2330a8f81f829cfc92c56cdbb3d5729dd0bf48d8f9f7220b0f3440e7ce7eee40"} Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.502392 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c","Type":"ContainerStarted","Data":"6077c7bd89312c12d12cf690bc136c749a0bc0dba67844fa5056e36f36f0da44"} Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.515368 4930 scope.go:117] "RemoveContainer" containerID="2e2dc4254cc94c380ea3d9e5b59ed4c5af7e5369610b130b2ff2049d1bf5c49e" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.517337 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.517392 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"df83ee2b-e484-4a2a-a619-4e9aadeb19a1","Type":"ContainerDied","Data":"b4c8b0882b0dbe2c9092bf447bef37dbcc054a544fa3c0d20011e40b2a894d55"} Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.517936 4930 generic.go:334] "Generic (PLEG): container finished" podID="df83ee2b-e484-4a2a-a619-4e9aadeb19a1" containerID="b4c8b0882b0dbe2c9092bf447bef37dbcc054a544fa3c0d20011e40b2a894d55" exitCode=0 Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.517989 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"df83ee2b-e484-4a2a-a619-4e9aadeb19a1","Type":"ContainerDied","Data":"266ddf63427777ee248c4be7892a17fc87f86426387f54bfd5a0a14a00153de4"} Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.526921 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-config-data" (OuterVolumeSpecName: "config-data") pod "df83ee2b-e484-4a2a-a619-4e9aadeb19a1" (UID: "df83ee2b-e484-4a2a-a619-4e9aadeb19a1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.527702 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.538323 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7648c6b969-m7fgp"] Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.538423 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.546927 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7648c6b969-m7fgp"] Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.548146 4930 scope.go:117] "RemoveContainer" containerID="108bb256a26485ad47d064e75e6eebb87735e2755bc7af4a1c379f1e6d10355b" Oct 12 05:59:54 crc kubenswrapper[4930]: E1012 05:59:54.553013 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"108bb256a26485ad47d064e75e6eebb87735e2755bc7af4a1c379f1e6d10355b\": container with ID starting with 108bb256a26485ad47d064e75e6eebb87735e2755bc7af4a1c379f1e6d10355b not found: ID does not exist" containerID="108bb256a26485ad47d064e75e6eebb87735e2755bc7af4a1c379f1e6d10355b" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.553069 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"108bb256a26485ad47d064e75e6eebb87735e2755bc7af4a1c379f1e6d10355b"} err="failed to get container status \"108bb256a26485ad47d064e75e6eebb87735e2755bc7af4a1c379f1e6d10355b\": rpc error: code = NotFound desc = could not find container \"108bb256a26485ad47d064e75e6eebb87735e2755bc7af4a1c379f1e6d10355b\": container with ID starting with 108bb256a26485ad47d064e75e6eebb87735e2755bc7af4a1c379f1e6d10355b not found: ID does not exist" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.553120 4930 scope.go:117] "RemoveContainer" containerID="2e2dc4254cc94c380ea3d9e5b59ed4c5af7e5369610b130b2ff2049d1bf5c49e" Oct 12 05:59:54 crc kubenswrapper[4930]: E1012 05:59:54.553958 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e2dc4254cc94c380ea3d9e5b59ed4c5af7e5369610b130b2ff2049d1bf5c49e\": container with ID starting with 2e2dc4254cc94c380ea3d9e5b59ed4c5af7e5369610b130b2ff2049d1bf5c49e not found: ID does not exist" containerID="2e2dc4254cc94c380ea3d9e5b59ed4c5af7e5369610b130b2ff2049d1bf5c49e" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.553989 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e2dc4254cc94c380ea3d9e5b59ed4c5af7e5369610b130b2ff2049d1bf5c49e"} err="failed to get container status \"2e2dc4254cc94c380ea3d9e5b59ed4c5af7e5369610b130b2ff2049d1bf5c49e\": rpc error: code = NotFound desc = could not find container \"2e2dc4254cc94c380ea3d9e5b59ed4c5af7e5369610b130b2ff2049d1bf5c49e\": container with ID starting with 2e2dc4254cc94c380ea3d9e5b59ed4c5af7e5369610b130b2ff2049d1bf5c49e not found: ID does not exist" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.554006 4930 scope.go:117] "RemoveContainer" containerID="d0f8c31aa00150932c27bc723fef563ffb08cd71345032e256c141692453941f" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.573791 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/df83ee2b-e484-4a2a-a619-4e9aadeb19a1-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.590282 4930 scope.go:117] "RemoveContainer" containerID="b4c8b0882b0dbe2c9092bf447bef37dbcc054a544fa3c0d20011e40b2a894d55" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.606857 4930 scope.go:117] "RemoveContainer" containerID="d0f8c31aa00150932c27bc723fef563ffb08cd71345032e256c141692453941f" Oct 12 05:59:54 crc kubenswrapper[4930]: E1012 05:59:54.607242 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0f8c31aa00150932c27bc723fef563ffb08cd71345032e256c141692453941f\": container with ID starting with d0f8c31aa00150932c27bc723fef563ffb08cd71345032e256c141692453941f not found: ID does not exist" containerID="d0f8c31aa00150932c27bc723fef563ffb08cd71345032e256c141692453941f" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.607274 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0f8c31aa00150932c27bc723fef563ffb08cd71345032e256c141692453941f"} err="failed to get container status \"d0f8c31aa00150932c27bc723fef563ffb08cd71345032e256c141692453941f\": rpc error: code = NotFound desc = could not find container \"d0f8c31aa00150932c27bc723fef563ffb08cd71345032e256c141692453941f\": container with ID starting with d0f8c31aa00150932c27bc723fef563ffb08cd71345032e256c141692453941f not found: ID does not exist" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.607294 4930 scope.go:117] "RemoveContainer" containerID="b4c8b0882b0dbe2c9092bf447bef37dbcc054a544fa3c0d20011e40b2a894d55" Oct 12 05:59:54 crc kubenswrapper[4930]: E1012 05:59:54.607462 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4c8b0882b0dbe2c9092bf447bef37dbcc054a544fa3c0d20011e40b2a894d55\": container with ID starting with b4c8b0882b0dbe2c9092bf447bef37dbcc054a544fa3c0d20011e40b2a894d55 not found: ID does not exist" containerID="b4c8b0882b0dbe2c9092bf447bef37dbcc054a544fa3c0d20011e40b2a894d55" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.607480 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4c8b0882b0dbe2c9092bf447bef37dbcc054a544fa3c0d20011e40b2a894d55"} err="failed to get container status \"b4c8b0882b0dbe2c9092bf447bef37dbcc054a544fa3c0d20011e40b2a894d55\": rpc error: code = NotFound desc = could not find container \"b4c8b0882b0dbe2c9092bf447bef37dbcc054a544fa3c0d20011e40b2a894d55\": container with ID starting with b4c8b0882b0dbe2c9092bf447bef37dbcc054a544fa3c0d20011e40b2a894d55 not found: ID does not exist" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.850827 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.857319 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.880386 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 12 05:59:54 crc kubenswrapper[4930]: E1012 05:59:54.880770 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39903c9a-2fa6-4fb3-8868-b2c055bdd11c" containerName="init" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.880789 4930 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="39903c9a-2fa6-4fb3-8868-b2c055bdd11c" containerName="init" Oct 12 05:59:54 crc kubenswrapper[4930]: E1012 05:59:54.880809 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df83ee2b-e484-4a2a-a619-4e9aadeb19a1" containerName="cinder-scheduler" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.880815 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="df83ee2b-e484-4a2a-a619-4e9aadeb19a1" containerName="cinder-scheduler" Oct 12 05:59:54 crc kubenswrapper[4930]: E1012 05:59:54.880824 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39903c9a-2fa6-4fb3-8868-b2c055bdd11c" containerName="dnsmasq-dns" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.880830 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="39903c9a-2fa6-4fb3-8868-b2c055bdd11c" containerName="dnsmasq-dns" Oct 12 05:59:54 crc kubenswrapper[4930]: E1012 05:59:54.880840 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df83ee2b-e484-4a2a-a619-4e9aadeb19a1" containerName="probe" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.880846 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="df83ee2b-e484-4a2a-a619-4e9aadeb19a1" containerName="probe" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.881036 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="df83ee2b-e484-4a2a-a619-4e9aadeb19a1" containerName="cinder-scheduler" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.881048 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="39903c9a-2fa6-4fb3-8868-b2c055bdd11c" containerName="dnsmasq-dns" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.881060 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="df83ee2b-e484-4a2a-a619-4e9aadeb19a1" containerName="probe" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.881985 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.883971 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.940013 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.982872 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c911528-2136-4abe-a716-c75437784628-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6c911528-2136-4abe-a716-c75437784628\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.982947 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c911528-2136-4abe-a716-c75437784628-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6c911528-2136-4abe-a716-c75437784628\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.983070 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6c911528-2136-4abe-a716-c75437784628-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6c911528-2136-4abe-a716-c75437784628\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.983105 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c911528-2136-4abe-a716-c75437784628-scripts\") pod \"cinder-scheduler-0\" (UID: \"6c911528-2136-4abe-a716-c75437784628\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.983266 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c911528-2136-4abe-a716-c75437784628-config-data\") pod \"cinder-scheduler-0\" (UID: \"6c911528-2136-4abe-a716-c75437784628\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:54 crc kubenswrapper[4930]: I1012 05:59:54.983346 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nwt9\" (UniqueName: \"kubernetes.io/projected/6c911528-2136-4abe-a716-c75437784628-kube-api-access-4nwt9\") pod \"cinder-scheduler-0\" (UID: \"6c911528-2136-4abe-a716-c75437784628\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:55 crc kubenswrapper[4930]: I1012 05:59:55.085649 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6c911528-2136-4abe-a716-c75437784628-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6c911528-2136-4abe-a716-c75437784628\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:55 crc kubenswrapper[4930]: I1012 05:59:55.085706 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c911528-2136-4abe-a716-c75437784628-scripts\") pod \"cinder-scheduler-0\" (UID: \"6c911528-2136-4abe-a716-c75437784628\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:55 crc kubenswrapper[4930]: I1012 05:59:55.085753 4930 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c911528-2136-4abe-a716-c75437784628-config-data\") pod \"cinder-scheduler-0\" (UID: \"6c911528-2136-4abe-a716-c75437784628\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:55 crc kubenswrapper[4930]: I1012 05:59:55.085779 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nwt9\" (UniqueName: \"kubernetes.io/projected/6c911528-2136-4abe-a716-c75437784628-kube-api-access-4nwt9\") pod \"cinder-scheduler-0\" (UID: \"6c911528-2136-4abe-a716-c75437784628\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:55 crc kubenswrapper[4930]: I1012 05:59:55.085847 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c911528-2136-4abe-a716-c75437784628-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6c911528-2136-4abe-a716-c75437784628\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:55 crc kubenswrapper[4930]: I1012 05:59:55.085878 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c911528-2136-4abe-a716-c75437784628-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6c911528-2136-4abe-a716-c75437784628\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:55 crc kubenswrapper[4930]: I1012 05:59:55.085869 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6c911528-2136-4abe-a716-c75437784628-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6c911528-2136-4abe-a716-c75437784628\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:55 crc kubenswrapper[4930]: I1012 05:59:55.090257 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c911528-2136-4abe-a716-c75437784628-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6c911528-2136-4abe-a716-c75437784628\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:55 crc kubenswrapper[4930]: I1012 05:59:55.091443 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c911528-2136-4abe-a716-c75437784628-config-data\") pod \"cinder-scheduler-0\" (UID: \"6c911528-2136-4abe-a716-c75437784628\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:55 crc kubenswrapper[4930]: I1012 05:59:55.101697 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c911528-2136-4abe-a716-c75437784628-scripts\") pod \"cinder-scheduler-0\" (UID: \"6c911528-2136-4abe-a716-c75437784628\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:55 crc kubenswrapper[4930]: I1012 05:59:55.102012 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c911528-2136-4abe-a716-c75437784628-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6c911528-2136-4abe-a716-c75437784628\") " pod="openstack/cinder-scheduler-0" Oct 12 05:59:55 crc kubenswrapper[4930]: I1012 05:59:55.108240 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nwt9\" (UniqueName: \"kubernetes.io/projected/6c911528-2136-4abe-a716-c75437784628-kube-api-access-4nwt9\") pod \"cinder-scheduler-0\" (UID: \"6c911528-2136-4abe-a716-c75437784628\") " 
pod="openstack/cinder-scheduler-0" Oct 12 05:59:55 crc kubenswrapper[4930]: I1012 05:59:55.218302 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 12 05:59:55 crc kubenswrapper[4930]: I1012 05:59:55.227364 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 12 05:59:55 crc kubenswrapper[4930]: I1012 05:59:55.532761 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c","Type":"ContainerStarted","Data":"d63fc718d21b66de4049176761e68d69ef1fbc8b48a25789af68d6eb85a65f4d"} Oct 12 05:59:55 crc kubenswrapper[4930]: I1012 05:59:55.543694 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 12 05:59:55 crc kubenswrapper[4930]: I1012 05:59:55.638346 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-546b85cb56-ln9lt" Oct 12 05:59:55 crc kubenswrapper[4930]: I1012 05:59:55.788084 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 12 05:59:56 crc kubenswrapper[4930]: I1012 05:59:56.153352 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39903c9a-2fa6-4fb3-8868-b2c055bdd11c" path="/var/lib/kubelet/pods/39903c9a-2fa6-4fb3-8868-b2c055bdd11c/volumes" Oct 12 05:59:56 crc kubenswrapper[4930]: I1012 05:59:56.161680 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df83ee2b-e484-4a2a-a619-4e9aadeb19a1" path="/var/lib/kubelet/pods/df83ee2b-e484-4a2a-a619-4e9aadeb19a1/volumes" Oct 12 05:59:56 crc kubenswrapper[4930]: I1012 05:59:56.565976 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6c911528-2136-4abe-a716-c75437784628","Type":"ContainerStarted","Data":"18dc184b1e5060892474e9008e7aeb45cef51df5199664c0fd9cece9257a16fe"} Oct 12 05:59:56 crc kubenswrapper[4930]: I1012 05:59:56.566022 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6c911528-2136-4abe-a716-c75437784628","Type":"ContainerStarted","Data":"29a317ea87c15658ed4d293fb632aaa497d80aeb895dcf2972313e71fab320cf"} Oct 12 05:59:56 crc kubenswrapper[4930]: I1012 05:59:56.606189 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.918268297 podStartE2EDuration="5.606166239s" podCreationTimestamp="2025-10-12 05:59:51 +0000 UTC" firstStartedPulling="2025-10-12 05:59:52.58854574 +0000 UTC m=+1125.130647505" lastFinishedPulling="2025-10-12 05:59:55.276443692 +0000 UTC m=+1127.818545447" observedRunningTime="2025-10-12 05:59:56.592903679 +0000 UTC m=+1129.135005444" watchObservedRunningTime="2025-10-12 05:59:56.606166239 +0000 UTC m=+1129.148268004" Oct 12 05:59:57 crc kubenswrapper[4930]: I1012 05:59:57.576065 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6c911528-2136-4abe-a716-c75437784628","Type":"ContainerStarted","Data":"e19b892b9e7cfa85e9752a410e1c85a20d04a51c2f919b650d33e30737ec29bb"} Oct 12 05:59:58 crc kubenswrapper[4930]: I1012 05:59:58.588987 4930 generic.go:334] "Generic (PLEG): container finished" podID="59f4bae4-1a84-449a-be72-e735294116e6" containerID="08f49b3d51ac716588f5237987a07b5659675e827f1272074de03ba6918727cd" exitCode=1 Oct 12 05:59:58 crc kubenswrapper[4930]: I1012 05:59:58.589024 4930 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"59f4bae4-1a84-449a-be72-e735294116e6","Type":"ContainerDied","Data":"08f49b3d51ac716588f5237987a07b5659675e827f1272074de03ba6918727cd"} Oct 12 05:59:58 crc kubenswrapper[4930]: I1012 05:59:58.589303 4930 scope.go:117] "RemoveContainer" containerID="893b8e00eaeb9c339e5a9528038e3102dd99c50f64a2531fff869300585ed13a" Oct 12 05:59:58 crc kubenswrapper[4930]: I1012 05:59:58.590563 4930 scope.go:117] "RemoveContainer" containerID="08f49b3d51ac716588f5237987a07b5659675e827f1272074de03ba6918727cd" Oct 12 05:59:58 crc kubenswrapper[4930]: E1012 05:59:58.590993 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(59f4bae4-1a84-449a-be72-e735294116e6)\"" pod="openstack/watcher-decision-engine-0" podUID="59f4bae4-1a84-449a-be72-e735294116e6" Oct 12 05:59:58 crc kubenswrapper[4930]: I1012 05:59:58.614635 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.614613784 podStartE2EDuration="4.614613784s" podCreationTimestamp="2025-10-12 05:59:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 05:59:57.598558277 +0000 UTC m=+1130.140660052" watchObservedRunningTime="2025-10-12 05:59:58.614613784 +0000 UTC m=+1131.156715549" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.085354 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.086682 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.113008 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.113055 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.125808 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.139833 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.153275 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.166402 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.531223 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.532325 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.534304 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-446n8" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.534487 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.534898 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.556108 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.602887 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.602920 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.602933 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.602941 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.683405 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1975f875-9e09-4d30-b5d4-2e883f13781b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1975f875-9e09-4d30-b5d4-2e883f13781b\") " pod="openstack/openstackclient" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.684254 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xn56\" (UniqueName: \"kubernetes.io/projected/1975f875-9e09-4d30-b5d4-2e883f13781b-kube-api-access-2xn56\") pod \"openstackclient\" (UID: \"1975f875-9e09-4d30-b5d4-2e883f13781b\") " pod="openstack/openstackclient" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.684320 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1975f875-9e09-4d30-b5d4-2e883f13781b-openstack-config-secret\") pod \"openstackclient\" (UID: \"1975f875-9e09-4d30-b5d4-2e883f13781b\") " pod="openstack/openstackclient" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.684388 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1975f875-9e09-4d30-b5d4-2e883f13781b-openstack-config\") pod \"openstackclient\" (UID: \"1975f875-9e09-4d30-b5d4-2e883f13781b\") " pod="openstack/openstackclient" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.755245 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7d8c9db847-bqfrb"] Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.756677 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.760534 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.760853 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.760951 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.777012 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7d8c9db847-bqfrb"] Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.786118 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xn56\" (UniqueName: \"kubernetes.io/projected/1975f875-9e09-4d30-b5d4-2e883f13781b-kube-api-access-2xn56\") pod \"openstackclient\" (UID: \"1975f875-9e09-4d30-b5d4-2e883f13781b\") " pod="openstack/openstackclient" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.786153 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1975f875-9e09-4d30-b5d4-2e883f13781b-openstack-config-secret\") pod \"openstackclient\" (UID: \"1975f875-9e09-4d30-b5d4-2e883f13781b\") " pod="openstack/openstackclient" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.786197 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1975f875-9e09-4d30-b5d4-2e883f13781b-openstack-config\") pod \"openstackclient\" (UID: \"1975f875-9e09-4d30-b5d4-2e883f13781b\") " pod="openstack/openstackclient" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.786267 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1975f875-9e09-4d30-b5d4-2e883f13781b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1975f875-9e09-4d30-b5d4-2e883f13781b\") " pod="openstack/openstackclient" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.788051 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1975f875-9e09-4d30-b5d4-2e883f13781b-openstack-config\") pod \"openstackclient\" (UID: \"1975f875-9e09-4d30-b5d4-2e883f13781b\") " pod="openstack/openstackclient" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.791705 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1975f875-9e09-4d30-b5d4-2e883f13781b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1975f875-9e09-4d30-b5d4-2e883f13781b\") " pod="openstack/openstackclient" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.801296 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1975f875-9e09-4d30-b5d4-2e883f13781b-openstack-config-secret\") pod \"openstackclient\" (UID: \"1975f875-9e09-4d30-b5d4-2e883f13781b\") " pod="openstack/openstackclient" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.817251 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xn56\" (UniqueName: 
\"kubernetes.io/projected/1975f875-9e09-4d30-b5d4-2e883f13781b-kube-api-access-2xn56\") pod \"openstackclient\" (UID: \"1975f875-9e09-4d30-b5d4-2e883f13781b\") " pod="openstack/openstackclient" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.853503 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.887895 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qvt2\" (UniqueName: \"kubernetes.io/projected/4ed14594-beb5-4ce3-bf04-4a9299a932be-kube-api-access-8qvt2\") pod \"swift-proxy-7d8c9db847-bqfrb\" (UID: \"4ed14594-beb5-4ce3-bf04-4a9299a932be\") " pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.888009 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ed14594-beb5-4ce3-bf04-4a9299a932be-log-httpd\") pod \"swift-proxy-7d8c9db847-bqfrb\" (UID: \"4ed14594-beb5-4ce3-bf04-4a9299a932be\") " pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.888029 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4ed14594-beb5-4ce3-bf04-4a9299a932be-etc-swift\") pod \"swift-proxy-7d8c9db847-bqfrb\" (UID: \"4ed14594-beb5-4ce3-bf04-4a9299a932be\") " pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.888053 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ed14594-beb5-4ce3-bf04-4a9299a932be-config-data\") pod \"swift-proxy-7d8c9db847-bqfrb\" (UID: \"4ed14594-beb5-4ce3-bf04-4a9299a932be\") " pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.888074 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed14594-beb5-4ce3-bf04-4a9299a932be-combined-ca-bundle\") pod \"swift-proxy-7d8c9db847-bqfrb\" (UID: \"4ed14594-beb5-4ce3-bf04-4a9299a932be\") " pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.888229 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ed14594-beb5-4ce3-bf04-4a9299a932be-run-httpd\") pod \"swift-proxy-7d8c9db847-bqfrb\" (UID: \"4ed14594-beb5-4ce3-bf04-4a9299a932be\") " pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.888281 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ed14594-beb5-4ce3-bf04-4a9299a932be-public-tls-certs\") pod \"swift-proxy-7d8c9db847-bqfrb\" (UID: \"4ed14594-beb5-4ce3-bf04-4a9299a932be\") " pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.888406 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ed14594-beb5-4ce3-bf04-4a9299a932be-internal-tls-certs\") pod \"swift-proxy-7d8c9db847-bqfrb\" (UID: 
\"4ed14594-beb5-4ce3-bf04-4a9299a932be\") " pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.990479 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qvt2\" (UniqueName: \"kubernetes.io/projected/4ed14594-beb5-4ce3-bf04-4a9299a932be-kube-api-access-8qvt2\") pod \"swift-proxy-7d8c9db847-bqfrb\" (UID: \"4ed14594-beb5-4ce3-bf04-4a9299a932be\") " pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.990609 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ed14594-beb5-4ce3-bf04-4a9299a932be-log-httpd\") pod \"swift-proxy-7d8c9db847-bqfrb\" (UID: \"4ed14594-beb5-4ce3-bf04-4a9299a932be\") " pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.990629 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4ed14594-beb5-4ce3-bf04-4a9299a932be-etc-swift\") pod \"swift-proxy-7d8c9db847-bqfrb\" (UID: \"4ed14594-beb5-4ce3-bf04-4a9299a932be\") " pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.990657 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ed14594-beb5-4ce3-bf04-4a9299a932be-config-data\") pod \"swift-proxy-7d8c9db847-bqfrb\" (UID: \"4ed14594-beb5-4ce3-bf04-4a9299a932be\") " pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.990678 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed14594-beb5-4ce3-bf04-4a9299a932be-combined-ca-bundle\") pod \"swift-proxy-7d8c9db847-bqfrb\" (UID: \"4ed14594-beb5-4ce3-bf04-4a9299a932be\") " pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.990709 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ed14594-beb5-4ce3-bf04-4a9299a932be-run-httpd\") pod \"swift-proxy-7d8c9db847-bqfrb\" (UID: \"4ed14594-beb5-4ce3-bf04-4a9299a932be\") " pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.990729 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ed14594-beb5-4ce3-bf04-4a9299a932be-public-tls-certs\") pod \"swift-proxy-7d8c9db847-bqfrb\" (UID: \"4ed14594-beb5-4ce3-bf04-4a9299a932be\") " pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.990773 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ed14594-beb5-4ce3-bf04-4a9299a932be-internal-tls-certs\") pod \"swift-proxy-7d8c9db847-bqfrb\" (UID: \"4ed14594-beb5-4ce3-bf04-4a9299a932be\") " pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.992380 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ed14594-beb5-4ce3-bf04-4a9299a932be-run-httpd\") pod \"swift-proxy-7d8c9db847-bqfrb\" (UID: \"4ed14594-beb5-4ce3-bf04-4a9299a932be\") " 
pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 05:59:59 crc kubenswrapper[4930]: I1012 05:59:59.992458 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ed14594-beb5-4ce3-bf04-4a9299a932be-log-httpd\") pod \"swift-proxy-7d8c9db847-bqfrb\" (UID: \"4ed14594-beb5-4ce3-bf04-4a9299a932be\") " pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.005081 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4ed14594-beb5-4ce3-bf04-4a9299a932be-etc-swift\") pod \"swift-proxy-7d8c9db847-bqfrb\" (UID: \"4ed14594-beb5-4ce3-bf04-4a9299a932be\") " pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.007378 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ed14594-beb5-4ce3-bf04-4a9299a932be-public-tls-certs\") pod \"swift-proxy-7d8c9db847-bqfrb\" (UID: \"4ed14594-beb5-4ce3-bf04-4a9299a932be\") " pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.008031 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ed14594-beb5-4ce3-bf04-4a9299a932be-internal-tls-certs\") pod \"swift-proxy-7d8c9db847-bqfrb\" (UID: \"4ed14594-beb5-4ce3-bf04-4a9299a932be\") " pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.013190 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed14594-beb5-4ce3-bf04-4a9299a932be-combined-ca-bundle\") pod \"swift-proxy-7d8c9db847-bqfrb\" (UID: \"4ed14594-beb5-4ce3-bf04-4a9299a932be\") " pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.032017 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ed14594-beb5-4ce3-bf04-4a9299a932be-config-data\") pod \"swift-proxy-7d8c9db847-bqfrb\" (UID: \"4ed14594-beb5-4ce3-bf04-4a9299a932be\") " pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.043321 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qvt2\" (UniqueName: \"kubernetes.io/projected/4ed14594-beb5-4ce3-bf04-4a9299a932be-kube-api-access-8qvt2\") pod \"swift-proxy-7d8c9db847-bqfrb\" (UID: \"4ed14594-beb5-4ce3-bf04-4a9299a932be\") " pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.074564 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.193565 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337480-zt7xb"] Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.194972 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337480-zt7xb" Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.198821 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.198896 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.205467 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337480-zt7xb"] Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.229088 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.295034 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5518b25e-e018-492d-926b-37943813b054-secret-volume\") pod \"collect-profiles-29337480-zt7xb\" (UID: \"5518b25e-e018-492d-926b-37943813b054\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337480-zt7xb" Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.295177 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5518b25e-e018-492d-926b-37943813b054-config-volume\") pod \"collect-profiles-29337480-zt7xb\" (UID: \"5518b25e-e018-492d-926b-37943813b054\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337480-zt7xb" Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.295225 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgjr4\" (UniqueName: \"kubernetes.io/projected/5518b25e-e018-492d-926b-37943813b054-kube-api-access-lgjr4\") pod \"collect-profiles-29337480-zt7xb\" (UID: \"5518b25e-e018-492d-926b-37943813b054\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337480-zt7xb" Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.368112 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.397016 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5518b25e-e018-492d-926b-37943813b054-secret-volume\") pod \"collect-profiles-29337480-zt7xb\" (UID: \"5518b25e-e018-492d-926b-37943813b054\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337480-zt7xb" Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.397139 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5518b25e-e018-492d-926b-37943813b054-config-volume\") pod \"collect-profiles-29337480-zt7xb\" (UID: \"5518b25e-e018-492d-926b-37943813b054\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337480-zt7xb" Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.397177 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgjr4\" (UniqueName: \"kubernetes.io/projected/5518b25e-e018-492d-926b-37943813b054-kube-api-access-lgjr4\") pod \"collect-profiles-29337480-zt7xb\" (UID: \"5518b25e-e018-492d-926b-37943813b054\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29337480-zt7xb" Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.398699 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5518b25e-e018-492d-926b-37943813b054-config-volume\") pod \"collect-profiles-29337480-zt7xb\" (UID: \"5518b25e-e018-492d-926b-37943813b054\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337480-zt7xb" Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.409503 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5518b25e-e018-492d-926b-37943813b054-secret-volume\") pod \"collect-profiles-29337480-zt7xb\" (UID: \"5518b25e-e018-492d-926b-37943813b054\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337480-zt7xb" Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.415801 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgjr4\" (UniqueName: \"kubernetes.io/projected/5518b25e-e018-492d-926b-37943813b054-kube-api-access-lgjr4\") pod \"collect-profiles-29337480-zt7xb\" (UID: \"5518b25e-e018-492d-926b-37943813b054\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337480-zt7xb" Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.514695 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337480-zt7xb" Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.624516 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"1975f875-9e09-4d30-b5d4-2e883f13781b","Type":"ContainerStarted","Data":"7f82e3acd62c3858b1dbe1e1d4763125fd2ec7d1621c1584b17233e278e23a25"} Oct 12 06:00:00 crc kubenswrapper[4930]: I1012 06:00:00.704192 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7d8c9db847-bqfrb"] Oct 12 06:00:00 crc kubenswrapper[4930]: W1012 06:00:00.709215 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ed14594_beb5_4ce3_bf04_4a9299a932be.slice/crio-0796a1762e90fb1e8be8280a92bbf86e9ea98dac4f59eb864bbb6e3127a53085 WatchSource:0}: Error finding container 0796a1762e90fb1e8be8280a92bbf86e9ea98dac4f59eb864bbb6e3127a53085: Status 404 returned error can't find the container with id 0796a1762e90fb1e8be8280a92bbf86e9ea98dac4f59eb864bbb6e3127a53085 Oct 12 06:00:01 crc kubenswrapper[4930]: I1012 06:00:01.039041 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337480-zt7xb"] Oct 12 06:00:01 crc kubenswrapper[4930]: I1012 06:00:01.189293 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6778cd8bb8-9zhz5" podUID="30049dfb-04f2-455c-a949-08bd6ff892d0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.161:8443: connect: connection refused" Oct 12 06:00:01 crc kubenswrapper[4930]: I1012 06:00:01.659445 4930 generic.go:334] "Generic (PLEG): container finished" podID="5518b25e-e018-492d-926b-37943813b054" containerID="2ff1d3b5d7d499248e998ca9cbe4ffc2ee3adc45747c49d8875a4ddf19d940bc" exitCode=0 Oct 12 06:00:01 crc kubenswrapper[4930]: I1012 06:00:01.659775 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29337480-zt7xb" event={"ID":"5518b25e-e018-492d-926b-37943813b054","Type":"ContainerDied","Data":"2ff1d3b5d7d499248e998ca9cbe4ffc2ee3adc45747c49d8875a4ddf19d940bc"} Oct 12 06:00:01 crc kubenswrapper[4930]: I1012 06:00:01.659818 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337480-zt7xb" event={"ID":"5518b25e-e018-492d-926b-37943813b054","Type":"ContainerStarted","Data":"c12a7944a0286d8dd56f96a2f00f3376fb079658bfed210fe6160dbdd3b55269"} Oct 12 06:00:01 crc kubenswrapper[4930]: I1012 06:00:01.710362 4930 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 12 06:00:01 crc kubenswrapper[4930]: I1012 06:00:01.710389 4930 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 12 06:00:01 crc kubenswrapper[4930]: I1012 06:00:01.710828 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7d8c9db847-bqfrb" event={"ID":"4ed14594-beb5-4ce3-bf04-4a9299a932be","Type":"ContainerStarted","Data":"34dff7b871d7c75cbde302a5ee28e52856959e3ed9da99b3af1304fa63257822"} Oct 12 06:00:01 crc kubenswrapper[4930]: I1012 06:00:01.710908 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7d8c9db847-bqfrb" event={"ID":"4ed14594-beb5-4ce3-bf04-4a9299a932be","Type":"ContainerStarted","Data":"af258b0b86d429b2cee23de97fb0cffd7cf006b3970356122e9e2b3022cceca8"} Oct 12 06:00:01 crc kubenswrapper[4930]: I1012 06:00:01.710918 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7d8c9db847-bqfrb" event={"ID":"4ed14594-beb5-4ce3-bf04-4a9299a932be","Type":"ContainerStarted","Data":"0796a1762e90fb1e8be8280a92bbf86e9ea98dac4f59eb864bbb6e3127a53085"} Oct 12 06:00:01 crc kubenswrapper[4930]: I1012 06:00:01.710954 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 06:00:01 crc kubenswrapper[4930]: I1012 06:00:01.710974 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 06:00:01 crc kubenswrapper[4930]: I1012 06:00:01.740745 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7d8c9db847-bqfrb" podStartSLOduration=2.740710686 podStartE2EDuration="2.740710686s" podCreationTimestamp="2025-10-12 05:59:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:00:01.73325161 +0000 UTC m=+1134.275353375" watchObservedRunningTime="2025-10-12 06:00:01.740710686 +0000 UTC m=+1134.282812451" Oct 12 06:00:02 crc kubenswrapper[4930]: I1012 06:00:02.224360 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:00:02 crc kubenswrapper[4930]: I1012 06:00:02.225039 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" containerName="ceilometer-central-agent" containerID="cri-o://6460600dc2240bf70a42d8ce5d6ad8f89e6a182400f65215964547911db1e3c1" gracePeriod=30 Oct 12 06:00:02 crc kubenswrapper[4930]: I1012 06:00:02.225223 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" containerName="proxy-httpd" containerID="cri-o://d63fc718d21b66de4049176761e68d69ef1fbc8b48a25789af68d6eb85a65f4d" 
gracePeriod=30 Oct 12 06:00:02 crc kubenswrapper[4930]: I1012 06:00:02.225325 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" containerName="sg-core" containerID="cri-o://2330a8f81f829cfc92c56cdbb3d5729dd0bf48d8f9f7220b0f3440e7ce7eee40" gracePeriod=30 Oct 12 06:00:02 crc kubenswrapper[4930]: I1012 06:00:02.225046 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 12 06:00:02 crc kubenswrapper[4930]: I1012 06:00:02.225514 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" containerName="ceilometer-notification-agent" containerID="cri-o://6077c7bd89312c12d12cf690bc136c749a0bc0dba67844fa5056e36f36f0da44" gracePeriod=30 Oct 12 06:00:02 crc kubenswrapper[4930]: I1012 06:00:02.644650 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 12 06:00:02 crc kubenswrapper[4930]: I1012 06:00:02.644965 4930 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 12 06:00:02 crc kubenswrapper[4930]: I1012 06:00:02.658164 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 12 06:00:02 crc kubenswrapper[4930]: I1012 06:00:02.664134 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 12 06:00:02 crc kubenswrapper[4930]: I1012 06:00:02.673568 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 12 06:00:02 crc kubenswrapper[4930]: I1012 06:00:02.750117 4930 generic.go:334] "Generic (PLEG): container finished" podID="d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" containerID="d63fc718d21b66de4049176761e68d69ef1fbc8b48a25789af68d6eb85a65f4d" exitCode=0 Oct 12 06:00:02 crc kubenswrapper[4930]: I1012 06:00:02.750153 4930 generic.go:334] "Generic (PLEG): container finished" podID="d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" containerID="2330a8f81f829cfc92c56cdbb3d5729dd0bf48d8f9f7220b0f3440e7ce7eee40" exitCode=2 Oct 12 06:00:02 crc kubenswrapper[4930]: I1012 06:00:02.750162 4930 generic.go:334] "Generic (PLEG): container finished" podID="d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" containerID="6460600dc2240bf70a42d8ce5d6ad8f89e6a182400f65215964547911db1e3c1" exitCode=0 Oct 12 06:00:02 crc kubenswrapper[4930]: I1012 06:00:02.751965 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c","Type":"ContainerDied","Data":"d63fc718d21b66de4049176761e68d69ef1fbc8b48a25789af68d6eb85a65f4d"} Oct 12 06:00:02 crc kubenswrapper[4930]: I1012 06:00:02.751995 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c","Type":"ContainerDied","Data":"2330a8f81f829cfc92c56cdbb3d5729dd0bf48d8f9f7220b0f3440e7ce7eee40"} Oct 12 06:00:02 crc kubenswrapper[4930]: I1012 06:00:02.752004 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c","Type":"ContainerDied","Data":"6460600dc2240bf70a42d8ce5d6ad8f89e6a182400f65215964547911db1e3c1"} Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.260565 4930 util.go:48] "No ready sandbox for pod can be found. 
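
The "Killing container with a grace period" entries above show the normal stop path: the runtime delivers SIGTERM, waits up to the grace period (30s here), and only escalates to SIGKILL if the process outlives it. The exit codes that follow are consistent with that: ceilometer-central-agent and proxy-httpd exit 0 (clean shutdown on TERM), sg-core exits 2 (its own error path), and a container that ignores TERM surfaces later in this log as exitCode=137, i.e. 128 + SIGKILL(9). A minimal Go sketch of that stop contract, illustrative only and independent of kubelet's actual implementation:

    package main

    import (
        "fmt"
        "os/exec"
        "syscall"
        "time"
    )

    // stopWithGrace mimics the stop contract seen in the log: SIGTERM first,
    // SIGKILL only if the process outlives the grace period.
    func stopWithGrace(cmd *exec.Cmd, grace time.Duration) {
        _ = cmd.Process.Signal(syscall.SIGTERM) // polite stop, like gracePeriod=30 above
        done := make(chan error, 1)
        go func() { done <- cmd.Wait() }()
        select {
        case <-done:
            fmt.Println("stopped within grace period (exit 0/2 in the log)")
        case <-time.After(grace):
            _ = cmd.Process.Kill() // delivered as SIGKILL -> exit code 137
            <-done
        }
    }

    func main() {
        cmd := exec.Command("sleep", "60")
        if err := cmd.Start(); err != nil {
            panic(err)
        }
        stopWithGrace(cmd, 2*time.Second)
    }
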
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337480-zt7xb" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.365981 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgjr4\" (UniqueName: \"kubernetes.io/projected/5518b25e-e018-492d-926b-37943813b054-kube-api-access-lgjr4\") pod \"5518b25e-e018-492d-926b-37943813b054\" (UID: \"5518b25e-e018-492d-926b-37943813b054\") " Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.366113 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5518b25e-e018-492d-926b-37943813b054-config-volume\") pod \"5518b25e-e018-492d-926b-37943813b054\" (UID: \"5518b25e-e018-492d-926b-37943813b054\") " Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.366159 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5518b25e-e018-492d-926b-37943813b054-secret-volume\") pod \"5518b25e-e018-492d-926b-37943813b054\" (UID: \"5518b25e-e018-492d-926b-37943813b054\") " Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.367485 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5518b25e-e018-492d-926b-37943813b054-config-volume" (OuterVolumeSpecName: "config-volume") pod "5518b25e-e018-492d-926b-37943813b054" (UID: "5518b25e-e018-492d-926b-37943813b054"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.372926 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5518b25e-e018-492d-926b-37943813b054-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5518b25e-e018-492d-926b-37943813b054" (UID: "5518b25e-e018-492d-926b-37943813b054"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.372958 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5518b25e-e018-492d-926b-37943813b054-kube-api-access-lgjr4" (OuterVolumeSpecName: "kube-api-access-lgjr4") pod "5518b25e-e018-492d-926b-37943813b054" (UID: "5518b25e-e018-492d-926b-37943813b054"). InnerVolumeSpecName "kube-api-access-lgjr4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.469017 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgjr4\" (UniqueName: \"kubernetes.io/projected/5518b25e-e018-492d-926b-37943813b054-kube-api-access-lgjr4\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.469312 4930 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5518b25e-e018-492d-926b-37943813b054-config-volume\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.469327 4930 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5518b25e-e018-492d-926b-37943813b054-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.668962 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.669018 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.686215 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.686249 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.686680 4930 scope.go:117] "RemoveContainer" containerID="08f49b3d51ac716588f5237987a07b5659675e827f1272074de03ba6918727cd" Oct 12 06:00:03 crc kubenswrapper[4930]: E1012 06:00:03.687020 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(59f4bae4-1a84-449a-be72-e735294116e6)\"" pod="openstack/watcher-decision-engine-0" podUID="59f4bae4-1a84-449a-be72-e735294116e6" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.772221 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337480-zt7xb" event={"ID":"5518b25e-e018-492d-926b-37943813b054","Type":"ContainerDied","Data":"c12a7944a0286d8dd56f96a2f00f3376fb079658bfed210fe6160dbdd3b55269"} Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.772269 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c12a7944a0286d8dd56f96a2f00f3376fb079658bfed210fe6160dbdd3b55269" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.772345 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337480-zt7xb" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.783616 4930 generic.go:334] "Generic (PLEG): container finished" podID="d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" containerID="6077c7bd89312c12d12cf690bc136c749a0bc0dba67844fa5056e36f36f0da44" exitCode=0 Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.783875 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c","Type":"ContainerDied","Data":"6077c7bd89312c12d12cf690bc136c749a0bc0dba67844fa5056e36f36f0da44"} Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.784021 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c","Type":"ContainerDied","Data":"a0320fd981abb5646b2520e45339c6fe9e7c6f2627aaa11ef7a8bdd43eb2534c"} Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.784116 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0320fd981abb5646b2520e45339c6fe9e7c6f2627aaa11ef7a8bdd43eb2534c" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.788708 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.879320 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nrjc\" (UniqueName: \"kubernetes.io/projected/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-kube-api-access-9nrjc\") pod \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.879363 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-config-data\") pod \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.879432 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-log-httpd\") pod \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.879548 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-combined-ca-bundle\") pod \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.879587 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-run-httpd\") pod \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.879622 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-scripts\") pod \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.879690 4930 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-sg-core-conf-yaml\") pod \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\" (UID: \"d2b3aab6-b6ca-44a6-92e9-50e80109fb0c\") " Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.882856 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" (UID: "d2b3aab6-b6ca-44a6-92e9-50e80109fb0c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.883006 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" (UID: "d2b3aab6-b6ca-44a6-92e9-50e80109fb0c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.884900 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-scripts" (OuterVolumeSpecName: "scripts") pod "d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" (UID: "d2b3aab6-b6ca-44a6-92e9-50e80109fb0c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.887302 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-kube-api-access-9nrjc" (OuterVolumeSpecName: "kube-api-access-9nrjc") pod "d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" (UID: "d2b3aab6-b6ca-44a6-92e9-50e80109fb0c"). InnerVolumeSpecName "kube-api-access-9nrjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.913389 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" (UID: "d2b3aab6-b6ca-44a6-92e9-50e80109fb0c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.969537 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" (UID: "d2b3aab6-b6ca-44a6-92e9-50e80109fb0c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.986270 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nrjc\" (UniqueName: \"kubernetes.io/projected/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-kube-api-access-9nrjc\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.986296 4930 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.986306 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.986314 4930 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.986322 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.986330 4930 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:03 crc kubenswrapper[4930]: I1012 06:00:03.992773 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-config-data" (OuterVolumeSpecName: "config-data") pod "d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" (UID: "d2b3aab6-b6ca-44a6-92e9-50e80109fb0c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:04 crc kubenswrapper[4930]: I1012 06:00:04.088343 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:04 crc kubenswrapper[4930]: I1012 06:00:04.791606 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 06:00:04 crc kubenswrapper[4930]: I1012 06:00:04.819413 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:00:04 crc kubenswrapper[4930]: I1012 06:00:04.826800 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:00:04 crc kubenswrapper[4930]: I1012 06:00:04.849613 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:00:04 crc kubenswrapper[4930]: E1012 06:00:04.850178 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" containerName="ceilometer-central-agent" Oct 12 06:00:04 crc kubenswrapper[4930]: I1012 06:00:04.850247 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" containerName="ceilometer-central-agent" Oct 12 06:00:04 crc kubenswrapper[4930]: E1012 06:00:04.850311 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" containerName="proxy-httpd" Oct 12 06:00:04 crc kubenswrapper[4930]: I1012 06:00:04.850366 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" containerName="proxy-httpd" Oct 12 06:00:04 crc kubenswrapper[4930]: E1012 06:00:04.850441 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" containerName="ceilometer-notification-agent" Oct 12 06:00:04 crc kubenswrapper[4930]: I1012 06:00:04.850498 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" containerName="ceilometer-notification-agent" Oct 12 06:00:04 crc kubenswrapper[4930]: E1012 06:00:04.850663 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5518b25e-e018-492d-926b-37943813b054" containerName="collect-profiles" Oct 12 06:00:04 crc kubenswrapper[4930]: I1012 06:00:04.850722 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="5518b25e-e018-492d-926b-37943813b054" containerName="collect-profiles" Oct 12 06:00:04 crc kubenswrapper[4930]: E1012 06:00:04.850811 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" containerName="sg-core" Oct 12 06:00:04 crc kubenswrapper[4930]: I1012 06:00:04.850883 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" containerName="sg-core" Oct 12 06:00:04 crc kubenswrapper[4930]: I1012 06:00:04.851107 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" containerName="ceilometer-central-agent" Oct 12 06:00:04 crc kubenswrapper[4930]: I1012 06:00:04.851167 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="5518b25e-e018-492d-926b-37943813b054" containerName="collect-profiles" Oct 12 06:00:04 crc kubenswrapper[4930]: I1012 06:00:04.851238 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" containerName="sg-core" Oct 12 06:00:04 crc kubenswrapper[4930]: I1012 06:00:04.851293 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" containerName="proxy-httpd" Oct 12 06:00:04 crc kubenswrapper[4930]: I1012 06:00:04.851352 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" containerName="ceilometer-notification-agent" Oct 12 06:00:04 crc 
kubenswrapper[4930]: I1012 06:00:04.853048 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 06:00:04 crc kubenswrapper[4930]: I1012 06:00:04.861123 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 12 06:00:04 crc kubenswrapper[4930]: I1012 06:00:04.861361 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 12 06:00:04 crc kubenswrapper[4930]: I1012 06:00:04.865919 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:00:04 crc kubenswrapper[4930]: I1012 06:00:04.904431 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8d94394-2655-48d0-a53d-7ac1a17f3f71-log-httpd\") pod \"ceilometer-0\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " pod="openstack/ceilometer-0" Oct 12 06:00:04 crc kubenswrapper[4930]: I1012 06:00:04.904701 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8d94394-2655-48d0-a53d-7ac1a17f3f71-run-httpd\") pod \"ceilometer-0\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " pod="openstack/ceilometer-0" Oct 12 06:00:04 crc kubenswrapper[4930]: I1012 06:00:04.904824 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8d94394-2655-48d0-a53d-7ac1a17f3f71-config-data\") pod \"ceilometer-0\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " pod="openstack/ceilometer-0" Oct 12 06:00:04 crc kubenswrapper[4930]: I1012 06:00:04.904967 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d94394-2655-48d0-a53d-7ac1a17f3f71-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " pod="openstack/ceilometer-0" Oct 12 06:00:04 crc kubenswrapper[4930]: I1012 06:00:04.905066 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8d94394-2655-48d0-a53d-7ac1a17f3f71-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " pod="openstack/ceilometer-0" Oct 12 06:00:04 crc kubenswrapper[4930]: I1012 06:00:04.905146 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbd9q\" (UniqueName: \"kubernetes.io/projected/c8d94394-2655-48d0-a53d-7ac1a17f3f71-kube-api-access-xbd9q\") pod \"ceilometer-0\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " pod="openstack/ceilometer-0" Oct 12 06:00:04 crc kubenswrapper[4930]: I1012 06:00:04.905214 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8d94394-2655-48d0-a53d-7ac1a17f3f71-scripts\") pod \"ceilometer-0\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " pod="openstack/ceilometer-0" Oct 12 06:00:05 crc kubenswrapper[4930]: I1012 06:00:05.006913 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8d94394-2655-48d0-a53d-7ac1a17f3f71-config-data\") pod \"ceilometer-0\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " 
pod="openstack/ceilometer-0" Oct 12 06:00:05 crc kubenswrapper[4930]: I1012 06:00:05.007215 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d94394-2655-48d0-a53d-7ac1a17f3f71-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " pod="openstack/ceilometer-0" Oct 12 06:00:05 crc kubenswrapper[4930]: I1012 06:00:05.007321 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8d94394-2655-48d0-a53d-7ac1a17f3f71-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " pod="openstack/ceilometer-0" Oct 12 06:00:05 crc kubenswrapper[4930]: I1012 06:00:05.007401 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbd9q\" (UniqueName: \"kubernetes.io/projected/c8d94394-2655-48d0-a53d-7ac1a17f3f71-kube-api-access-xbd9q\") pod \"ceilometer-0\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " pod="openstack/ceilometer-0" Oct 12 06:00:05 crc kubenswrapper[4930]: I1012 06:00:05.007469 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8d94394-2655-48d0-a53d-7ac1a17f3f71-scripts\") pod \"ceilometer-0\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " pod="openstack/ceilometer-0" Oct 12 06:00:05 crc kubenswrapper[4930]: I1012 06:00:05.007557 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8d94394-2655-48d0-a53d-7ac1a17f3f71-log-httpd\") pod \"ceilometer-0\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " pod="openstack/ceilometer-0" Oct 12 06:00:05 crc kubenswrapper[4930]: I1012 06:00:05.007644 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8d94394-2655-48d0-a53d-7ac1a17f3f71-run-httpd\") pod \"ceilometer-0\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " pod="openstack/ceilometer-0" Oct 12 06:00:05 crc kubenswrapper[4930]: I1012 06:00:05.008072 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8d94394-2655-48d0-a53d-7ac1a17f3f71-run-httpd\") pod \"ceilometer-0\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " pod="openstack/ceilometer-0" Oct 12 06:00:05 crc kubenswrapper[4930]: I1012 06:00:05.008343 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8d94394-2655-48d0-a53d-7ac1a17f3f71-log-httpd\") pod \"ceilometer-0\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " pod="openstack/ceilometer-0" Oct 12 06:00:05 crc kubenswrapper[4930]: I1012 06:00:05.011920 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8d94394-2655-48d0-a53d-7ac1a17f3f71-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " pod="openstack/ceilometer-0" Oct 12 06:00:05 crc kubenswrapper[4930]: I1012 06:00:05.012224 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d94394-2655-48d0-a53d-7ac1a17f3f71-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " pod="openstack/ceilometer-0" Oct 12 
06:00:05 crc kubenswrapper[4930]: I1012 06:00:05.012530 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8d94394-2655-48d0-a53d-7ac1a17f3f71-scripts\") pod \"ceilometer-0\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " pod="openstack/ceilometer-0" Oct 12 06:00:05 crc kubenswrapper[4930]: I1012 06:00:05.012710 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8d94394-2655-48d0-a53d-7ac1a17f3f71-config-data\") pod \"ceilometer-0\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " pod="openstack/ceilometer-0" Oct 12 06:00:05 crc kubenswrapper[4930]: I1012 06:00:05.034162 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbd9q\" (UniqueName: \"kubernetes.io/projected/c8d94394-2655-48d0-a53d-7ac1a17f3f71-kube-api-access-xbd9q\") pod \"ceilometer-0\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " pod="openstack/ceilometer-0" Oct 12 06:00:05 crc kubenswrapper[4930]: I1012 06:00:05.175501 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 06:00:05 crc kubenswrapper[4930]: I1012 06:00:05.433828 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 12 06:00:05 crc kubenswrapper[4930]: I1012 06:00:05.632673 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:00:05 crc kubenswrapper[4930]: I1012 06:00:05.805889 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8d94394-2655-48d0-a53d-7ac1a17f3f71","Type":"ContainerStarted","Data":"dd6a1c9058dbdbf663a9416bda59b6d43b32fe0f4c5a914b93e8eec9b51be358"} Oct 12 06:00:06 crc kubenswrapper[4930]: I1012 06:00:06.146388 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2b3aab6-b6ca-44a6-92e9-50e80109fb0c" path="/var/lib/kubelet/pods/d2b3aab6-b6ca-44a6-92e9-50e80109fb0c/volumes" Oct 12 06:00:06 crc kubenswrapper[4930]: I1012 06:00:06.830402 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8d94394-2655-48d0-a53d-7ac1a17f3f71","Type":"ContainerStarted","Data":"e6e2e0a8b4334a83a934369feb105035aaace1cde6e03086b7d3d5f9cba3218f"} Oct 12 06:00:06 crc kubenswrapper[4930]: I1012 06:00:06.830469 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8d94394-2655-48d0-a53d-7ac1a17f3f71","Type":"ContainerStarted","Data":"056be5c75ad37ad8721e618a57e64ced13e4a1753bea43097fbe1396a55e79af"} Oct 12 06:00:07 crc kubenswrapper[4930]: I1012 06:00:07.849778 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8d94394-2655-48d0-a53d-7ac1a17f3f71","Type":"ContainerStarted","Data":"2ae1a90b7dda3dde9f5d2aa343d68b941c1348c68f49952d974eef4bcb547c74"} Oct 12 06:00:09 crc kubenswrapper[4930]: I1012 06:00:09.460038 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:00:10 crc kubenswrapper[4930]: I1012 06:00:10.080493 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 06:00:10 crc kubenswrapper[4930]: I1012 06:00:10.093945 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7d8c9db847-bqfrb" Oct 12 06:00:11 crc kubenswrapper[4930]: I1012 06:00:11.189558 4930 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6778cd8bb8-9zhz5" podUID="30049dfb-04f2-455c-a949-08bd6ff892d0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.161:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.161:8443: connect: connection refused" Oct 12 06:00:12 crc kubenswrapper[4930]: I1012 06:00:12.970869 4930 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 06:00:13 crc kubenswrapper[4930]: I1012 06:00:13.921834 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"1975f875-9e09-4d30-b5d4-2e883f13781b","Type":"ContainerStarted","Data":"f08119808f45115d2595a93221d2b9e9caf0762aab760349209eb159461f996b"} Oct 12 06:00:13 crc kubenswrapper[4930]: I1012 06:00:13.941897 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.2461755549999998 podStartE2EDuration="14.941880886s" podCreationTimestamp="2025-10-12 05:59:59 +0000 UTC" firstStartedPulling="2025-10-12 06:00:00.356793801 +0000 UTC m=+1132.898895566" lastFinishedPulling="2025-10-12 06:00:13.052499132 +0000 UTC m=+1145.594600897" observedRunningTime="2025-10-12 06:00:13.938336747 +0000 UTC m=+1146.480438512" watchObservedRunningTime="2025-10-12 06:00:13.941880886 +0000 UTC m=+1146.483982651" Oct 12 06:00:14 crc kubenswrapper[4930]: I1012 06:00:14.934469 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8d94394-2655-48d0-a53d-7ac1a17f3f71","Type":"ContainerStarted","Data":"f83603ab540d1e650f253e24b79062a86b5de5745aba40f5e015d72a8c3f893c"} Oct 12 06:00:14 crc kubenswrapper[4930]: I1012 06:00:14.934654 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8d94394-2655-48d0-a53d-7ac1a17f3f71" containerName="ceilometer-central-agent" containerID="cri-o://056be5c75ad37ad8721e618a57e64ced13e4a1753bea43097fbe1396a55e79af" gracePeriod=30 Oct 12 06:00:14 crc kubenswrapper[4930]: I1012 06:00:14.934694 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8d94394-2655-48d0-a53d-7ac1a17f3f71" containerName="sg-core" containerID="cri-o://2ae1a90b7dda3dde9f5d2aa343d68b941c1348c68f49952d974eef4bcb547c74" gracePeriod=30 Oct 12 06:00:14 crc kubenswrapper[4930]: I1012 06:00:14.934709 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8d94394-2655-48d0-a53d-7ac1a17f3f71" containerName="ceilometer-notification-agent" containerID="cri-o://e6e2e0a8b4334a83a934369feb105035aaace1cde6e03086b7d3d5f9cba3218f" gracePeriod=30 Oct 12 06:00:14 crc kubenswrapper[4930]: I1012 06:00:14.934763 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8d94394-2655-48d0-a53d-7ac1a17f3f71" containerName="proxy-httpd" containerID="cri-o://f83603ab540d1e650f253e24b79062a86b5de5745aba40f5e015d72a8c3f893c" gracePeriod=30 Oct 12 06:00:14 crc kubenswrapper[4930]: I1012 06:00:14.935049 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 12 06:00:14 crc kubenswrapper[4930]: I1012 06:00:14.975927 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.673749626 podStartE2EDuration="10.97590474s" podCreationTimestamp="2025-10-12 06:00:04 +0000 UTC" 
firstStartedPulling="2025-10-12 06:00:05.635875026 +0000 UTC m=+1138.177976791" lastFinishedPulling="2025-10-12 06:00:13.93803012 +0000 UTC m=+1146.480131905" observedRunningTime="2025-10-12 06:00:14.961499651 +0000 UTC m=+1147.503601456" watchObservedRunningTime="2025-10-12 06:00:14.97590474 +0000 UTC m=+1147.518006525" Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.135837 4930 scope.go:117] "RemoveContainer" containerID="08f49b3d51ac716588f5237987a07b5659675e827f1272074de03ba6918727cd" Oct 12 06:00:15 crc kubenswrapper[4930]: E1012 06:00:15.136300 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(59f4bae4-1a84-449a-be72-e735294116e6)\"" pod="openstack/watcher-decision-engine-0" podUID="59f4bae4-1a84-449a-be72-e735294116e6" Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.868106 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.930682 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30049dfb-04f2-455c-a949-08bd6ff892d0-logs\") pod \"30049dfb-04f2-455c-a949-08bd6ff892d0\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.930829 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30049dfb-04f2-455c-a949-08bd6ff892d0-combined-ca-bundle\") pod \"30049dfb-04f2-455c-a949-08bd6ff892d0\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.930862 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/30049dfb-04f2-455c-a949-08bd6ff892d0-horizon-secret-key\") pod \"30049dfb-04f2-455c-a949-08bd6ff892d0\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.930940 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30049dfb-04f2-455c-a949-08bd6ff892d0-scripts\") pod \"30049dfb-04f2-455c-a949-08bd6ff892d0\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.931123 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30049dfb-04f2-455c-a949-08bd6ff892d0-logs" (OuterVolumeSpecName: "logs") pod "30049dfb-04f2-455c-a949-08bd6ff892d0" (UID: "30049dfb-04f2-455c-a949-08bd6ff892d0"). InnerVolumeSpecName "logs". 
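
The "Observed pod startup duration" entries above decompose cleanly: podStartE2EDuration = watchObservedRunningTime − podCreationTimestamp, and podStartSLOduration = podStartE2EDuration − (lastFinishedPulling − firstStartedPulling), i.e. startup latency with image-pull time excluded. For openstackclient: 14.941880886s − (13.052499132 − 0.356793801)s = 2.246175555s, exactly the logged value; ceilometer-0 works out the same way (10.97590474s − 8.302155094s ≈ 2.6737s, matching to within rounding). A short Go check of the openstackclient arithmetic using the timestamps copied from the log:

    package main

    import (
        "fmt"
        "time"
    )

    // Verifies: SLO duration = end-to-end duration minus the image-pull window.
    func main() {
        parse := func(s string) time.Time {
            t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
            if err != nil {
                panic(err)
            }
            return t
        }
        created := parse("2025-10-12 05:59:59 +0000 UTC")
        firstPull := parse("2025-10-12 06:00:00.356793801 +0000 UTC")
        lastPull := parse("2025-10-12 06:00:13.052499132 +0000 UTC")
        running := parse("2025-10-12 06:00:13.941880886 +0000 UTC")

        e2e := running.Sub(created)          // 14.941880886s = podStartE2EDuration
        slo := e2e - lastPull.Sub(firstPull) // 2.246175555s  = podStartSLOduration
        fmt.Println(e2e, slo)
    }
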
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.931632 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/30049dfb-04f2-455c-a949-08bd6ff892d0-horizon-tls-certs\") pod \"30049dfb-04f2-455c-a949-08bd6ff892d0\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.931690 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fqrn\" (UniqueName: \"kubernetes.io/projected/30049dfb-04f2-455c-a949-08bd6ff892d0-kube-api-access-2fqrn\") pod \"30049dfb-04f2-455c-a949-08bd6ff892d0\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.931718 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30049dfb-04f2-455c-a949-08bd6ff892d0-config-data\") pod \"30049dfb-04f2-455c-a949-08bd6ff892d0\" (UID: \"30049dfb-04f2-455c-a949-08bd6ff892d0\") " Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.932219 4930 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30049dfb-04f2-455c-a949-08bd6ff892d0-logs\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.936873 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30049dfb-04f2-455c-a949-08bd6ff892d0-kube-api-access-2fqrn" (OuterVolumeSpecName: "kube-api-access-2fqrn") pod "30049dfb-04f2-455c-a949-08bd6ff892d0" (UID: "30049dfb-04f2-455c-a949-08bd6ff892d0"). InnerVolumeSpecName "kube-api-access-2fqrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.943020 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30049dfb-04f2-455c-a949-08bd6ff892d0-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "30049dfb-04f2-455c-a949-08bd6ff892d0" (UID: "30049dfb-04f2-455c-a949-08bd6ff892d0"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.956228 4930 generic.go:334] "Generic (PLEG): container finished" podID="c8d94394-2655-48d0-a53d-7ac1a17f3f71" containerID="f83603ab540d1e650f253e24b79062a86b5de5745aba40f5e015d72a8c3f893c" exitCode=0 Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.956261 4930 generic.go:334] "Generic (PLEG): container finished" podID="c8d94394-2655-48d0-a53d-7ac1a17f3f71" containerID="2ae1a90b7dda3dde9f5d2aa343d68b941c1348c68f49952d974eef4bcb547c74" exitCode=2 Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.956270 4930 generic.go:334] "Generic (PLEG): container finished" podID="c8d94394-2655-48d0-a53d-7ac1a17f3f71" containerID="056be5c75ad37ad8721e618a57e64ced13e4a1753bea43097fbe1396a55e79af" exitCode=0 Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.956278 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8d94394-2655-48d0-a53d-7ac1a17f3f71","Type":"ContainerDied","Data":"f83603ab540d1e650f253e24b79062a86b5de5745aba40f5e015d72a8c3f893c"} Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.956328 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8d94394-2655-48d0-a53d-7ac1a17f3f71","Type":"ContainerDied","Data":"2ae1a90b7dda3dde9f5d2aa343d68b941c1348c68f49952d974eef4bcb547c74"} Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.956341 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8d94394-2655-48d0-a53d-7ac1a17f3f71","Type":"ContainerDied","Data":"056be5c75ad37ad8721e618a57e64ced13e4a1753bea43097fbe1396a55e79af"} Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.960592 4930 generic.go:334] "Generic (PLEG): container finished" podID="30049dfb-04f2-455c-a949-08bd6ff892d0" containerID="ac8297145e46be91604457a14764e05292e58540ab5b16ca150f8ac1bf5273b1" exitCode=137 Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.960623 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6778cd8bb8-9zhz5" event={"ID":"30049dfb-04f2-455c-a949-08bd6ff892d0","Type":"ContainerDied","Data":"ac8297145e46be91604457a14764e05292e58540ab5b16ca150f8ac1bf5273b1"} Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.960652 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6778cd8bb8-9zhz5" Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.960664 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6778cd8bb8-9zhz5" event={"ID":"30049dfb-04f2-455c-a949-08bd6ff892d0","Type":"ContainerDied","Data":"aee8af9e74229c0d66d27ff2d19312dda66138fb638443ac3e6fee244267604a"} Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.960689 4930 scope.go:117] "RemoveContainer" containerID="61889e483f0c5377a242095541e1e60b08683d4d632d130c12343e4cefb25513" Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.962444 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30049dfb-04f2-455c-a949-08bd6ff892d0-config-data" (OuterVolumeSpecName: "config-data") pod "30049dfb-04f2-455c-a949-08bd6ff892d0" (UID: "30049dfb-04f2-455c-a949-08bd6ff892d0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.970969 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30049dfb-04f2-455c-a949-08bd6ff892d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30049dfb-04f2-455c-a949-08bd6ff892d0" (UID: "30049dfb-04f2-455c-a949-08bd6ff892d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:15 crc kubenswrapper[4930]: I1012 06:00:15.972543 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30049dfb-04f2-455c-a949-08bd6ff892d0-scripts" (OuterVolumeSpecName: "scripts") pod "30049dfb-04f2-455c-a949-08bd6ff892d0" (UID: "30049dfb-04f2-455c-a949-08bd6ff892d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.016420 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30049dfb-04f2-455c-a949-08bd6ff892d0-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "30049dfb-04f2-455c-a949-08bd6ff892d0" (UID: "30049dfb-04f2-455c-a949-08bd6ff892d0"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.033598 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30049dfb-04f2-455c-a949-08bd6ff892d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.033631 4930 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/30049dfb-04f2-455c-a949-08bd6ff892d0-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.033645 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30049dfb-04f2-455c-a949-08bd6ff892d0-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.033657 4930 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/30049dfb-04f2-455c-a949-08bd6ff892d0-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.033668 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30049dfb-04f2-455c-a949-08bd6ff892d0-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.033681 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fqrn\" (UniqueName: \"kubernetes.io/projected/30049dfb-04f2-455c-a949-08bd6ff892d0-kube-api-access-2fqrn\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.166060 4930 scope.go:117] "RemoveContainer" containerID="ac8297145e46be91604457a14764e05292e58540ab5b16ca150f8ac1bf5273b1" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.214692 4930 scope.go:117] "RemoveContainer" containerID="61889e483f0c5377a242095541e1e60b08683d4d632d130c12343e4cefb25513" Oct 12 06:00:16 crc kubenswrapper[4930]: E1012 06:00:16.215748 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"61889e483f0c5377a242095541e1e60b08683d4d632d130c12343e4cefb25513\": container with ID starting with 61889e483f0c5377a242095541e1e60b08683d4d632d130c12343e4cefb25513 not found: ID does not exist" containerID="61889e483f0c5377a242095541e1e60b08683d4d632d130c12343e4cefb25513" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.215804 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61889e483f0c5377a242095541e1e60b08683d4d632d130c12343e4cefb25513"} err="failed to get container status \"61889e483f0c5377a242095541e1e60b08683d4d632d130c12343e4cefb25513\": rpc error: code = NotFound desc = could not find container \"61889e483f0c5377a242095541e1e60b08683d4d632d130c12343e4cefb25513\": container with ID starting with 61889e483f0c5377a242095541e1e60b08683d4d632d130c12343e4cefb25513 not found: ID does not exist" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.215825 4930 scope.go:117] "RemoveContainer" containerID="ac8297145e46be91604457a14764e05292e58540ab5b16ca150f8ac1bf5273b1" Oct 12 06:00:16 crc kubenswrapper[4930]: E1012 06:00:16.216249 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac8297145e46be91604457a14764e05292e58540ab5b16ca150f8ac1bf5273b1\": container with ID starting with ac8297145e46be91604457a14764e05292e58540ab5b16ca150f8ac1bf5273b1 not found: ID does not exist" containerID="ac8297145e46be91604457a14764e05292e58540ab5b16ca150f8ac1bf5273b1" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.216273 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac8297145e46be91604457a14764e05292e58540ab5b16ca150f8ac1bf5273b1"} err="failed to get container status \"ac8297145e46be91604457a14764e05292e58540ab5b16ca150f8ac1bf5273b1\": rpc error: code = NotFound desc = could not find container \"ac8297145e46be91604457a14764e05292e58540ab5b16ca150f8ac1bf5273b1\": container with ID starting with ac8297145e46be91604457a14764e05292e58540ab5b16ca150f8ac1bf5273b1 not found: ID does not exist" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.283294 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6778cd8bb8-9zhz5"] Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.288805 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6778cd8bb8-9zhz5"] Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.643840 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.749064 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-config-data\") pod \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.749246 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6gdv\" (UniqueName: \"kubernetes.io/projected/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-kube-api-access-t6gdv\") pod \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.749313 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-scripts\") pod \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.749335 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-config-data-custom\") pod \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.749391 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-logs\") pod \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.749411 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-etc-machine-id\") pod \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.749447 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-combined-ca-bundle\") pod \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\" (UID: \"9354ea26-d24c-49ef-b564-b58a6f3e4f3f\") " Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.749552 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9354ea26-d24c-49ef-b564-b58a6f3e4f3f" (UID: "9354ea26-d24c-49ef-b564-b58a6f3e4f3f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.749967 4930 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.750086 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-logs" (OuterVolumeSpecName: "logs") pod "9354ea26-d24c-49ef-b564-b58a6f3e4f3f" (UID: "9354ea26-d24c-49ef-b564-b58a6f3e4f3f"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.754051 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9354ea26-d24c-49ef-b564-b58a6f3e4f3f" (UID: "9354ea26-d24c-49ef-b564-b58a6f3e4f3f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.757839 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-scripts" (OuterVolumeSpecName: "scripts") pod "9354ea26-d24c-49ef-b564-b58a6f3e4f3f" (UID: "9354ea26-d24c-49ef-b564-b58a6f3e4f3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.761922 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-kube-api-access-t6gdv" (OuterVolumeSpecName: "kube-api-access-t6gdv") pod "9354ea26-d24c-49ef-b564-b58a6f3e4f3f" (UID: "9354ea26-d24c-49ef-b564-b58a6f3e4f3f"). InnerVolumeSpecName "kube-api-access-t6gdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.786080 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9354ea26-d24c-49ef-b564-b58a6f3e4f3f" (UID: "9354ea26-d24c-49ef-b564-b58a6f3e4f3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.821251 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-config-data" (OuterVolumeSpecName: "config-data") pod "9354ea26-d24c-49ef-b564-b58a6f3e4f3f" (UID: "9354ea26-d24c-49ef-b564-b58a6f3e4f3f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.851849 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6gdv\" (UniqueName: \"kubernetes.io/projected/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-kube-api-access-t6gdv\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.851884 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.851895 4930 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.851906 4930 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-logs\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.851916 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.851925 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9354ea26-d24c-49ef-b564-b58a6f3e4f3f-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.971155 4930 generic.go:334] "Generic (PLEG): container finished" podID="9354ea26-d24c-49ef-b564-b58a6f3e4f3f" containerID="28bbb5fa395f28f4e7ae80c42e89be51dda90d3198143558c12ff31e7215c216" exitCode=137 Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.971213 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.971257 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9354ea26-d24c-49ef-b564-b58a6f3e4f3f","Type":"ContainerDied","Data":"28bbb5fa395f28f4e7ae80c42e89be51dda90d3198143558c12ff31e7215c216"} Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.971315 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9354ea26-d24c-49ef-b564-b58a6f3e4f3f","Type":"ContainerDied","Data":"83268b87516f300737ac7ca3b2e22a11166c24f847642b838202ca242ebb05b1"} Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.971342 4930 scope.go:117] "RemoveContainer" containerID="28bbb5fa395f28f4e7ae80c42e89be51dda90d3198143558c12ff31e7215c216" Oct 12 06:00:16 crc kubenswrapper[4930]: I1012 06:00:16.992054 4930 scope.go:117] "RemoveContainer" containerID="ed8bf4918693de7f7434f05fe2be6d14f443570721d49aecde6d90044bbea9b7" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.011505 4930 scope.go:117] "RemoveContainer" containerID="28bbb5fa395f28f4e7ae80c42e89be51dda90d3198143558c12ff31e7215c216" Oct 12 06:00:17 crc kubenswrapper[4930]: E1012 06:00:17.011981 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28bbb5fa395f28f4e7ae80c42e89be51dda90d3198143558c12ff31e7215c216\": container with ID starting with 28bbb5fa395f28f4e7ae80c42e89be51dda90d3198143558c12ff31e7215c216 not found: ID does not exist" containerID="28bbb5fa395f28f4e7ae80c42e89be51dda90d3198143558c12ff31e7215c216" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.012028 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28bbb5fa395f28f4e7ae80c42e89be51dda90d3198143558c12ff31e7215c216"} err="failed to get container status \"28bbb5fa395f28f4e7ae80c42e89be51dda90d3198143558c12ff31e7215c216\": rpc error: code = NotFound desc = could not find container \"28bbb5fa395f28f4e7ae80c42e89be51dda90d3198143558c12ff31e7215c216\": container with ID starting with 28bbb5fa395f28f4e7ae80c42e89be51dda90d3198143558c12ff31e7215c216 not found: ID does not exist" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.012060 4930 scope.go:117] "RemoveContainer" containerID="ed8bf4918693de7f7434f05fe2be6d14f443570721d49aecde6d90044bbea9b7" Oct 12 06:00:17 crc kubenswrapper[4930]: E1012 06:00:17.012338 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed8bf4918693de7f7434f05fe2be6d14f443570721d49aecde6d90044bbea9b7\": container with ID starting with ed8bf4918693de7f7434f05fe2be6d14f443570721d49aecde6d90044bbea9b7 not found: ID does not exist" containerID="ed8bf4918693de7f7434f05fe2be6d14f443570721d49aecde6d90044bbea9b7" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.012378 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed8bf4918693de7f7434f05fe2be6d14f443570721d49aecde6d90044bbea9b7"} err="failed to get container status \"ed8bf4918693de7f7434f05fe2be6d14f443570721d49aecde6d90044bbea9b7\": rpc error: code = NotFound desc = could not find container \"ed8bf4918693de7f7434f05fe2be6d14f443570721d49aecde6d90044bbea9b7\": container with ID starting with ed8bf4918693de7f7434f05fe2be6d14f443570721d49aecde6d90044bbea9b7 not found: ID does not exist" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.038092 4930 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.048519 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.060789 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 12 06:00:17 crc kubenswrapper[4930]: E1012 06:00:17.061284 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30049dfb-04f2-455c-a949-08bd6ff892d0" containerName="horizon" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.061304 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="30049dfb-04f2-455c-a949-08bd6ff892d0" containerName="horizon" Oct 12 06:00:17 crc kubenswrapper[4930]: E1012 06:00:17.061355 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9354ea26-d24c-49ef-b564-b58a6f3e4f3f" containerName="cinder-api" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.061365 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="9354ea26-d24c-49ef-b564-b58a6f3e4f3f" containerName="cinder-api" Oct 12 06:00:17 crc kubenswrapper[4930]: E1012 06:00:17.061380 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30049dfb-04f2-455c-a949-08bd6ff892d0" containerName="horizon-log" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.061388 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="30049dfb-04f2-455c-a949-08bd6ff892d0" containerName="horizon-log" Oct 12 06:00:17 crc kubenswrapper[4930]: E1012 06:00:17.061406 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9354ea26-d24c-49ef-b564-b58a6f3e4f3f" containerName="cinder-api-log" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.061415 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="9354ea26-d24c-49ef-b564-b58a6f3e4f3f" containerName="cinder-api-log" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.061642 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="9354ea26-d24c-49ef-b564-b58a6f3e4f3f" containerName="cinder-api-log" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.061670 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="30049dfb-04f2-455c-a949-08bd6ff892d0" containerName="horizon" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.061689 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="30049dfb-04f2-455c-a949-08bd6ff892d0" containerName="horizon-log" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.061702 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="9354ea26-d24c-49ef-b564-b58a6f3e4f3f" containerName="cinder-api" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.063058 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.065956 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.066010 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.065975 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.072800 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.157179 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eceafd59-b491-4468-b6b6-78fe1c689e6b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.157237 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eceafd59-b491-4468-b6b6-78fe1c689e6b-scripts\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.157322 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eceafd59-b491-4468-b6b6-78fe1c689e6b-logs\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.157343 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eceafd59-b491-4468-b6b6-78fe1c689e6b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.157358 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eceafd59-b491-4468-b6b6-78fe1c689e6b-config-data\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.157378 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eceafd59-b491-4468-b6b6-78fe1c689e6b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.157395 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eceafd59-b491-4468-b6b6-78fe1c689e6b-config-data-custom\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.157470 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45v6w\" (UniqueName: 
\"kubernetes.io/projected/eceafd59-b491-4468-b6b6-78fe1c689e6b-kube-api-access-45v6w\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.157507 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eceafd59-b491-4468-b6b6-78fe1c689e6b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.259121 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45v6w\" (UniqueName: \"kubernetes.io/projected/eceafd59-b491-4468-b6b6-78fe1c689e6b-kube-api-access-45v6w\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.259203 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eceafd59-b491-4468-b6b6-78fe1c689e6b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.259364 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eceafd59-b491-4468-b6b6-78fe1c689e6b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.259518 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eceafd59-b491-4468-b6b6-78fe1c689e6b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.260195 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eceafd59-b491-4468-b6b6-78fe1c689e6b-scripts\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.261017 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eceafd59-b491-4468-b6b6-78fe1c689e6b-logs\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.261056 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eceafd59-b491-4468-b6b6-78fe1c689e6b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.261083 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eceafd59-b491-4468-b6b6-78fe1c689e6b-config-data\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.261170 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eceafd59-b491-4468-b6b6-78fe1c689e6b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.261219 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eceafd59-b491-4468-b6b6-78fe1c689e6b-config-data-custom\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.261653 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eceafd59-b491-4468-b6b6-78fe1c689e6b-logs\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.264356 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eceafd59-b491-4468-b6b6-78fe1c689e6b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.265351 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eceafd59-b491-4468-b6b6-78fe1c689e6b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.266679 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eceafd59-b491-4468-b6b6-78fe1c689e6b-config-data\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.268539 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eceafd59-b491-4468-b6b6-78fe1c689e6b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.269137 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eceafd59-b491-4468-b6b6-78fe1c689e6b-scripts\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.278408 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eceafd59-b491-4468-b6b6-78fe1c689e6b-config-data-custom\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.287544 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45v6w\" (UniqueName: \"kubernetes.io/projected/eceafd59-b491-4468-b6b6-78fe1c689e6b-kube-api-access-45v6w\") pod \"cinder-api-0\" (UID: \"eceafd59-b491-4468-b6b6-78fe1c689e6b\") " pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.392751 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.811407 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.884926 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8d94394-2655-48d0-a53d-7ac1a17f3f71-run-httpd\") pod \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.885097 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8d94394-2655-48d0-a53d-7ac1a17f3f71-config-data\") pod \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.885180 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d94394-2655-48d0-a53d-7ac1a17f3f71-combined-ca-bundle\") pod \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.885210 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbd9q\" (UniqueName: \"kubernetes.io/projected/c8d94394-2655-48d0-a53d-7ac1a17f3f71-kube-api-access-xbd9q\") pod \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.885491 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8d94394-2655-48d0-a53d-7ac1a17f3f71-sg-core-conf-yaml\") pod \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.885517 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8d94394-2655-48d0-a53d-7ac1a17f3f71-scripts\") pod \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.885555 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8d94394-2655-48d0-a53d-7ac1a17f3f71-log-httpd\") pod \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\" (UID: \"c8d94394-2655-48d0-a53d-7ac1a17f3f71\") " Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.885715 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8d94394-2655-48d0-a53d-7ac1a17f3f71-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c8d94394-2655-48d0-a53d-7ac1a17f3f71" (UID: "c8d94394-2655-48d0-a53d-7ac1a17f3f71"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.886293 4930 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8d94394-2655-48d0-a53d-7ac1a17f3f71-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.887269 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8d94394-2655-48d0-a53d-7ac1a17f3f71-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c8d94394-2655-48d0-a53d-7ac1a17f3f71" (UID: "c8d94394-2655-48d0-a53d-7ac1a17f3f71"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.891718 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8d94394-2655-48d0-a53d-7ac1a17f3f71-kube-api-access-xbd9q" (OuterVolumeSpecName: "kube-api-access-xbd9q") pod "c8d94394-2655-48d0-a53d-7ac1a17f3f71" (UID: "c8d94394-2655-48d0-a53d-7ac1a17f3f71"). InnerVolumeSpecName "kube-api-access-xbd9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.896830 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8d94394-2655-48d0-a53d-7ac1a17f3f71-scripts" (OuterVolumeSpecName: "scripts") pod "c8d94394-2655-48d0-a53d-7ac1a17f3f71" (UID: "c8d94394-2655-48d0-a53d-7ac1a17f3f71"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.925770 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8d94394-2655-48d0-a53d-7ac1a17f3f71-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c8d94394-2655-48d0-a53d-7ac1a17f3f71" (UID: "c8d94394-2655-48d0-a53d-7ac1a17f3f71"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.984010 4930 generic.go:334] "Generic (PLEG): container finished" podID="c8d94394-2655-48d0-a53d-7ac1a17f3f71" containerID="e6e2e0a8b4334a83a934369feb105035aaace1cde6e03086b7d3d5f9cba3218f" exitCode=0 Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.984082 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8d94394-2655-48d0-a53d-7ac1a17f3f71","Type":"ContainerDied","Data":"e6e2e0a8b4334a83a934369feb105035aaace1cde6e03086b7d3d5f9cba3218f"} Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.984108 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8d94394-2655-48d0-a53d-7ac1a17f3f71","Type":"ContainerDied","Data":"dd6a1c9058dbdbf663a9416bda59b6d43b32fe0f4c5a914b93e8eec9b51be358"} Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.984124 4930 scope.go:117] "RemoveContainer" containerID="f83603ab540d1e650f253e24b79062a86b5de5745aba40f5e015d72a8c3f893c" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.984267 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.987254 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbd9q\" (UniqueName: \"kubernetes.io/projected/c8d94394-2655-48d0-a53d-7ac1a17f3f71-kube-api-access-xbd9q\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.987276 4930 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8d94394-2655-48d0-a53d-7ac1a17f3f71-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.987285 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8d94394-2655-48d0-a53d-7ac1a17f3f71-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.987295 4930 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8d94394-2655-48d0-a53d-7ac1a17f3f71-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:17 crc kubenswrapper[4930]: I1012 06:00:17.987852 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8d94394-2655-48d0-a53d-7ac1a17f3f71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8d94394-2655-48d0-a53d-7ac1a17f3f71" (UID: "c8d94394-2655-48d0-a53d-7ac1a17f3f71"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.002011 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8d94394-2655-48d0-a53d-7ac1a17f3f71-config-data" (OuterVolumeSpecName: "config-data") pod "c8d94394-2655-48d0-a53d-7ac1a17f3f71" (UID: "c8d94394-2655-48d0-a53d-7ac1a17f3f71"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.007873 4930 scope.go:117] "RemoveContainer" containerID="2ae1a90b7dda3dde9f5d2aa343d68b941c1348c68f49952d974eef4bcb547c74" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.028080 4930 scope.go:117] "RemoveContainer" containerID="e6e2e0a8b4334a83a934369feb105035aaace1cde6e03086b7d3d5f9cba3218f" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.028270 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 12 06:00:18 crc kubenswrapper[4930]: W1012 06:00:18.033759 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeceafd59_b491_4468_b6b6_78fe1c689e6b.slice/crio-9b67820f6a3e34704f41838418749aab11c43dc9f1f2c4472ac2ee934f173b72 WatchSource:0}: Error finding container 9b67820f6a3e34704f41838418749aab11c43dc9f1f2c4472ac2ee934f173b72: Status 404 returned error can't find the container with id 9b67820f6a3e34704f41838418749aab11c43dc9f1f2c4472ac2ee934f173b72 Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.055992 4930 scope.go:117] "RemoveContainer" containerID="056be5c75ad37ad8721e618a57e64ced13e4a1753bea43097fbe1396a55e79af" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.087319 4930 scope.go:117] "RemoveContainer" containerID="f83603ab540d1e650f253e24b79062a86b5de5745aba40f5e015d72a8c3f893c" Oct 12 06:00:18 crc kubenswrapper[4930]: E1012 06:00:18.087802 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f83603ab540d1e650f253e24b79062a86b5de5745aba40f5e015d72a8c3f893c\": container with ID starting with f83603ab540d1e650f253e24b79062a86b5de5745aba40f5e015d72a8c3f893c not found: ID does not exist" containerID="f83603ab540d1e650f253e24b79062a86b5de5745aba40f5e015d72a8c3f893c" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.087839 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83603ab540d1e650f253e24b79062a86b5de5745aba40f5e015d72a8c3f893c"} err="failed to get container status \"f83603ab540d1e650f253e24b79062a86b5de5745aba40f5e015d72a8c3f893c\": rpc error: code = NotFound desc = could not find container \"f83603ab540d1e650f253e24b79062a86b5de5745aba40f5e015d72a8c3f893c\": container with ID starting with f83603ab540d1e650f253e24b79062a86b5de5745aba40f5e015d72a8c3f893c not found: ID does not exist" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.087863 4930 scope.go:117] "RemoveContainer" containerID="2ae1a90b7dda3dde9f5d2aa343d68b941c1348c68f49952d974eef4bcb547c74" Oct 12 06:00:18 crc kubenswrapper[4930]: E1012 06:00:18.088282 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ae1a90b7dda3dde9f5d2aa343d68b941c1348c68f49952d974eef4bcb547c74\": container with ID starting with 2ae1a90b7dda3dde9f5d2aa343d68b941c1348c68f49952d974eef4bcb547c74 not found: ID does not exist" containerID="2ae1a90b7dda3dde9f5d2aa343d68b941c1348c68f49952d974eef4bcb547c74" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.088313 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ae1a90b7dda3dde9f5d2aa343d68b941c1348c68f49952d974eef4bcb547c74"} err="failed to get container status \"2ae1a90b7dda3dde9f5d2aa343d68b941c1348c68f49952d974eef4bcb547c74\": rpc error: code = NotFound desc = could not find 
container \"2ae1a90b7dda3dde9f5d2aa343d68b941c1348c68f49952d974eef4bcb547c74\": container with ID starting with 2ae1a90b7dda3dde9f5d2aa343d68b941c1348c68f49952d974eef4bcb547c74 not found: ID does not exist" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.088342 4930 scope.go:117] "RemoveContainer" containerID="e6e2e0a8b4334a83a934369feb105035aaace1cde6e03086b7d3d5f9cba3218f" Oct 12 06:00:18 crc kubenswrapper[4930]: E1012 06:00:18.088649 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6e2e0a8b4334a83a934369feb105035aaace1cde6e03086b7d3d5f9cba3218f\": container with ID starting with e6e2e0a8b4334a83a934369feb105035aaace1cde6e03086b7d3d5f9cba3218f not found: ID does not exist" containerID="e6e2e0a8b4334a83a934369feb105035aaace1cde6e03086b7d3d5f9cba3218f" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.088688 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6e2e0a8b4334a83a934369feb105035aaace1cde6e03086b7d3d5f9cba3218f"} err="failed to get container status \"e6e2e0a8b4334a83a934369feb105035aaace1cde6e03086b7d3d5f9cba3218f\": rpc error: code = NotFound desc = could not find container \"e6e2e0a8b4334a83a934369feb105035aaace1cde6e03086b7d3d5f9cba3218f\": container with ID starting with e6e2e0a8b4334a83a934369feb105035aaace1cde6e03086b7d3d5f9cba3218f not found: ID does not exist" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.088717 4930 scope.go:117] "RemoveContainer" containerID="056be5c75ad37ad8721e618a57e64ced13e4a1753bea43097fbe1396a55e79af" Oct 12 06:00:18 crc kubenswrapper[4930]: E1012 06:00:18.089076 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"056be5c75ad37ad8721e618a57e64ced13e4a1753bea43097fbe1396a55e79af\": container with ID starting with 056be5c75ad37ad8721e618a57e64ced13e4a1753bea43097fbe1396a55e79af not found: ID does not exist" containerID="056be5c75ad37ad8721e618a57e64ced13e4a1753bea43097fbe1396a55e79af" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.089105 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"056be5c75ad37ad8721e618a57e64ced13e4a1753bea43097fbe1396a55e79af"} err="failed to get container status \"056be5c75ad37ad8721e618a57e64ced13e4a1753bea43097fbe1396a55e79af\": rpc error: code = NotFound desc = could not find container \"056be5c75ad37ad8721e618a57e64ced13e4a1753bea43097fbe1396a55e79af\": container with ID starting with 056be5c75ad37ad8721e618a57e64ced13e4a1753bea43097fbe1396a55e79af not found: ID does not exist" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.089223 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8d94394-2655-48d0-a53d-7ac1a17f3f71-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.089245 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d94394-2655-48d0-a53d-7ac1a17f3f71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.181541 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30049dfb-04f2-455c-a949-08bd6ff892d0" path="/var/lib/kubelet/pods/30049dfb-04f2-455c-a949-08bd6ff892d0/volumes" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.187709 4930 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="9354ea26-d24c-49ef-b564-b58a6f3e4f3f" path="/var/lib/kubelet/pods/9354ea26-d24c-49ef-b564-b58a6f3e4f3f/volumes" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.309550 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:00:18 crc kubenswrapper[4930]: E1012 06:00:18.319372 4930 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8d94394_2655_48d0_a53d_7ac1a17f3f71.slice/crio-dd6a1c9058dbdbf663a9416bda59b6d43b32fe0f4c5a914b93e8eec9b51be358\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8d94394_2655_48d0_a53d_7ac1a17f3f71.slice\": RecentStats: unable to find data in memory cache]" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.334504 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.343307 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:00:18 crc kubenswrapper[4930]: E1012 06:00:18.343681 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8d94394-2655-48d0-a53d-7ac1a17f3f71" containerName="proxy-httpd" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.343696 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8d94394-2655-48d0-a53d-7ac1a17f3f71" containerName="proxy-httpd" Oct 12 06:00:18 crc kubenswrapper[4930]: E1012 06:00:18.343712 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8d94394-2655-48d0-a53d-7ac1a17f3f71" containerName="sg-core" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.343718 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8d94394-2655-48d0-a53d-7ac1a17f3f71" containerName="sg-core" Oct 12 06:00:18 crc kubenswrapper[4930]: E1012 06:00:18.343747 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8d94394-2655-48d0-a53d-7ac1a17f3f71" containerName="ceilometer-notification-agent" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.343755 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8d94394-2655-48d0-a53d-7ac1a17f3f71" containerName="ceilometer-notification-agent" Oct 12 06:00:18 crc kubenswrapper[4930]: E1012 06:00:18.343779 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8d94394-2655-48d0-a53d-7ac1a17f3f71" containerName="ceilometer-central-agent" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.343784 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8d94394-2655-48d0-a53d-7ac1a17f3f71" containerName="ceilometer-central-agent" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.343966 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8d94394-2655-48d0-a53d-7ac1a17f3f71" containerName="sg-core" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.343981 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8d94394-2655-48d0-a53d-7ac1a17f3f71" containerName="ceilometer-notification-agent" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.343993 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8d94394-2655-48d0-a53d-7ac1a17f3f71" containerName="proxy-httpd" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.344011 4930 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c8d94394-2655-48d0-a53d-7ac1a17f3f71" containerName="ceilometer-central-agent" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.346009 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.349832 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.350539 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.362959 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.396686 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a56af5dd-28e1-41e8-b164-0c05399223ff-config-data\") pod \"ceilometer-0\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " pod="openstack/ceilometer-0" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.396857 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvhn6\" (UniqueName: \"kubernetes.io/projected/a56af5dd-28e1-41e8-b164-0c05399223ff-kube-api-access-cvhn6\") pod \"ceilometer-0\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " pod="openstack/ceilometer-0" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.397004 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a56af5dd-28e1-41e8-b164-0c05399223ff-run-httpd\") pod \"ceilometer-0\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " pod="openstack/ceilometer-0" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.397159 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a56af5dd-28e1-41e8-b164-0c05399223ff-log-httpd\") pod \"ceilometer-0\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " pod="openstack/ceilometer-0" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.397226 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a56af5dd-28e1-41e8-b164-0c05399223ff-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " pod="openstack/ceilometer-0" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.397282 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a56af5dd-28e1-41e8-b164-0c05399223ff-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " pod="openstack/ceilometer-0" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.397453 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a56af5dd-28e1-41e8-b164-0c05399223ff-scripts\") pod \"ceilometer-0\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " pod="openstack/ceilometer-0" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.498558 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a56af5dd-28e1-41e8-b164-0c05399223ff-config-data\") pod \"ceilometer-0\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " pod="openstack/ceilometer-0" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.498617 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvhn6\" (UniqueName: \"kubernetes.io/projected/a56af5dd-28e1-41e8-b164-0c05399223ff-kube-api-access-cvhn6\") pod \"ceilometer-0\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " pod="openstack/ceilometer-0" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.498649 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a56af5dd-28e1-41e8-b164-0c05399223ff-run-httpd\") pod \"ceilometer-0\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " pod="openstack/ceilometer-0" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.498684 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a56af5dd-28e1-41e8-b164-0c05399223ff-log-httpd\") pod \"ceilometer-0\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " pod="openstack/ceilometer-0" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.498710 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a56af5dd-28e1-41e8-b164-0c05399223ff-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " pod="openstack/ceilometer-0" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.498727 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a56af5dd-28e1-41e8-b164-0c05399223ff-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " pod="openstack/ceilometer-0" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.498758 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a56af5dd-28e1-41e8-b164-0c05399223ff-scripts\") pod \"ceilometer-0\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " pod="openstack/ceilometer-0" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.499365 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a56af5dd-28e1-41e8-b164-0c05399223ff-run-httpd\") pod \"ceilometer-0\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " pod="openstack/ceilometer-0" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.499594 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a56af5dd-28e1-41e8-b164-0c05399223ff-log-httpd\") pod \"ceilometer-0\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " pod="openstack/ceilometer-0" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.503451 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a56af5dd-28e1-41e8-b164-0c05399223ff-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " pod="openstack/ceilometer-0" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.503520 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a56af5dd-28e1-41e8-b164-0c05399223ff-scripts\") pod \"ceilometer-0\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " pod="openstack/ceilometer-0" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.508081 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a56af5dd-28e1-41e8-b164-0c05399223ff-config-data\") pod \"ceilometer-0\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " pod="openstack/ceilometer-0" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.510611 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a56af5dd-28e1-41e8-b164-0c05399223ff-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " pod="openstack/ceilometer-0" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.517107 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvhn6\" (UniqueName: \"kubernetes.io/projected/a56af5dd-28e1-41e8-b164-0c05399223ff-kube-api-access-cvhn6\") pod \"ceilometer-0\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " pod="openstack/ceilometer-0" Oct 12 06:00:18 crc kubenswrapper[4930]: I1012 06:00:18.666587 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 06:00:19 crc kubenswrapper[4930]: I1012 06:00:19.002900 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eceafd59-b491-4468-b6b6-78fe1c689e6b","Type":"ContainerStarted","Data":"5a27b1038fba46f13d160d5b0874102336a78422702f9b5783a692626e59ab25"} Oct 12 06:00:19 crc kubenswrapper[4930]: I1012 06:00:19.003218 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eceafd59-b491-4468-b6b6-78fe1c689e6b","Type":"ContainerStarted","Data":"9b67820f6a3e34704f41838418749aab11c43dc9f1f2c4472ac2ee934f173b72"} Oct 12 06:00:19 crc kubenswrapper[4930]: I1012 06:00:19.130810 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:00:20 crc kubenswrapper[4930]: I1012 06:00:20.015562 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eceafd59-b491-4468-b6b6-78fe1c689e6b","Type":"ContainerStarted","Data":"d85d832710fb1677e88f0ac3452caf6511c92eee238abf47aca53098af70e035"} Oct 12 06:00:20 crc kubenswrapper[4930]: I1012 06:00:20.016183 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 12 06:00:20 crc kubenswrapper[4930]: I1012 06:00:20.017484 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a56af5dd-28e1-41e8-b164-0c05399223ff","Type":"ContainerStarted","Data":"551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926"} Oct 12 06:00:20 crc kubenswrapper[4930]: I1012 06:00:20.017531 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a56af5dd-28e1-41e8-b164-0c05399223ff","Type":"ContainerStarted","Data":"55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b"} Oct 12 06:00:20 crc kubenswrapper[4930]: I1012 06:00:20.017542 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a56af5dd-28e1-41e8-b164-0c05399223ff","Type":"ContainerStarted","Data":"f0b78e275e199affbc146d353f66a6296e65131474ef140b4ebfed5f8c76da68"} Oct 12 06:00:20 crc kubenswrapper[4930]: I1012 
06:00:20.149243 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8d94394-2655-48d0-a53d-7ac1a17f3f71" path="/var/lib/kubelet/pods/c8d94394-2655-48d0-a53d-7ac1a17f3f71/volumes" Oct 12 06:00:21 crc kubenswrapper[4930]: I1012 06:00:21.032019 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a56af5dd-28e1-41e8-b164-0c05399223ff","Type":"ContainerStarted","Data":"3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4"} Oct 12 06:00:22 crc kubenswrapper[4930]: I1012 06:00:22.054650 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a56af5dd-28e1-41e8-b164-0c05399223ff","Type":"ContainerStarted","Data":"9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590"} Oct 12 06:00:22 crc kubenswrapper[4930]: I1012 06:00:22.055037 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 12 06:00:22 crc kubenswrapper[4930]: I1012 06:00:22.077555 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.514094839 podStartE2EDuration="4.077538534s" podCreationTimestamp="2025-10-12 06:00:18 +0000 UTC" firstStartedPulling="2025-10-12 06:00:19.154330342 +0000 UTC m=+1151.696432107" lastFinishedPulling="2025-10-12 06:00:21.717774037 +0000 UTC m=+1154.259875802" observedRunningTime="2025-10-12 06:00:22.07415141 +0000 UTC m=+1154.616253175" watchObservedRunningTime="2025-10-12 06:00:22.077538534 +0000 UTC m=+1154.619640299" Oct 12 06:00:22 crc kubenswrapper[4930]: I1012 06:00:22.077909 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.077903533 podStartE2EDuration="5.077903533s" podCreationTimestamp="2025-10-12 06:00:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:00:20.043105621 +0000 UTC m=+1152.585207386" watchObservedRunningTime="2025-10-12 06:00:22.077903533 +0000 UTC m=+1154.620005298" Oct 12 06:00:22 crc kubenswrapper[4930]: I1012 06:00:22.624990 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 06:00:22 crc kubenswrapper[4930]: I1012 06:00:22.625256 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d297fbfc-df5c-4365-a668-d04f87df845d" containerName="glance-log" containerID="cri-o://a607f1cebf14a923275c8ceab7752588186b342d8877d40c3c9276d90f221c5e" gracePeriod=30 Oct 12 06:00:22 crc kubenswrapper[4930]: I1012 06:00:22.625864 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d297fbfc-df5c-4365-a668-d04f87df845d" containerName="glance-httpd" containerID="cri-o://4597264a5a1f2045f2dfd2c65470cdc4a45efec1958c0f9c123bc342b19d8b98" gracePeriod=30 Oct 12 06:00:23 crc kubenswrapper[4930]: I1012 06:00:23.068894 4930 generic.go:334] "Generic (PLEG): container finished" podID="d297fbfc-df5c-4365-a668-d04f87df845d" containerID="a607f1cebf14a923275c8ceab7752588186b342d8877d40c3c9276d90f221c5e" exitCode=143 Oct 12 06:00:23 crc kubenswrapper[4930]: I1012 06:00:23.070313 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"d297fbfc-df5c-4365-a668-d04f87df845d","Type":"ContainerDied","Data":"a607f1cebf14a923275c8ceab7752588186b342d8877d40c3c9276d90f221c5e"} Oct 12 06:00:23 crc kubenswrapper[4930]: I1012 06:00:23.443909 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 06:00:23 crc kubenswrapper[4930]: I1012 06:00:23.444167 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b83c8650-6361-42bb-8112-0449e6722a3f" containerName="glance-log" containerID="cri-o://ae9ee27bdd88c79f2fa0d15b36520ba143be88fc092cd54c9e17f9cf8674b5e8" gracePeriod=30 Oct 12 06:00:23 crc kubenswrapper[4930]: I1012 06:00:23.444282 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b83c8650-6361-42bb-8112-0449e6722a3f" containerName="glance-httpd" containerID="cri-o://15e775931c35b2e9b09753f036a874b4d26c758652a7511acfa8b52dcdbd6e9d" gracePeriod=30 Oct 12 06:00:23 crc kubenswrapper[4930]: I1012 06:00:23.678970 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:00:23 crc kubenswrapper[4930]: I1012 06:00:23.685485 4930 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 12 06:00:23 crc kubenswrapper[4930]: I1012 06:00:23.685839 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 12 06:00:23 crc kubenswrapper[4930]: I1012 06:00:23.686271 4930 scope.go:117] "RemoveContainer" containerID="08f49b3d51ac716588f5237987a07b5659675e827f1272074de03ba6918727cd" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.081634 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"59f4bae4-1a84-449a-be72-e735294116e6","Type":"ContainerStarted","Data":"d3276e810538259cb3294c616a6c0101e64806aa3e935bc18563a7a3f2b3591c"} Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.083506 4930 generic.go:334] "Generic (PLEG): container finished" podID="d297fbfc-df5c-4365-a668-d04f87df845d" containerID="4597264a5a1f2045f2dfd2c65470cdc4a45efec1958c0f9c123bc342b19d8b98" exitCode=0 Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.083546 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d297fbfc-df5c-4365-a668-d04f87df845d","Type":"ContainerDied","Data":"4597264a5a1f2045f2dfd2c65470cdc4a45efec1958c0f9c123bc342b19d8b98"} Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.083562 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d297fbfc-df5c-4365-a668-d04f87df845d","Type":"ContainerDied","Data":"fe7ce099e3cf7c6ea4db68a65afee617377f91c2b135876a6ca6d32218d66f12"} Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.083574 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe7ce099e3cf7c6ea4db68a65afee617377f91c2b135876a6ca6d32218d66f12" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.088626 4930 generic.go:334] "Generic (PLEG): container finished" podID="b83c8650-6361-42bb-8112-0449e6722a3f" containerID="ae9ee27bdd88c79f2fa0d15b36520ba143be88fc092cd54c9e17f9cf8674b5e8" exitCode=143 Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.088698 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"b83c8650-6361-42bb-8112-0449e6722a3f","Type":"ContainerDied","Data":"ae9ee27bdd88c79f2fa0d15b36520ba143be88fc092cd54c9e17f9cf8674b5e8"} Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.088885 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a56af5dd-28e1-41e8-b164-0c05399223ff" containerName="ceilometer-central-agent" containerID="cri-o://55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b" gracePeriod=30 Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.088916 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a56af5dd-28e1-41e8-b164-0c05399223ff" containerName="sg-core" containerID="cri-o://3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4" gracePeriod=30 Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.088933 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a56af5dd-28e1-41e8-b164-0c05399223ff" containerName="proxy-httpd" containerID="cri-o://9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590" gracePeriod=30 Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.088961 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a56af5dd-28e1-41e8-b164-0c05399223ff" containerName="ceilometer-notification-agent" containerID="cri-o://551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926" gracePeriod=30 Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.121310 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.216134 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d297fbfc-df5c-4365-a668-d04f87df845d-httpd-run\") pod \"d297fbfc-df5c-4365-a668-d04f87df845d\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.216205 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d297fbfc-df5c-4365-a668-d04f87df845d-public-tls-certs\") pod \"d297fbfc-df5c-4365-a668-d04f87df845d\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.216243 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d297fbfc-df5c-4365-a668-d04f87df845d-logs\") pod \"d297fbfc-df5c-4365-a668-d04f87df845d\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.216267 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d297fbfc-df5c-4365-a668-d04f87df845d-config-data\") pod \"d297fbfc-df5c-4365-a668-d04f87df845d\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.216288 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"d297fbfc-df5c-4365-a668-d04f87df845d\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.216321 4930 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d297fbfc-df5c-4365-a668-d04f87df845d-scripts\") pod \"d297fbfc-df5c-4365-a668-d04f87df845d\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.216358 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d297fbfc-df5c-4365-a668-d04f87df845d-combined-ca-bundle\") pod \"d297fbfc-df5c-4365-a668-d04f87df845d\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.216449 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6l2fw\" (UniqueName: \"kubernetes.io/projected/d297fbfc-df5c-4365-a668-d04f87df845d-kube-api-access-6l2fw\") pod \"d297fbfc-df5c-4365-a668-d04f87df845d\" (UID: \"d297fbfc-df5c-4365-a668-d04f87df845d\") " Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.217242 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d297fbfc-df5c-4365-a668-d04f87df845d-logs" (OuterVolumeSpecName: "logs") pod "d297fbfc-df5c-4365-a668-d04f87df845d" (UID: "d297fbfc-df5c-4365-a668-d04f87df845d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.217538 4930 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d297fbfc-df5c-4365-a668-d04f87df845d-logs\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.218459 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d297fbfc-df5c-4365-a668-d04f87df845d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d297fbfc-df5c-4365-a668-d04f87df845d" (UID: "d297fbfc-df5c-4365-a668-d04f87df845d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.230190 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d297fbfc-df5c-4365-a668-d04f87df845d-scripts" (OuterVolumeSpecName: "scripts") pod "d297fbfc-df5c-4365-a668-d04f87df845d" (UID: "d297fbfc-df5c-4365-a668-d04f87df845d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.231144 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d297fbfc-df5c-4365-a668-d04f87df845d-kube-api-access-6l2fw" (OuterVolumeSpecName: "kube-api-access-6l2fw") pod "d297fbfc-df5c-4365-a668-d04f87df845d" (UID: "d297fbfc-df5c-4365-a668-d04f87df845d"). InnerVolumeSpecName "kube-api-access-6l2fw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.245708 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "d297fbfc-df5c-4365-a668-d04f87df845d" (UID: "d297fbfc-df5c-4365-a668-d04f87df845d"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.265453 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d297fbfc-df5c-4365-a668-d04f87df845d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d297fbfc-df5c-4365-a668-d04f87df845d" (UID: "d297fbfc-df5c-4365-a668-d04f87df845d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.299825 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d297fbfc-df5c-4365-a668-d04f87df845d-config-data" (OuterVolumeSpecName: "config-data") pod "d297fbfc-df5c-4365-a668-d04f87df845d" (UID: "d297fbfc-df5c-4365-a668-d04f87df845d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.319859 4930 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d297fbfc-df5c-4365-a668-d04f87df845d-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.319891 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d297fbfc-df5c-4365-a668-d04f87df845d-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.319916 4930 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.319929 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d297fbfc-df5c-4365-a668-d04f87df845d-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.319937 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d297fbfc-df5c-4365-a668-d04f87df845d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.319949 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6l2fw\" (UniqueName: \"kubernetes.io/projected/d297fbfc-df5c-4365-a668-d04f87df845d-kube-api-access-6l2fw\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.328890 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d297fbfc-df5c-4365-a668-d04f87df845d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d297fbfc-df5c-4365-a668-d04f87df845d" (UID: "d297fbfc-df5c-4365-a668-d04f87df845d"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.341459 4930 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.422206 4930 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.422432 4930 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d297fbfc-df5c-4365-a668-d04f87df845d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.799055 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.830011 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a56af5dd-28e1-41e8-b164-0c05399223ff-combined-ca-bundle\") pod \"a56af5dd-28e1-41e8-b164-0c05399223ff\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.830063 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvhn6\" (UniqueName: \"kubernetes.io/projected/a56af5dd-28e1-41e8-b164-0c05399223ff-kube-api-access-cvhn6\") pod \"a56af5dd-28e1-41e8-b164-0c05399223ff\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.830096 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a56af5dd-28e1-41e8-b164-0c05399223ff-config-data\") pod \"a56af5dd-28e1-41e8-b164-0c05399223ff\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.830141 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a56af5dd-28e1-41e8-b164-0c05399223ff-scripts\") pod \"a56af5dd-28e1-41e8-b164-0c05399223ff\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.830241 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a56af5dd-28e1-41e8-b164-0c05399223ff-sg-core-conf-yaml\") pod \"a56af5dd-28e1-41e8-b164-0c05399223ff\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.830299 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a56af5dd-28e1-41e8-b164-0c05399223ff-log-httpd\") pod \"a56af5dd-28e1-41e8-b164-0c05399223ff\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.830330 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a56af5dd-28e1-41e8-b164-0c05399223ff-run-httpd\") pod \"a56af5dd-28e1-41e8-b164-0c05399223ff\" (UID: \"a56af5dd-28e1-41e8-b164-0c05399223ff\") " Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.831107 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/a56af5dd-28e1-41e8-b164-0c05399223ff-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a56af5dd-28e1-41e8-b164-0c05399223ff" (UID: "a56af5dd-28e1-41e8-b164-0c05399223ff"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.833649 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a56af5dd-28e1-41e8-b164-0c05399223ff-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a56af5dd-28e1-41e8-b164-0c05399223ff" (UID: "a56af5dd-28e1-41e8-b164-0c05399223ff"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.833843 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a56af5dd-28e1-41e8-b164-0c05399223ff-scripts" (OuterVolumeSpecName: "scripts") pod "a56af5dd-28e1-41e8-b164-0c05399223ff" (UID: "a56af5dd-28e1-41e8-b164-0c05399223ff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.838279 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a56af5dd-28e1-41e8-b164-0c05399223ff-kube-api-access-cvhn6" (OuterVolumeSpecName: "kube-api-access-cvhn6") pod "a56af5dd-28e1-41e8-b164-0c05399223ff" (UID: "a56af5dd-28e1-41e8-b164-0c05399223ff"). InnerVolumeSpecName "kube-api-access-cvhn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.864953 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a56af5dd-28e1-41e8-b164-0c05399223ff-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a56af5dd-28e1-41e8-b164-0c05399223ff" (UID: "a56af5dd-28e1-41e8-b164-0c05399223ff"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.927879 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a56af5dd-28e1-41e8-b164-0c05399223ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a56af5dd-28e1-41e8-b164-0c05399223ff" (UID: "a56af5dd-28e1-41e8-b164-0c05399223ff"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.933026 4930 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a56af5dd-28e1-41e8-b164-0c05399223ff-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.933057 4930 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a56af5dd-28e1-41e8-b164-0c05399223ff-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.933065 4930 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a56af5dd-28e1-41e8-b164-0c05399223ff-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.933075 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a56af5dd-28e1-41e8-b164-0c05399223ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.933084 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvhn6\" (UniqueName: \"kubernetes.io/projected/a56af5dd-28e1-41e8-b164-0c05399223ff-kube-api-access-cvhn6\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.933095 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a56af5dd-28e1-41e8-b164-0c05399223ff-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:24 crc kubenswrapper[4930]: I1012 06:00:24.935852 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a56af5dd-28e1-41e8-b164-0c05399223ff-config-data" (OuterVolumeSpecName: "config-data") pod "a56af5dd-28e1-41e8-b164-0c05399223ff" (UID: "a56af5dd-28e1-41e8-b164-0c05399223ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.034421 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a56af5dd-28e1-41e8-b164-0c05399223ff-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.098703 4930 generic.go:334] "Generic (PLEG): container finished" podID="a56af5dd-28e1-41e8-b164-0c05399223ff" containerID="9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590" exitCode=0 Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.098783 4930 generic.go:334] "Generic (PLEG): container finished" podID="a56af5dd-28e1-41e8-b164-0c05399223ff" containerID="3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4" exitCode=2 Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.098794 4930 generic.go:334] "Generic (PLEG): container finished" podID="a56af5dd-28e1-41e8-b164-0c05399223ff" containerID="551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926" exitCode=0 Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.098801 4930 generic.go:334] "Generic (PLEG): container finished" podID="a56af5dd-28e1-41e8-b164-0c05399223ff" containerID="55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b" exitCode=0 Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.098861 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.106098 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a56af5dd-28e1-41e8-b164-0c05399223ff","Type":"ContainerDied","Data":"9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590"} Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.106168 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a56af5dd-28e1-41e8-b164-0c05399223ff","Type":"ContainerDied","Data":"3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4"} Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.106184 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a56af5dd-28e1-41e8-b164-0c05399223ff","Type":"ContainerDied","Data":"551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926"} Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.106196 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a56af5dd-28e1-41e8-b164-0c05399223ff","Type":"ContainerDied","Data":"55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b"} Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.106205 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a56af5dd-28e1-41e8-b164-0c05399223ff","Type":"ContainerDied","Data":"f0b78e275e199affbc146d353f66a6296e65131474ef140b4ebfed5f8c76da68"} Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.106224 4930 scope.go:117] "RemoveContainer" containerID="9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.106479 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.193749 4930 scope.go:117] "RemoveContainer" containerID="3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.230895 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.236126 4930 scope.go:117] "RemoveContainer" containerID="551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.245784 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.251434 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.265476 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.285690 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:00:25 crc kubenswrapper[4930]: E1012 06:00:25.286162 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a56af5dd-28e1-41e8-b164-0c05399223ff" containerName="ceilometer-central-agent" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.286180 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="a56af5dd-28e1-41e8-b164-0c05399223ff" containerName="ceilometer-central-agent" Oct 12 06:00:25 crc kubenswrapper[4930]: E1012 06:00:25.286204 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a56af5dd-28e1-41e8-b164-0c05399223ff" containerName="ceilometer-notification-agent" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.286210 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="a56af5dd-28e1-41e8-b164-0c05399223ff" containerName="ceilometer-notification-agent" Oct 12 06:00:25 crc kubenswrapper[4930]: E1012 06:00:25.286229 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a56af5dd-28e1-41e8-b164-0c05399223ff" containerName="proxy-httpd" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.286236 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="a56af5dd-28e1-41e8-b164-0c05399223ff" containerName="proxy-httpd" Oct 12 06:00:25 crc kubenswrapper[4930]: E1012 06:00:25.286249 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a56af5dd-28e1-41e8-b164-0c05399223ff" containerName="sg-core" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.286256 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="a56af5dd-28e1-41e8-b164-0c05399223ff" containerName="sg-core" Oct 12 06:00:25 crc kubenswrapper[4930]: E1012 06:00:25.286268 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d297fbfc-df5c-4365-a668-d04f87df845d" containerName="glance-httpd" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.286274 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="d297fbfc-df5c-4365-a668-d04f87df845d" containerName="glance-httpd" Oct 12 06:00:25 crc kubenswrapper[4930]: E1012 06:00:25.286282 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d297fbfc-df5c-4365-a668-d04f87df845d" containerName="glance-log" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.286287 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="d297fbfc-df5c-4365-a668-d04f87df845d" 
containerName="glance-log" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.286455 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="d297fbfc-df5c-4365-a668-d04f87df845d" containerName="glance-httpd" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.286466 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="a56af5dd-28e1-41e8-b164-0c05399223ff" containerName="sg-core" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.286475 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="a56af5dd-28e1-41e8-b164-0c05399223ff" containerName="ceilometer-central-agent" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.286489 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="d297fbfc-df5c-4365-a668-d04f87df845d" containerName="glance-log" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.286503 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="a56af5dd-28e1-41e8-b164-0c05399223ff" containerName="ceilometer-notification-agent" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.286518 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="a56af5dd-28e1-41e8-b164-0c05399223ff" containerName="proxy-httpd" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.288197 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.297329 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.308018 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.309592 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.313163 4930 scope.go:117] "RemoveContainer" containerID="55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.313531 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.313913 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.314179 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.314190 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.322005 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.388236 4930 scope.go:117] "RemoveContainer" containerID="9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590" Oct 12 06:00:25 crc kubenswrapper[4930]: E1012 06:00:25.394267 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590\": container with ID starting with 9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590 not found: ID does not exist" containerID="9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.394316 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590"} err="failed to get container status \"9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590\": rpc error: code = NotFound desc = could not find container \"9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590\": container with ID starting with 9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590 not found: ID does not exist" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.394350 4930 scope.go:117] "RemoveContainer" containerID="3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4" Oct 12 06:00:25 crc kubenswrapper[4930]: E1012 06:00:25.394680 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4\": container with ID starting with 3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4 not found: ID does not exist" containerID="3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.394898 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4"} err="failed to get container status \"3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4\": rpc error: code = NotFound desc = could not find container \"3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4\": container with ID starting with 3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4 not found: 
ID does not exist" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.394935 4930 scope.go:117] "RemoveContainer" containerID="551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926" Oct 12 06:00:25 crc kubenswrapper[4930]: E1012 06:00:25.395297 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926\": container with ID starting with 551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926 not found: ID does not exist" containerID="551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.395342 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926"} err="failed to get container status \"551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926\": rpc error: code = NotFound desc = could not find container \"551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926\": container with ID starting with 551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926 not found: ID does not exist" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.395372 4930 scope.go:117] "RemoveContainer" containerID="55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b" Oct 12 06:00:25 crc kubenswrapper[4930]: E1012 06:00:25.395672 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b\": container with ID starting with 55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b not found: ID does not exist" containerID="55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.395699 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b"} err="failed to get container status \"55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b\": rpc error: code = NotFound desc = could not find container \"55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b\": container with ID starting with 55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b not found: ID does not exist" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.395719 4930 scope.go:117] "RemoveContainer" containerID="9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.396001 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590"} err="failed to get container status \"9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590\": rpc error: code = NotFound desc = could not find container \"9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590\": container with ID starting with 9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590 not found: ID does not exist" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.396021 4930 scope.go:117] "RemoveContainer" containerID="3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.396218 4930 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4"} err="failed to get container status \"3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4\": rpc error: code = NotFound desc = could not find container \"3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4\": container with ID starting with 3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4 not found: ID does not exist" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.396237 4930 scope.go:117] "RemoveContainer" containerID="551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.396483 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926"} err="failed to get container status \"551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926\": rpc error: code = NotFound desc = could not find container \"551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926\": container with ID starting with 551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926 not found: ID does not exist" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.396502 4930 scope.go:117] "RemoveContainer" containerID="55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.396896 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b"} err="failed to get container status \"55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b\": rpc error: code = NotFound desc = could not find container \"55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b\": container with ID starting with 55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b not found: ID does not exist" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.396915 4930 scope.go:117] "RemoveContainer" containerID="9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.397200 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590"} err="failed to get container status \"9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590\": rpc error: code = NotFound desc = could not find container \"9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590\": container with ID starting with 9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590 not found: ID does not exist" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.397230 4930 scope.go:117] "RemoveContainer" containerID="3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.397479 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4"} err="failed to get container status \"3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4\": rpc error: code = NotFound desc = could not find container \"3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4\": container with ID starting with 3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4 
not found: ID does not exist" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.397498 4930 scope.go:117] "RemoveContainer" containerID="551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.397756 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926"} err="failed to get container status \"551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926\": rpc error: code = NotFound desc = could not find container \"551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926\": container with ID starting with 551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926 not found: ID does not exist" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.397798 4930 scope.go:117] "RemoveContainer" containerID="55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.398063 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b"} err="failed to get container status \"55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b\": rpc error: code = NotFound desc = could not find container \"55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b\": container with ID starting with 55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b not found: ID does not exist" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.398088 4930 scope.go:117] "RemoveContainer" containerID="9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.398306 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590"} err="failed to get container status \"9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590\": rpc error: code = NotFound desc = could not find container \"9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590\": container with ID starting with 9ca093f41e7b2d1d4cc5341ae9489273ced70ddc36ab7dbcd03ee96ae1f4e590 not found: ID does not exist" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.398338 4930 scope.go:117] "RemoveContainer" containerID="3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.398633 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4"} err="failed to get container status \"3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4\": rpc error: code = NotFound desc = could not find container \"3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4\": container with ID starting with 3366607c6c00bd4d1fc49527cb6393c742199ac5e24e167b5f85a6585ab06cd4 not found: ID does not exist" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.398687 4930 scope.go:117] "RemoveContainer" containerID="551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.399310 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926"} err="failed to get 
container status \"551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926\": rpc error: code = NotFound desc = could not find container \"551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926\": container with ID starting with 551b07d00e8c837ed33df82334548831010e926a8af34e84c52eff358862e926 not found: ID does not exist" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.399337 4930 scope.go:117] "RemoveContainer" containerID="55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.399626 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b"} err="failed to get container status \"55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b\": rpc error: code = NotFound desc = could not find container \"55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b\": container with ID starting with 55a808bcc1a2d095282567ad8384771805baea3031a333453e36d3ed5727103b not found: ID does not exist" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.440267 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-scripts\") pod \"ceilometer-0\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") " pod="openstack/ceilometer-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.440547 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtd5l\" (UniqueName: \"kubernetes.io/projected/00546299-d7c8-4536-9059-85a75dc5824e-kube-api-access-rtd5l\") pod \"glance-default-external-api-0\" (UID: \"00546299-d7c8-4536-9059-85a75dc5824e\") " pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.440653 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"00546299-d7c8-4536-9059-85a75dc5824e\") " pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.440764 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00546299-d7c8-4536-9059-85a75dc5824e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"00546299-d7c8-4536-9059-85a75dc5824e\") " pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.440849 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00546299-d7c8-4536-9059-85a75dc5824e-logs\") pod \"glance-default-external-api-0\" (UID: \"00546299-d7c8-4536-9059-85a75dc5824e\") " pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.441108 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-run-httpd\") pod \"ceilometer-0\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") " pod="openstack/ceilometer-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.441204 4930 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00546299-d7c8-4536-9059-85a75dc5824e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"00546299-d7c8-4536-9059-85a75dc5824e\") " pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.441280 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbwnh\" (UniqueName: \"kubernetes.io/projected/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-kube-api-access-xbwnh\") pod \"ceilometer-0\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") " pod="openstack/ceilometer-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.441370 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00546299-d7c8-4536-9059-85a75dc5824e-scripts\") pod \"glance-default-external-api-0\" (UID: \"00546299-d7c8-4536-9059-85a75dc5824e\") " pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.441522 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-log-httpd\") pod \"ceilometer-0\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") " pod="openstack/ceilometer-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.441600 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") " pod="openstack/ceilometer-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.441932 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00546299-d7c8-4536-9059-85a75dc5824e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"00546299-d7c8-4536-9059-85a75dc5824e\") " pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.441989 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") " pod="openstack/ceilometer-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.442041 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-config-data\") pod \"ceilometer-0\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") " pod="openstack/ceilometer-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.442103 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00546299-d7c8-4536-9059-85a75dc5824e-config-data\") pod \"glance-default-external-api-0\" (UID: \"00546299-d7c8-4536-9059-85a75dc5824e\") " pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.543598 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00546299-d7c8-4536-9059-85a75dc5824e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"00546299-d7c8-4536-9059-85a75dc5824e\") " pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.543643 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") " pod="openstack/ceilometer-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.543671 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-config-data\") pod \"ceilometer-0\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") " pod="openstack/ceilometer-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.543694 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00546299-d7c8-4536-9059-85a75dc5824e-config-data\") pod \"glance-default-external-api-0\" (UID: \"00546299-d7c8-4536-9059-85a75dc5824e\") " pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.543720 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-scripts\") pod \"ceilometer-0\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") " pod="openstack/ceilometer-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.543753 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtd5l\" (UniqueName: \"kubernetes.io/projected/00546299-d7c8-4536-9059-85a75dc5824e-kube-api-access-rtd5l\") pod \"glance-default-external-api-0\" (UID: \"00546299-d7c8-4536-9059-85a75dc5824e\") " pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.543777 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"00546299-d7c8-4536-9059-85a75dc5824e\") " pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.543792 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00546299-d7c8-4536-9059-85a75dc5824e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"00546299-d7c8-4536-9059-85a75dc5824e\") " pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.543824 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00546299-d7c8-4536-9059-85a75dc5824e-logs\") pod \"glance-default-external-api-0\" (UID: \"00546299-d7c8-4536-9059-85a75dc5824e\") " pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.543845 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-run-httpd\") pod \"ceilometer-0\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") " 
pod="openstack/ceilometer-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.543872 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00546299-d7c8-4536-9059-85a75dc5824e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"00546299-d7c8-4536-9059-85a75dc5824e\") " pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.543894 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbwnh\" (UniqueName: \"kubernetes.io/projected/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-kube-api-access-xbwnh\") pod \"ceilometer-0\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") " pod="openstack/ceilometer-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.543922 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00546299-d7c8-4536-9059-85a75dc5824e-scripts\") pod \"glance-default-external-api-0\" (UID: \"00546299-d7c8-4536-9059-85a75dc5824e\") " pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.543937 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-log-httpd\") pod \"ceilometer-0\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") " pod="openstack/ceilometer-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.543954 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") " pod="openstack/ceilometer-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.545045 4930 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"00546299-d7c8-4536-9059-85a75dc5824e\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.547706 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00546299-d7c8-4536-9059-85a75dc5824e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"00546299-d7c8-4536-9059-85a75dc5824e\") " pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.547889 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-log-httpd\") pod \"ceilometer-0\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") " pod="openstack/ceilometer-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.548111 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-run-httpd\") pod \"ceilometer-0\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") " pod="openstack/ceilometer-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.548333 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00546299-d7c8-4536-9059-85a75dc5824e-logs\") pod 
\"glance-default-external-api-0\" (UID: \"00546299-d7c8-4536-9059-85a75dc5824e\") " pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.554476 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") " pod="openstack/ceilometer-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.577698 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00546299-d7c8-4536-9059-85a75dc5824e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"00546299-d7c8-4536-9059-85a75dc5824e\") " pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.577715 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") " pod="openstack/ceilometer-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.581252 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-scripts\") pod \"ceilometer-0\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") " pod="openstack/ceilometer-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.585837 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-config-data\") pod \"ceilometer-0\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") " pod="openstack/ceilometer-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.593323 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00546299-d7c8-4536-9059-85a75dc5824e-config-data\") pod \"glance-default-external-api-0\" (UID: \"00546299-d7c8-4536-9059-85a75dc5824e\") " pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.594005 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00546299-d7c8-4536-9059-85a75dc5824e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"00546299-d7c8-4536-9059-85a75dc5824e\") " pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.594540 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtd5l\" (UniqueName: \"kubernetes.io/projected/00546299-d7c8-4536-9059-85a75dc5824e-kube-api-access-rtd5l\") pod \"glance-default-external-api-0\" (UID: \"00546299-d7c8-4536-9059-85a75dc5824e\") " pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.594850 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbwnh\" (UniqueName: \"kubernetes.io/projected/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-kube-api-access-xbwnh\") pod \"ceilometer-0\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") " pod="openstack/ceilometer-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.596507 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/00546299-d7c8-4536-9059-85a75dc5824e-scripts\") pod \"glance-default-external-api-0\" (UID: \"00546299-d7c8-4536-9059-85a75dc5824e\") " pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.622424 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"00546299-d7c8-4536-9059-85a75dc5824e\") " pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.638814 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.663277 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.747510 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.849600 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crxzw\" (UniqueName: \"kubernetes.io/projected/b83c8650-6361-42bb-8112-0449e6722a3f-kube-api-access-crxzw\") pod \"b83c8650-6361-42bb-8112-0449e6722a3f\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.849923 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b83c8650-6361-42bb-8112-0449e6722a3f-config-data\") pod \"b83c8650-6361-42bb-8112-0449e6722a3f\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.850393 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b83c8650-6361-42bb-8112-0449e6722a3f-logs\") pod \"b83c8650-6361-42bb-8112-0449e6722a3f\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.850460 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b83c8650-6361-42bb-8112-0449e6722a3f-scripts\") pod \"b83c8650-6361-42bb-8112-0449e6722a3f\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.850498 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b83c8650-6361-42bb-8112-0449e6722a3f-httpd-run\") pod \"b83c8650-6361-42bb-8112-0449e6722a3f\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.850559 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b83c8650-6361-42bb-8112-0449e6722a3f-internal-tls-certs\") pod \"b83c8650-6361-42bb-8112-0449e6722a3f\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.850588 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"b83c8650-6361-42bb-8112-0449e6722a3f\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " Oct 12 06:00:25 
crc kubenswrapper[4930]: I1012 06:00:25.850623 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b83c8650-6361-42bb-8112-0449e6722a3f-combined-ca-bundle\") pod \"b83c8650-6361-42bb-8112-0449e6722a3f\" (UID: \"b83c8650-6361-42bb-8112-0449e6722a3f\") " Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.851387 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b83c8650-6361-42bb-8112-0449e6722a3f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b83c8650-6361-42bb-8112-0449e6722a3f" (UID: "b83c8650-6361-42bb-8112-0449e6722a3f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.851648 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b83c8650-6361-42bb-8112-0449e6722a3f-logs" (OuterVolumeSpecName: "logs") pod "b83c8650-6361-42bb-8112-0449e6722a3f" (UID: "b83c8650-6361-42bb-8112-0449e6722a3f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.874375 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b83c8650-6361-42bb-8112-0449e6722a3f-scripts" (OuterVolumeSpecName: "scripts") pod "b83c8650-6361-42bb-8112-0449e6722a3f" (UID: "b83c8650-6361-42bb-8112-0449e6722a3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.875134 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "b83c8650-6361-42bb-8112-0449e6722a3f" (UID: "b83c8650-6361-42bb-8112-0449e6722a3f"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.875265 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b83c8650-6361-42bb-8112-0449e6722a3f-kube-api-access-crxzw" (OuterVolumeSpecName: "kube-api-access-crxzw") pod "b83c8650-6361-42bb-8112-0449e6722a3f" (UID: "b83c8650-6361-42bb-8112-0449e6722a3f"). InnerVolumeSpecName "kube-api-access-crxzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.919305 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b83c8650-6361-42bb-8112-0449e6722a3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b83c8650-6361-42bb-8112-0449e6722a3f" (UID: "b83c8650-6361-42bb-8112-0449e6722a3f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.953140 4930 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b83c8650-6361-42bb-8112-0449e6722a3f-logs\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.953483 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b83c8650-6361-42bb-8112-0449e6722a3f-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.953550 4930 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b83c8650-6361-42bb-8112-0449e6722a3f-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.953647 4930 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.953704 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b83c8650-6361-42bb-8112-0449e6722a3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.953774 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crxzw\" (UniqueName: \"kubernetes.io/projected/b83c8650-6361-42bb-8112-0449e6722a3f-kube-api-access-crxzw\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.955160 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b83c8650-6361-42bb-8112-0449e6722a3f-config-data" (OuterVolumeSpecName: "config-data") pod "b83c8650-6361-42bb-8112-0449e6722a3f" (UID: "b83c8650-6361-42bb-8112-0449e6722a3f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.972688 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b83c8650-6361-42bb-8112-0449e6722a3f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b83c8650-6361-42bb-8112-0449e6722a3f" (UID: "b83c8650-6361-42bb-8112-0449e6722a3f"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:00:25 crc kubenswrapper[4930]: I1012 06:00:25.976443 4930 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.056200 4930 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b83c8650-6361-42bb-8112-0449e6722a3f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.056231 4930 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.056240 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b83c8650-6361-42bb-8112-0449e6722a3f-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.111305 4930 generic.go:334] "Generic (PLEG): container finished" podID="b83c8650-6361-42bb-8112-0449e6722a3f" containerID="15e775931c35b2e9b09753f036a874b4d26c758652a7511acfa8b52dcdbd6e9d" exitCode=0 Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.111349 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b83c8650-6361-42bb-8112-0449e6722a3f","Type":"ContainerDied","Data":"15e775931c35b2e9b09753f036a874b4d26c758652a7511acfa8b52dcdbd6e9d"} Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.111368 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.111382 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b83c8650-6361-42bb-8112-0449e6722a3f","Type":"ContainerDied","Data":"21e3dd3f6fc225d827017c9cf32eac22fac4ce103a374731010c0c257c796bb9"} Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.111402 4930 scope.go:117] "RemoveContainer" containerID="15e775931c35b2e9b09753f036a874b4d26c758652a7511acfa8b52dcdbd6e9d" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.147990 4930 scope.go:117] "RemoveContainer" containerID="ae9ee27bdd88c79f2fa0d15b36520ba143be88fc092cd54c9e17f9cf8674b5e8" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.152226 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a56af5dd-28e1-41e8-b164-0c05399223ff" path="/var/lib/kubelet/pods/a56af5dd-28e1-41e8-b164-0c05399223ff/volumes" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.154464 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d297fbfc-df5c-4365-a668-d04f87df845d" path="/var/lib/kubelet/pods/d297fbfc-df5c-4365-a668-d04f87df845d/volumes" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.155083 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.171594 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.191282 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 06:00:26 crc kubenswrapper[4930]: E1012 06:00:26.191709 4930 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b83c8650-6361-42bb-8112-0449e6722a3f" containerName="glance-httpd" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.191721 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="b83c8650-6361-42bb-8112-0449e6722a3f" containerName="glance-httpd" Oct 12 06:00:26 crc kubenswrapper[4930]: E1012 06:00:26.191782 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b83c8650-6361-42bb-8112-0449e6722a3f" containerName="glance-log" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.191789 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="b83c8650-6361-42bb-8112-0449e6722a3f" containerName="glance-log" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.191886 4930 scope.go:117] "RemoveContainer" containerID="15e775931c35b2e9b09753f036a874b4d26c758652a7511acfa8b52dcdbd6e9d" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.192283 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="b83c8650-6361-42bb-8112-0449e6722a3f" containerName="glance-log" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.192315 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="b83c8650-6361-42bb-8112-0449e6722a3f" containerName="glance-httpd" Oct 12 06:00:26 crc kubenswrapper[4930]: E1012 06:00:26.193145 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15e775931c35b2e9b09753f036a874b4d26c758652a7511acfa8b52dcdbd6e9d\": container with ID starting with 15e775931c35b2e9b09753f036a874b4d26c758652a7511acfa8b52dcdbd6e9d not found: ID does not exist" containerID="15e775931c35b2e9b09753f036a874b4d26c758652a7511acfa8b52dcdbd6e9d" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.193188 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15e775931c35b2e9b09753f036a874b4d26c758652a7511acfa8b52dcdbd6e9d"} err="failed to get container status \"15e775931c35b2e9b09753f036a874b4d26c758652a7511acfa8b52dcdbd6e9d\": rpc error: code = NotFound desc = could not find container \"15e775931c35b2e9b09753f036a874b4d26c758652a7511acfa8b52dcdbd6e9d\": container with ID starting with 15e775931c35b2e9b09753f036a874b4d26c758652a7511acfa8b52dcdbd6e9d not found: ID does not exist" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.193232 4930 scope.go:117] "RemoveContainer" containerID="ae9ee27bdd88c79f2fa0d15b36520ba143be88fc092cd54c9e17f9cf8674b5e8" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.193501 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: E1012 06:00:26.193596 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae9ee27bdd88c79f2fa0d15b36520ba143be88fc092cd54c9e17f9cf8674b5e8\": container with ID starting with ae9ee27bdd88c79f2fa0d15b36520ba143be88fc092cd54c9e17f9cf8674b5e8 not found: ID does not exist" containerID="ae9ee27bdd88c79f2fa0d15b36520ba143be88fc092cd54c9e17f9cf8674b5e8" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.193628 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae9ee27bdd88c79f2fa0d15b36520ba143be88fc092cd54c9e17f9cf8674b5e8"} err="failed to get container status \"ae9ee27bdd88c79f2fa0d15b36520ba143be88fc092cd54c9e17f9cf8674b5e8\": rpc error: code = NotFound desc = could not find container \"ae9ee27bdd88c79f2fa0d15b36520ba143be88fc092cd54c9e17f9cf8674b5e8\": container with ID starting with ae9ee27bdd88c79f2fa0d15b36520ba143be88fc092cd54c9e17f9cf8674b5e8 not found: ID does not exist" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.195420 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.195576 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.218059 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.236478 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.364898 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqfz9\" (UniqueName: \"kubernetes.io/projected/6ceabefb-4c59-49ab-9ec7-cc011d6aa659-kube-api-access-hqfz9\") pod \"glance-default-internal-api-0\" (UID: \"6ceabefb-4c59-49ab-9ec7-cc011d6aa659\") " pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.365001 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ceabefb-4c59-49ab-9ec7-cc011d6aa659-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6ceabefb-4c59-49ab-9ec7-cc011d6aa659\") " pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.365059 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceabefb-4c59-49ab-9ec7-cc011d6aa659-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6ceabefb-4c59-49ab-9ec7-cc011d6aa659\") " pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.365134 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ceabefb-4c59-49ab-9ec7-cc011d6aa659-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6ceabefb-4c59-49ab-9ec7-cc011d6aa659\") " pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.365155 4930 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ceabefb-4c59-49ab-9ec7-cc011d6aa659-logs\") pod \"glance-default-internal-api-0\" (UID: \"6ceabefb-4c59-49ab-9ec7-cc011d6aa659\") " pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.365192 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"6ceabefb-4c59-49ab-9ec7-cc011d6aa659\") " pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.365228 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ceabefb-4c59-49ab-9ec7-cc011d6aa659-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6ceabefb-4c59-49ab-9ec7-cc011d6aa659\") " pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.365296 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ceabefb-4c59-49ab-9ec7-cc011d6aa659-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6ceabefb-4c59-49ab-9ec7-cc011d6aa659\") " pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.369295 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.467053 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ceabefb-4c59-49ab-9ec7-cc011d6aa659-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6ceabefb-4c59-49ab-9ec7-cc011d6aa659\") " pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.467109 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ceabefb-4c59-49ab-9ec7-cc011d6aa659-logs\") pod \"glance-default-internal-api-0\" (UID: \"6ceabefb-4c59-49ab-9ec7-cc011d6aa659\") " pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.467176 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"6ceabefb-4c59-49ab-9ec7-cc011d6aa659\") " pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.467229 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ceabefb-4c59-49ab-9ec7-cc011d6aa659-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6ceabefb-4c59-49ab-9ec7-cc011d6aa659\") " pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.467322 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ceabefb-4c59-49ab-9ec7-cc011d6aa659-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6ceabefb-4c59-49ab-9ec7-cc011d6aa659\") " pod="openstack/glance-default-internal-api-0" Oct 12 
06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.467386 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqfz9\" (UniqueName: \"kubernetes.io/projected/6ceabefb-4c59-49ab-9ec7-cc011d6aa659-kube-api-access-hqfz9\") pod \"glance-default-internal-api-0\" (UID: \"6ceabefb-4c59-49ab-9ec7-cc011d6aa659\") " pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.467429 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ceabefb-4c59-49ab-9ec7-cc011d6aa659-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6ceabefb-4c59-49ab-9ec7-cc011d6aa659\") " pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.467458 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceabefb-4c59-49ab-9ec7-cc011d6aa659-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6ceabefb-4c59-49ab-9ec7-cc011d6aa659\") " pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.467525 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ceabefb-4c59-49ab-9ec7-cc011d6aa659-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6ceabefb-4c59-49ab-9ec7-cc011d6aa659\") " pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.467950 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ceabefb-4c59-49ab-9ec7-cc011d6aa659-logs\") pod \"glance-default-internal-api-0\" (UID: \"6ceabefb-4c59-49ab-9ec7-cc011d6aa659\") " pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.468178 4930 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"6ceabefb-4c59-49ab-9ec7-cc011d6aa659\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.481216 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceabefb-4c59-49ab-9ec7-cc011d6aa659-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6ceabefb-4c59-49ab-9ec7-cc011d6aa659\") " pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.482179 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ceabefb-4c59-49ab-9ec7-cc011d6aa659-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6ceabefb-4c59-49ab-9ec7-cc011d6aa659\") " pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.483784 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ceabefb-4c59-49ab-9ec7-cc011d6aa659-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6ceabefb-4c59-49ab-9ec7-cc011d6aa659\") " pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.484344 4930 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ceabefb-4c59-49ab-9ec7-cc011d6aa659-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6ceabefb-4c59-49ab-9ec7-cc011d6aa659\") " pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.484918 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqfz9\" (UniqueName: \"kubernetes.io/projected/6ceabefb-4c59-49ab-9ec7-cc011d6aa659-kube-api-access-hqfz9\") pod \"glance-default-internal-api-0\" (UID: \"6ceabefb-4c59-49ab-9ec7-cc011d6aa659\") " pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.499692 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"6ceabefb-4c59-49ab-9ec7-cc011d6aa659\") " pod="openstack/glance-default-internal-api-0" Oct 12 06:00:26 crc kubenswrapper[4930]: I1012 06:00:26.571454 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 12 06:00:27 crc kubenswrapper[4930]: I1012 06:00:27.149970 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"00546299-d7c8-4536-9059-85a75dc5824e","Type":"ContainerStarted","Data":"2ad10f3859df41bd6cbc2ad367dd48d7e43496a2a81d2d03939aef767edecc5b"} Oct 12 06:00:27 crc kubenswrapper[4930]: I1012 06:00:27.150358 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"00546299-d7c8-4536-9059-85a75dc5824e","Type":"ContainerStarted","Data":"c2836f08947c8dda8dc76c326272ccc200a40aaf333645bebdcab1976127e0a3"} Oct 12 06:00:27 crc kubenswrapper[4930]: I1012 06:00:27.206251 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d","Type":"ContainerStarted","Data":"3d82fd5a554ad135ca998d61001cce4621f35a4a1c48700786346e527a09e4e5"} Oct 12 06:00:27 crc kubenswrapper[4930]: I1012 06:00:27.206288 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d","Type":"ContainerStarted","Data":"80b8ed0425767142610b82842673ef1be948c1170575f68183cf8f5a3095d0b6"} Oct 12 06:00:27 crc kubenswrapper[4930]: I1012 06:00:27.206296 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d","Type":"ContainerStarted","Data":"7e377429aae2ab465dbde1db528e0e8cb38661bbe205f79690e4f115165b9459"} Oct 12 06:00:27 crc kubenswrapper[4930]: I1012 06:00:27.328092 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 12 06:00:28 crc kubenswrapper[4930]: I1012 06:00:28.157418 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b83c8650-6361-42bb-8112-0449e6722a3f" path="/var/lib/kubelet/pods/b83c8650-6361-42bb-8112-0449e6722a3f/volumes" Oct 12 06:00:28 crc kubenswrapper[4930]: I1012 06:00:28.222954 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d","Type":"ContainerStarted","Data":"cbb29a1e46d4bb476e24d4e7c217fa7a3b2e238b4211bb46f9a3cff524eea8fa"} Oct 12 06:00:28 crc kubenswrapper[4930]: I1012 06:00:28.225231 4930 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"00546299-d7c8-4536-9059-85a75dc5824e","Type":"ContainerStarted","Data":"acdc36de771746107bbc59ac707ae22094963391d16657ec1cacddc49cb8995e"} Oct 12 06:00:28 crc kubenswrapper[4930]: I1012 06:00:28.237034 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6ceabefb-4c59-49ab-9ec7-cc011d6aa659","Type":"ContainerStarted","Data":"619bcf89a72dd41f4602d2793bf77720b4f76dd53cf0157d90e5f8bb11bf2f3d"} Oct 12 06:00:28 crc kubenswrapper[4930]: I1012 06:00:28.237123 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6ceabefb-4c59-49ab-9ec7-cc011d6aa659","Type":"ContainerStarted","Data":"d76f7655e370a9886292e345ae2bcf2a9df27b2b04c37dacb611c2a148306228"} Oct 12 06:00:28 crc kubenswrapper[4930]: I1012 06:00:28.293967 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.293948997 podStartE2EDuration="3.293948997s" podCreationTimestamp="2025-10-12 06:00:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:00:28.272164075 +0000 UTC m=+1160.814265850" watchObservedRunningTime="2025-10-12 06:00:28.293948997 +0000 UTC m=+1160.836050762" Oct 12 06:00:29 crc kubenswrapper[4930]: I1012 06:00:29.249959 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d","Type":"ContainerStarted","Data":"9cb158000845c265b2a30e14aecf239dd49f610ffb1471c21bceed7353e6d33a"} Oct 12 06:00:29 crc kubenswrapper[4930]: I1012 06:00:29.251866 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 12 06:00:29 crc kubenswrapper[4930]: I1012 06:00:29.253927 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6ceabefb-4c59-49ab-9ec7-cc011d6aa659","Type":"ContainerStarted","Data":"ca97c7f46a515e92a95353a8c329930c2cf63692c0c2cbabb8efea1e9dea571c"} Oct 12 06:00:29 crc kubenswrapper[4930]: I1012 06:00:29.278932 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.5059356780000002 podStartE2EDuration="4.278912069s" podCreationTimestamp="2025-10-12 06:00:25 +0000 UTC" firstStartedPulling="2025-10-12 06:00:26.211357195 +0000 UTC m=+1158.753458960" lastFinishedPulling="2025-10-12 06:00:28.984333586 +0000 UTC m=+1161.526435351" observedRunningTime="2025-10-12 06:00:29.273429543 +0000 UTC m=+1161.815531308" watchObservedRunningTime="2025-10-12 06:00:29.278912069 +0000 UTC m=+1161.821013834" Oct 12 06:00:29 crc kubenswrapper[4930]: I1012 06:00:29.296995 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.296978019 podStartE2EDuration="3.296978019s" podCreationTimestamp="2025-10-12 06:00:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:00:29.289903453 +0000 UTC m=+1161.832005218" watchObservedRunningTime="2025-10-12 06:00:29.296978019 +0000 UTC m=+1161.839079784" Oct 12 06:00:29 crc kubenswrapper[4930]: I1012 06:00:29.315936 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Oct 12 06:00:29 crc kubenswrapper[4930]: I1012 06:00:29.458094 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 12 06:00:30 crc kubenswrapper[4930]: I1012 06:00:30.485155 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-9k65c"] Oct 12 06:00:30 crc kubenswrapper[4930]: I1012 06:00:30.486494 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9k65c" Oct 12 06:00:30 crc kubenswrapper[4930]: I1012 06:00:30.511791 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9k65c"] Oct 12 06:00:30 crc kubenswrapper[4930]: I1012 06:00:30.549682 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56wtm\" (UniqueName: \"kubernetes.io/projected/234f0277-c632-443a-a332-07699c63b28a-kube-api-access-56wtm\") pod \"nova-api-db-create-9k65c\" (UID: \"234f0277-c632-443a-a332-07699c63b28a\") " pod="openstack/nova-api-db-create-9k65c" Oct 12 06:00:30 crc kubenswrapper[4930]: I1012 06:00:30.586467 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-fg84h"] Oct 12 06:00:30 crc kubenswrapper[4930]: I1012 06:00:30.588214 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fg84h" Oct 12 06:00:30 crc kubenswrapper[4930]: I1012 06:00:30.601087 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-fg84h"] Oct 12 06:00:30 crc kubenswrapper[4930]: I1012 06:00:30.662019 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56wtm\" (UniqueName: \"kubernetes.io/projected/234f0277-c632-443a-a332-07699c63b28a-kube-api-access-56wtm\") pod \"nova-api-db-create-9k65c\" (UID: \"234f0277-c632-443a-a332-07699c63b28a\") " pod="openstack/nova-api-db-create-9k65c" Oct 12 06:00:30 crc kubenswrapper[4930]: I1012 06:00:30.662886 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgdvs\" (UniqueName: \"kubernetes.io/projected/f73d1a72-3045-4bf4-b851-b52907101d74-kube-api-access-fgdvs\") pod \"nova-cell0-db-create-fg84h\" (UID: \"f73d1a72-3045-4bf4-b851-b52907101d74\") " pod="openstack/nova-cell0-db-create-fg84h" Oct 12 06:00:30 crc kubenswrapper[4930]: I1012 06:00:30.687759 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-94srd"] Oct 12 06:00:30 crc kubenswrapper[4930]: I1012 06:00:30.688992 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-94srd" Oct 12 06:00:30 crc kubenswrapper[4930]: I1012 06:00:30.694797 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-94srd"] Oct 12 06:00:30 crc kubenswrapper[4930]: I1012 06:00:30.696487 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56wtm\" (UniqueName: \"kubernetes.io/projected/234f0277-c632-443a-a332-07699c63b28a-kube-api-access-56wtm\") pod \"nova-api-db-create-9k65c\" (UID: \"234f0277-c632-443a-a332-07699c63b28a\") " pod="openstack/nova-api-db-create-9k65c" Oct 12 06:00:30 crc kubenswrapper[4930]: I1012 06:00:30.765205 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgdvs\" (UniqueName: \"kubernetes.io/projected/f73d1a72-3045-4bf4-b851-b52907101d74-kube-api-access-fgdvs\") pod \"nova-cell0-db-create-fg84h\" (UID: \"f73d1a72-3045-4bf4-b851-b52907101d74\") " pod="openstack/nova-cell0-db-create-fg84h" Oct 12 06:00:30 crc kubenswrapper[4930]: I1012 06:00:30.765249 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkdx2\" (UniqueName: \"kubernetes.io/projected/6af5c93e-5d4d-4176-9cf0-fcb5197c7774-kube-api-access-dkdx2\") pod \"nova-cell1-db-create-94srd\" (UID: \"6af5c93e-5d4d-4176-9cf0-fcb5197c7774\") " pod="openstack/nova-cell1-db-create-94srd" Oct 12 06:00:30 crc kubenswrapper[4930]: I1012 06:00:30.781593 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgdvs\" (UniqueName: \"kubernetes.io/projected/f73d1a72-3045-4bf4-b851-b52907101d74-kube-api-access-fgdvs\") pod \"nova-cell0-db-create-fg84h\" (UID: \"f73d1a72-3045-4bf4-b851-b52907101d74\") " pod="openstack/nova-cell0-db-create-fg84h" Oct 12 06:00:30 crc kubenswrapper[4930]: I1012 06:00:30.805620 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9k65c" Oct 12 06:00:30 crc kubenswrapper[4930]: I1012 06:00:30.866765 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkdx2\" (UniqueName: \"kubernetes.io/projected/6af5c93e-5d4d-4176-9cf0-fcb5197c7774-kube-api-access-dkdx2\") pod \"nova-cell1-db-create-94srd\" (UID: \"6af5c93e-5d4d-4176-9cf0-fcb5197c7774\") " pod="openstack/nova-cell1-db-create-94srd" Oct 12 06:00:30 crc kubenswrapper[4930]: I1012 06:00:30.882467 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkdx2\" (UniqueName: \"kubernetes.io/projected/6af5c93e-5d4d-4176-9cf0-fcb5197c7774-kube-api-access-dkdx2\") pod \"nova-cell1-db-create-94srd\" (UID: \"6af5c93e-5d4d-4176-9cf0-fcb5197c7774\") " pod="openstack/nova-cell1-db-create-94srd" Oct 12 06:00:30 crc kubenswrapper[4930]: I1012 06:00:30.918062 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fg84h" Oct 12 06:00:31 crc kubenswrapper[4930]: I1012 06:00:31.057219 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-94srd" Oct 12 06:00:31 crc kubenswrapper[4930]: I1012 06:00:31.269024 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" containerName="ceilometer-central-agent" containerID="cri-o://80b8ed0425767142610b82842673ef1be948c1170575f68183cf8f5a3095d0b6" gracePeriod=30 Oct 12 06:00:31 crc kubenswrapper[4930]: I1012 06:00:31.269708 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" containerName="proxy-httpd" containerID="cri-o://9cb158000845c265b2a30e14aecf239dd49f610ffb1471c21bceed7353e6d33a" gracePeriod=30 Oct 12 06:00:31 crc kubenswrapper[4930]: I1012 06:00:31.269767 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" containerName="sg-core" containerID="cri-o://cbb29a1e46d4bb476e24d4e7c217fa7a3b2e238b4211bb46f9a3cff524eea8fa" gracePeriod=30 Oct 12 06:00:31 crc kubenswrapper[4930]: I1012 06:00:31.269799 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" containerName="ceilometer-notification-agent" containerID="cri-o://3d82fd5a554ad135ca998d61001cce4621f35a4a1c48700786346e527a09e4e5" gracePeriod=30 Oct 12 06:00:31 crc kubenswrapper[4930]: I1012 06:00:31.335331 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9k65c"] Oct 12 06:00:31 crc kubenswrapper[4930]: W1012 06:00:31.336618 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod234f0277_c632_443a_a332_07699c63b28a.slice/crio-7a78fb0f78d9ee4850936a04f035ef9798799bf61c227d5035c5d6a0dda49c4c WatchSource:0}: Error finding container 7a78fb0f78d9ee4850936a04f035ef9798799bf61c227d5035c5d6a0dda49c4c: Status 404 returned error can't find the container with id 7a78fb0f78d9ee4850936a04f035ef9798799bf61c227d5035c5d6a0dda49c4c Oct 12 06:00:31 crc kubenswrapper[4930]: I1012 06:00:31.463012 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-fg84h"] Oct 12 06:00:31 crc kubenswrapper[4930]: W1012 06:00:31.466885 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf73d1a72_3045_4bf4_b851_b52907101d74.slice/crio-d38c1995c1c011041d953a1e6ccc46d6b1c1bf375169d73eea29a75e5b0b0bb2 WatchSource:0}: Error finding container d38c1995c1c011041d953a1e6ccc46d6b1c1bf375169d73eea29a75e5b0b0bb2: Status 404 returned error can't find the container with id d38c1995c1c011041d953a1e6ccc46d6b1c1bf375169d73eea29a75e5b0b0bb2 Oct 12 06:00:31 crc kubenswrapper[4930]: I1012 06:00:31.628063 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-94srd"] Oct 12 06:00:31 crc kubenswrapper[4930]: W1012 06:00:31.662631 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6af5c93e_5d4d_4176_9cf0_fcb5197c7774.slice/crio-c034090bd316bb7e5b95bdd73c1d3c8f29f6bda3c725b3469455210e29027967 WatchSource:0}: Error finding container c034090bd316bb7e5b95bdd73c1d3c8f29f6bda3c725b3469455210e29027967: Status 404 returned error can't find the container with id 
c034090bd316bb7e5b95bdd73c1d3c8f29f6bda3c725b3469455210e29027967 Oct 12 06:00:32 crc kubenswrapper[4930]: I1012 06:00:32.289503 4930 generic.go:334] "Generic (PLEG): container finished" podID="6af5c93e-5d4d-4176-9cf0-fcb5197c7774" containerID="d267b4b820256d8a34e75b0920fde5a6f3d90bded0c05c4e00ffd803bfd2294b" exitCode=0 Oct 12 06:00:32 crc kubenswrapper[4930]: I1012 06:00:32.289606 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-94srd" event={"ID":"6af5c93e-5d4d-4176-9cf0-fcb5197c7774","Type":"ContainerDied","Data":"d267b4b820256d8a34e75b0920fde5a6f3d90bded0c05c4e00ffd803bfd2294b"} Oct 12 06:00:32 crc kubenswrapper[4930]: I1012 06:00:32.289649 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-94srd" event={"ID":"6af5c93e-5d4d-4176-9cf0-fcb5197c7774","Type":"ContainerStarted","Data":"c034090bd316bb7e5b95bdd73c1d3c8f29f6bda3c725b3469455210e29027967"} Oct 12 06:00:32 crc kubenswrapper[4930]: I1012 06:00:32.291907 4930 generic.go:334] "Generic (PLEG): container finished" podID="f73d1a72-3045-4bf4-b851-b52907101d74" containerID="395c709050f36df78f829eedf3e2abbc3b33739a508462fafa1f64340258c9f0" exitCode=0 Oct 12 06:00:32 crc kubenswrapper[4930]: I1012 06:00:32.292121 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fg84h" event={"ID":"f73d1a72-3045-4bf4-b851-b52907101d74","Type":"ContainerDied","Data":"395c709050f36df78f829eedf3e2abbc3b33739a508462fafa1f64340258c9f0"} Oct 12 06:00:32 crc kubenswrapper[4930]: I1012 06:00:32.292182 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fg84h" event={"ID":"f73d1a72-3045-4bf4-b851-b52907101d74","Type":"ContainerStarted","Data":"d38c1995c1c011041d953a1e6ccc46d6b1c1bf375169d73eea29a75e5b0b0bb2"} Oct 12 06:00:32 crc kubenswrapper[4930]: I1012 06:00:32.300577 4930 generic.go:334] "Generic (PLEG): container finished" podID="b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" containerID="9cb158000845c265b2a30e14aecf239dd49f610ffb1471c21bceed7353e6d33a" exitCode=0 Oct 12 06:00:32 crc kubenswrapper[4930]: I1012 06:00:32.300609 4930 generic.go:334] "Generic (PLEG): container finished" podID="b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" containerID="cbb29a1e46d4bb476e24d4e7c217fa7a3b2e238b4211bb46f9a3cff524eea8fa" exitCode=2 Oct 12 06:00:32 crc kubenswrapper[4930]: I1012 06:00:32.300617 4930 generic.go:334] "Generic (PLEG): container finished" podID="b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" containerID="3d82fd5a554ad135ca998d61001cce4621f35a4a1c48700786346e527a09e4e5" exitCode=0 Oct 12 06:00:32 crc kubenswrapper[4930]: I1012 06:00:32.300656 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d","Type":"ContainerDied","Data":"9cb158000845c265b2a30e14aecf239dd49f610ffb1471c21bceed7353e6d33a"} Oct 12 06:00:32 crc kubenswrapper[4930]: I1012 06:00:32.300692 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d","Type":"ContainerDied","Data":"cbb29a1e46d4bb476e24d4e7c217fa7a3b2e238b4211bb46f9a3cff524eea8fa"} Oct 12 06:00:32 crc kubenswrapper[4930]: I1012 06:00:32.300702 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d","Type":"ContainerDied","Data":"3d82fd5a554ad135ca998d61001cce4621f35a4a1c48700786346e527a09e4e5"} Oct 12 06:00:32 crc kubenswrapper[4930]: I1012 06:00:32.303587 4930 
generic.go:334] "Generic (PLEG): container finished" podID="234f0277-c632-443a-a332-07699c63b28a" containerID="0242ef56b3fb8e9f247c830cac2b07fb839a01174f1f165da28a219d25ed243d" exitCode=0 Oct 12 06:00:32 crc kubenswrapper[4930]: I1012 06:00:32.303644 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9k65c" event={"ID":"234f0277-c632-443a-a332-07699c63b28a","Type":"ContainerDied","Data":"0242ef56b3fb8e9f247c830cac2b07fb839a01174f1f165da28a219d25ed243d"} Oct 12 06:00:32 crc kubenswrapper[4930]: I1012 06:00:32.303671 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9k65c" event={"ID":"234f0277-c632-443a-a332-07699c63b28a","Type":"ContainerStarted","Data":"7a78fb0f78d9ee4850936a04f035ef9798799bf61c227d5035c5d6a0dda49c4c"} Oct 12 06:00:33 crc kubenswrapper[4930]: I1012 06:00:33.669396 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:00:33 crc kubenswrapper[4930]: I1012 06:00:33.669871 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:00:33 crc kubenswrapper[4930]: I1012 06:00:33.686224 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 12 06:00:33 crc kubenswrapper[4930]: I1012 06:00:33.716289 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 12 06:00:33 crc kubenswrapper[4930]: I1012 06:00:33.899562 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-94srd" Oct 12 06:00:33 crc kubenswrapper[4930]: I1012 06:00:33.904949 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fg84h" Oct 12 06:00:33 crc kubenswrapper[4930]: I1012 06:00:33.912954 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-9k65c"
Oct 12 06:00:34 crc kubenswrapper[4930]: I1012 06:00:34.029262 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkdx2\" (UniqueName: \"kubernetes.io/projected/6af5c93e-5d4d-4176-9cf0-fcb5197c7774-kube-api-access-dkdx2\") pod \"6af5c93e-5d4d-4176-9cf0-fcb5197c7774\" (UID: \"6af5c93e-5d4d-4176-9cf0-fcb5197c7774\") "
Oct 12 06:00:34 crc kubenswrapper[4930]: I1012 06:00:34.029958 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56wtm\" (UniqueName: \"kubernetes.io/projected/234f0277-c632-443a-a332-07699c63b28a-kube-api-access-56wtm\") pod \"234f0277-c632-443a-a332-07699c63b28a\" (UID: \"234f0277-c632-443a-a332-07699c63b28a\") "
Oct 12 06:00:34 crc kubenswrapper[4930]: I1012 06:00:34.030148 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgdvs\" (UniqueName: \"kubernetes.io/projected/f73d1a72-3045-4bf4-b851-b52907101d74-kube-api-access-fgdvs\") pod \"f73d1a72-3045-4bf4-b851-b52907101d74\" (UID: \"f73d1a72-3045-4bf4-b851-b52907101d74\") "
Oct 12 06:00:34 crc kubenswrapper[4930]: I1012 06:00:34.035481 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/234f0277-c632-443a-a332-07699c63b28a-kube-api-access-56wtm" (OuterVolumeSpecName: "kube-api-access-56wtm") pod "234f0277-c632-443a-a332-07699c63b28a" (UID: "234f0277-c632-443a-a332-07699c63b28a"). InnerVolumeSpecName "kube-api-access-56wtm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 06:00:34 crc kubenswrapper[4930]: I1012 06:00:34.041465 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f73d1a72-3045-4bf4-b851-b52907101d74-kube-api-access-fgdvs" (OuterVolumeSpecName: "kube-api-access-fgdvs") pod "f73d1a72-3045-4bf4-b851-b52907101d74" (UID: "f73d1a72-3045-4bf4-b851-b52907101d74"). InnerVolumeSpecName "kube-api-access-fgdvs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 06:00:34 crc kubenswrapper[4930]: I1012 06:00:34.041518 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6af5c93e-5d4d-4176-9cf0-fcb5197c7774-kube-api-access-dkdx2" (OuterVolumeSpecName: "kube-api-access-dkdx2") pod "6af5c93e-5d4d-4176-9cf0-fcb5197c7774" (UID: "6af5c93e-5d4d-4176-9cf0-fcb5197c7774"). InnerVolumeSpecName "kube-api-access-dkdx2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 06:00:34 crc kubenswrapper[4930]: I1012 06:00:34.132084 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56wtm\" (UniqueName: \"kubernetes.io/projected/234f0277-c632-443a-a332-07699c63b28a-kube-api-access-56wtm\") on node \"crc\" DevicePath \"\""
Oct 12 06:00:34 crc kubenswrapper[4930]: I1012 06:00:34.132114 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgdvs\" (UniqueName: \"kubernetes.io/projected/f73d1a72-3045-4bf4-b851-b52907101d74-kube-api-access-fgdvs\") on node \"crc\" DevicePath \"\""
Oct 12 06:00:34 crc kubenswrapper[4930]: I1012 06:00:34.132124 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkdx2\" (UniqueName: \"kubernetes.io/projected/6af5c93e-5d4d-4176-9cf0-fcb5197c7774-kube-api-access-dkdx2\") on node \"crc\" DevicePath \"\""
Oct 12 06:00:34 crc kubenswrapper[4930]: I1012 06:00:34.322934 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9k65c" event={"ID":"234f0277-c632-443a-a332-07699c63b28a","Type":"ContainerDied","Data":"7a78fb0f78d9ee4850936a04f035ef9798799bf61c227d5035c5d6a0dda49c4c"}
Oct 12 06:00:34 crc kubenswrapper[4930]: I1012 06:00:34.322976 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a78fb0f78d9ee4850936a04f035ef9798799bf61c227d5035c5d6a0dda49c4c"
Oct 12 06:00:34 crc kubenswrapper[4930]: I1012 06:00:34.322985 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9k65c"
Oct 12 06:00:34 crc kubenswrapper[4930]: I1012 06:00:34.324901 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-94srd" event={"ID":"6af5c93e-5d4d-4176-9cf0-fcb5197c7774","Type":"ContainerDied","Data":"c034090bd316bb7e5b95bdd73c1d3c8f29f6bda3c725b3469455210e29027967"}
Oct 12 06:00:34 crc kubenswrapper[4930]: I1012 06:00:34.324947 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c034090bd316bb7e5b95bdd73c1d3c8f29f6bda3c725b3469455210e29027967"
Oct 12 06:00:34 crc kubenswrapper[4930]: I1012 06:00:34.324948 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-94srd"
Oct 12 06:00:34 crc kubenswrapper[4930]: I1012 06:00:34.326913 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fg84h" event={"ID":"f73d1a72-3045-4bf4-b851-b52907101d74","Type":"ContainerDied","Data":"d38c1995c1c011041d953a1e6ccc46d6b1c1bf375169d73eea29a75e5b0b0bb2"}
Oct 12 06:00:34 crc kubenswrapper[4930]: I1012 06:00:34.326987 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d38c1995c1c011041d953a1e6ccc46d6b1c1bf375169d73eea29a75e5b0b0bb2"
Oct 12 06:00:34 crc kubenswrapper[4930]: I1012 06:00:34.327000 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fg84h"
Oct 12 06:00:34 crc kubenswrapper[4930]: I1012 06:00:34.327090 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Oct 12 06:00:34 crc kubenswrapper[4930]: I1012 06:00:34.363686 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0"
Oct 12 06:00:34 crc kubenswrapper[4930]: I1012 06:00:34.408557 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"]
Oct 12 06:00:35 crc kubenswrapper[4930]: I1012 06:00:35.665080 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Oct 12 06:00:35 crc kubenswrapper[4930]: I1012 06:00:35.665477 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Oct 12 06:00:35 crc kubenswrapper[4930]: I1012 06:00:35.719955 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Oct 12 06:00:35 crc kubenswrapper[4930]: I1012 06:00:35.724998 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Oct 12 06:00:36 crc kubenswrapper[4930]: I1012 06:00:36.351877 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="59f4bae4-1a84-449a-be72-e735294116e6" containerName="watcher-decision-engine" containerID="cri-o://d3276e810538259cb3294c616a6c0101e64806aa3e935bc18563a7a3f2b3591c" gracePeriod=30
Oct 12 06:00:36 crc kubenswrapper[4930]: I1012 06:00:36.352455 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Oct 12 06:00:36 crc kubenswrapper[4930]: I1012 06:00:36.352499 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Oct 12 06:00:36 crc kubenswrapper[4930]: I1012 06:00:36.573022 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Oct 12 06:00:36 crc kubenswrapper[4930]: I1012 06:00:36.573088 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Oct 12 06:00:36 crc kubenswrapper[4930]: I1012 06:00:36.641447 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Oct 12 06:00:36 crc kubenswrapper[4930]: I1012 06:00:36.654263 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Oct 12 06:00:37 crc kubenswrapper[4930]: I1012 06:00:37.365937 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Oct 12 06:00:37 crc kubenswrapper[4930]: I1012 06:00:37.366005 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Oct 12 06:00:38 crc kubenswrapper[4930]: I1012 06:00:38.096241 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 12 06:00:38 crc kubenswrapper[4930]: I1012 06:00:38.101592 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 12 06:00:38 crc kubenswrapper[4930]: I1012 06:00:38.378827 4930 generic.go:334] "Generic (PLEG): container finished" podID="b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" containerID="80b8ed0425767142610b82842673ef1be948c1170575f68183cf8f5a3095d0b6" exitCode=0
Oct 12 06:00:38 crc kubenswrapper[4930]: I1012 06:00:38.379827 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d","Type":"ContainerDied","Data":"80b8ed0425767142610b82842673ef1be948c1170575f68183cf8f5a3095d0b6"}
Oct 12 06:00:38 crc kubenswrapper[4930]: I1012 06:00:38.631048 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 12 06:00:38 crc kubenswrapper[4930]: I1012 06:00:38.646419 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbwnh\" (UniqueName: \"kubernetes.io/projected/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-kube-api-access-xbwnh\") pod \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") "
Oct 12 06:00:38 crc kubenswrapper[4930]: I1012 06:00:38.646514 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-run-httpd\") pod \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") "
Oct 12 06:00:38 crc kubenswrapper[4930]: I1012 06:00:38.646547 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-scripts\") pod \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") "
Oct 12 06:00:38 crc kubenswrapper[4930]: I1012 06:00:38.646648 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-sg-core-conf-yaml\") pod \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") "
Oct 12 06:00:38 crc kubenswrapper[4930]: I1012 06:00:38.646675 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-combined-ca-bundle\") pod \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") "
Oct 12 06:00:38 crc kubenswrapper[4930]: I1012 06:00:38.646767 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-log-httpd\") pod \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") "
Oct 12 06:00:38 crc kubenswrapper[4930]: I1012 06:00:38.646826 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-config-data\") pod \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\" (UID: \"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d\") "
Oct 12 06:00:38 crc kubenswrapper[4930]: I1012 06:00:38.647162 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" (UID: "b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 06:00:38 crc kubenswrapper[4930]: I1012 06:00:38.647257 4930 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 12 06:00:38 crc kubenswrapper[4930]: I1012 06:00:38.647434 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" (UID: "b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 06:00:38 crc kubenswrapper[4930]: I1012 06:00:38.652729 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-scripts" (OuterVolumeSpecName: "scripts") pod "b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" (UID: "b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:00:38 crc kubenswrapper[4930]: I1012 06:00:38.657891 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-kube-api-access-xbwnh" (OuterVolumeSpecName: "kube-api-access-xbwnh") pod "b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" (UID: "b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d"). InnerVolumeSpecName "kube-api-access-xbwnh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 06:00:38 crc kubenswrapper[4930]: I1012 06:00:38.753884 4930 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 12 06:00:38 crc kubenswrapper[4930]: I1012 06:00:38.753915 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbwnh\" (UniqueName: \"kubernetes.io/projected/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-kube-api-access-xbwnh\") on node \"crc\" DevicePath \"\""
Oct 12 06:00:38 crc kubenswrapper[4930]: I1012 06:00:38.753926 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-scripts\") on node \"crc\" DevicePath \"\""
Oct 12 06:00:38 crc kubenswrapper[4930]: I1012 06:00:38.760124 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" (UID: "b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:00:38 crc kubenswrapper[4930]: I1012 06:00:38.871277 4930 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 12 06:00:38 crc kubenswrapper[4930]: I1012 06:00:38.940868 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" (UID: "b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:00:38 crc kubenswrapper[4930]: I1012 06:00:38.973474 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 12 06:00:38 crc kubenswrapper[4930]: I1012 06:00:38.976348 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-config-data" (OuterVolumeSpecName: "config-data") pod "b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" (UID: "b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.075313 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d-config-data\") on node \"crc\" DevicePath \"\""
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.398617 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.398638 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d","Type":"ContainerDied","Data":"7e377429aae2ab465dbde1db528e0e8cb38661bbe205f79690e4f115165b9459"}
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.398721 4930 scope.go:117] "RemoveContainer" containerID="9cb158000845c265b2a30e14aecf239dd49f610ffb1471c21bceed7353e6d33a"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.413369 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.413489 4930 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.425166 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.459049 4930 scope.go:117] "RemoveContainer" containerID="cbb29a1e46d4bb476e24d4e7c217fa7a3b2e238b4211bb46f9a3cff524eea8fa"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.499138 4930 scope.go:117] "RemoveContainer" containerID="3d82fd5a554ad135ca998d61001cce4621f35a4a1c48700786346e527a09e4e5"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.522191 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.535970 4930 scope.go:117] "RemoveContainer" containerID="80b8ed0425767142610b82842673ef1be948c1170575f68183cf8f5a3095d0b6"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.551454 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.560795 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 12 06:00:39 crc kubenswrapper[4930]: E1012 06:00:39.561304 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" containerName="sg-core"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.561325 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" containerName="sg-core"
Oct 12 06:00:39 crc kubenswrapper[4930]: E1012 06:00:39.561347 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" containerName="proxy-httpd"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.561356 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" containerName="proxy-httpd"
Oct 12 06:00:39 crc kubenswrapper[4930]: E1012 06:00:39.561382 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" containerName="ceilometer-notification-agent"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.561391 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" containerName="ceilometer-notification-agent"
Oct 12 06:00:39 crc kubenswrapper[4930]: E1012 06:00:39.561412 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6af5c93e-5d4d-4176-9cf0-fcb5197c7774" containerName="mariadb-database-create"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.561421 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="6af5c93e-5d4d-4176-9cf0-fcb5197c7774" containerName="mariadb-database-create"
Oct 12 06:00:39 crc kubenswrapper[4930]: E1012 06:00:39.561440 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="234f0277-c632-443a-a332-07699c63b28a" containerName="mariadb-database-create"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.561448 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="234f0277-c632-443a-a332-07699c63b28a" containerName="mariadb-database-create"
Oct 12 06:00:39 crc kubenswrapper[4930]: E1012 06:00:39.561465 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f73d1a72-3045-4bf4-b851-b52907101d74" containerName="mariadb-database-create"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.561473 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="f73d1a72-3045-4bf4-b851-b52907101d74" containerName="mariadb-database-create"
Oct 12 06:00:39 crc kubenswrapper[4930]: E1012 06:00:39.561491 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" containerName="ceilometer-central-agent"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.561499 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" containerName="ceilometer-central-agent"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.561762 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="6af5c93e-5d4d-4176-9cf0-fcb5197c7774" containerName="mariadb-database-create"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.561789 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="f73d1a72-3045-4bf4-b851-b52907101d74" containerName="mariadb-database-create"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.561802 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="234f0277-c632-443a-a332-07699c63b28a" containerName="mariadb-database-create"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.561820 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" containerName="sg-core"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.561836 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" containerName="ceilometer-central-agent"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.561857 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" containerName="ceilometer-notification-agent"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.561872 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" containerName="proxy-httpd"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.568321 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.574792 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.575161 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.603356 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.694438 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3697520-ef2b-4b11-80a2-f47ca1de765b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " pod="openstack/ceilometer-0"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.694480 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3697520-ef2b-4b11-80a2-f47ca1de765b-log-httpd\") pod \"ceilometer-0\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " pod="openstack/ceilometer-0"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.694551 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thmdt\" (UniqueName: \"kubernetes.io/projected/a3697520-ef2b-4b11-80a2-f47ca1de765b-kube-api-access-thmdt\") pod \"ceilometer-0\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " pod="openstack/ceilometer-0"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.694600 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3697520-ef2b-4b11-80a2-f47ca1de765b-config-data\") pod \"ceilometer-0\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " pod="openstack/ceilometer-0"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.694618 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3697520-ef2b-4b11-80a2-f47ca1de765b-scripts\") pod \"ceilometer-0\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " pod="openstack/ceilometer-0"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.694635 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3697520-ef2b-4b11-80a2-f47ca1de765b-run-httpd\") pod \"ceilometer-0\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " pod="openstack/ceilometer-0"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.694700 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3697520-ef2b-4b11-80a2-f47ca1de765b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " pod="openstack/ceilometer-0"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.796942 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thmdt\" (UniqueName: \"kubernetes.io/projected/a3697520-ef2b-4b11-80a2-f47ca1de765b-kube-api-access-thmdt\") pod \"ceilometer-0\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " pod="openstack/ceilometer-0"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.797050 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3697520-ef2b-4b11-80a2-f47ca1de765b-config-data\") pod \"ceilometer-0\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " pod="openstack/ceilometer-0"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.797110 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3697520-ef2b-4b11-80a2-f47ca1de765b-scripts\") pod \"ceilometer-0\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " pod="openstack/ceilometer-0"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.797172 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3697520-ef2b-4b11-80a2-f47ca1de765b-run-httpd\") pod \"ceilometer-0\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " pod="openstack/ceilometer-0"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.797996 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3697520-ef2b-4b11-80a2-f47ca1de765b-run-httpd\") pod \"ceilometer-0\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " pod="openstack/ceilometer-0"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.797728 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3697520-ef2b-4b11-80a2-f47ca1de765b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " pod="openstack/ceilometer-0"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.799530 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3697520-ef2b-4b11-80a2-f47ca1de765b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " pod="openstack/ceilometer-0"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.799573 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3697520-ef2b-4b11-80a2-f47ca1de765b-log-httpd\") pod \"ceilometer-0\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " pod="openstack/ceilometer-0"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.800901 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3697520-ef2b-4b11-80a2-f47ca1de765b-log-httpd\") pod \"ceilometer-0\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " pod="openstack/ceilometer-0"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.803822 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3697520-ef2b-4b11-80a2-f47ca1de765b-scripts\") pod \"ceilometer-0\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " pod="openstack/ceilometer-0"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.805165 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3697520-ef2b-4b11-80a2-f47ca1de765b-config-data\") pod \"ceilometer-0\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " pod="openstack/ceilometer-0"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.814129 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3697520-ef2b-4b11-80a2-f47ca1de765b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " pod="openstack/ceilometer-0"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.814935 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3697520-ef2b-4b11-80a2-f47ca1de765b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " pod="openstack/ceilometer-0"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.817786 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thmdt\" (UniqueName: \"kubernetes.io/projected/a3697520-ef2b-4b11-80a2-f47ca1de765b-kube-api-access-thmdt\") pod \"ceilometer-0\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " pod="openstack/ceilometer-0"
Oct 12 06:00:39 crc kubenswrapper[4930]: I1012 06:00:39.924495 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 12 06:00:40 crc kubenswrapper[4930]: I1012 06:00:40.158301 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d" path="/var/lib/kubelet/pods/b125f8ab-9de8-4dae-bc0e-f0c23ae9d07d/volumes"
Oct 12 06:00:40 crc kubenswrapper[4930]: I1012 06:00:40.454951 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 12 06:00:40 crc kubenswrapper[4930]: I1012 06:00:40.578650 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-9054-account-create-rr9h4"]
Oct 12 06:00:40 crc kubenswrapper[4930]: I1012 06:00:40.579898 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9054-account-create-rr9h4"
Oct 12 06:00:40 crc kubenswrapper[4930]: I1012 06:00:40.584865 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Oct 12 06:00:40 crc kubenswrapper[4930]: I1012 06:00:40.590856 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-9054-account-create-rr9h4"]
Oct 12 06:00:40 crc kubenswrapper[4930]: I1012 06:00:40.679211 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-57c6-account-create-26nks"]
Oct 12 06:00:40 crc kubenswrapper[4930]: I1012 06:00:40.680721 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-57c6-account-create-26nks"
Oct 12 06:00:40 crc kubenswrapper[4930]: I1012 06:00:40.684244 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Oct 12 06:00:40 crc kubenswrapper[4930]: I1012 06:00:40.714768 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-57c6-account-create-26nks"]
Oct 12 06:00:40 crc kubenswrapper[4930]: I1012 06:00:40.719456 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f87h\" (UniqueName: \"kubernetes.io/projected/74e2cfe9-e552-4e89-b0dc-7af425310799-kube-api-access-2f87h\") pod \"nova-api-9054-account-create-rr9h4\" (UID: \"74e2cfe9-e552-4e89-b0dc-7af425310799\") " pod="openstack/nova-api-9054-account-create-rr9h4"
Oct 12 06:00:40 crc kubenswrapper[4930]: I1012 06:00:40.821810 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmlvh\" (UniqueName: \"kubernetes.io/projected/656956b7-bd5b-4cab-9db0-53f9776ee51e-kube-api-access-hmlvh\") pod \"nova-cell0-57c6-account-create-26nks\" (UID: \"656956b7-bd5b-4cab-9db0-53f9776ee51e\") " pod="openstack/nova-cell0-57c6-account-create-26nks"
Oct 12 06:00:40 crc kubenswrapper[4930]: I1012 06:00:40.821877 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f87h\" (UniqueName: \"kubernetes.io/projected/74e2cfe9-e552-4e89-b0dc-7af425310799-kube-api-access-2f87h\") pod \"nova-api-9054-account-create-rr9h4\" (UID: \"74e2cfe9-e552-4e89-b0dc-7af425310799\") " pod="openstack/nova-api-9054-account-create-rr9h4"
Oct 12 06:00:40 crc kubenswrapper[4930]: I1012 06:00:40.838988 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f87h\" (UniqueName: \"kubernetes.io/projected/74e2cfe9-e552-4e89-b0dc-7af425310799-kube-api-access-2f87h\") pod \"nova-api-9054-account-create-rr9h4\" (UID: \"74e2cfe9-e552-4e89-b0dc-7af425310799\") " pod="openstack/nova-api-9054-account-create-rr9h4"
Oct 12 06:00:40 crc kubenswrapper[4930]: I1012 06:00:40.902474 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9054-account-create-rr9h4"
Oct 12 06:00:40 crc kubenswrapper[4930]: I1012 06:00:40.925118 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmlvh\" (UniqueName: \"kubernetes.io/projected/656956b7-bd5b-4cab-9db0-53f9776ee51e-kube-api-access-hmlvh\") pod \"nova-cell0-57c6-account-create-26nks\" (UID: \"656956b7-bd5b-4cab-9db0-53f9776ee51e\") " pod="openstack/nova-cell0-57c6-account-create-26nks"
Oct 12 06:00:40 crc kubenswrapper[4930]: I1012 06:00:40.946426 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmlvh\" (UniqueName: \"kubernetes.io/projected/656956b7-bd5b-4cab-9db0-53f9776ee51e-kube-api-access-hmlvh\") pod \"nova-cell0-57c6-account-create-26nks\" (UID: \"656956b7-bd5b-4cab-9db0-53f9776ee51e\") " pod="openstack/nova-cell0-57c6-account-create-26nks"
Oct 12 06:00:40 crc kubenswrapper[4930]: I1012 06:00:40.982553 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-0a55-account-create-jjl4h"]
Oct 12 06:00:40 crc kubenswrapper[4930]: I1012 06:00:40.984091 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0a55-account-create-jjl4h"
Oct 12 06:00:40 crc kubenswrapper[4930]: I1012 06:00:40.986798 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Oct 12 06:00:41 crc kubenswrapper[4930]: I1012 06:00:41.013538 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0a55-account-create-jjl4h"]
Oct 12 06:00:41 crc kubenswrapper[4930]: I1012 06:00:41.131286 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86tcw\" (UniqueName: \"kubernetes.io/projected/e74a93bc-5b73-4e2b-815a-14398881caf5-kube-api-access-86tcw\") pod \"nova-cell1-0a55-account-create-jjl4h\" (UID: \"e74a93bc-5b73-4e2b-815a-14398881caf5\") " pod="openstack/nova-cell1-0a55-account-create-jjl4h"
Oct 12 06:00:41 crc kubenswrapper[4930]: I1012 06:00:41.213690 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-57c6-account-create-26nks"
Oct 12 06:00:41 crc kubenswrapper[4930]: I1012 06:00:41.233921 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86tcw\" (UniqueName: \"kubernetes.io/projected/e74a93bc-5b73-4e2b-815a-14398881caf5-kube-api-access-86tcw\") pod \"nova-cell1-0a55-account-create-jjl4h\" (UID: \"e74a93bc-5b73-4e2b-815a-14398881caf5\") " pod="openstack/nova-cell1-0a55-account-create-jjl4h"
Oct 12 06:00:41 crc kubenswrapper[4930]: I1012 06:00:41.254119 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86tcw\" (UniqueName: \"kubernetes.io/projected/e74a93bc-5b73-4e2b-815a-14398881caf5-kube-api-access-86tcw\") pod \"nova-cell1-0a55-account-create-jjl4h\" (UID: \"e74a93bc-5b73-4e2b-815a-14398881caf5\") " pod="openstack/nova-cell1-0a55-account-create-jjl4h"
Oct 12 06:00:41 crc kubenswrapper[4930]: I1012 06:00:41.345069 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0a55-account-create-jjl4h"
Oct 12 06:00:41 crc kubenswrapper[4930]: I1012 06:00:41.382974 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-9054-account-create-rr9h4"]
Oct 12 06:00:41 crc kubenswrapper[4930]: W1012 06:00:41.397946 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74e2cfe9_e552_4e89_b0dc_7af425310799.slice/crio-cf3c2812b35653aa6aec3f9de074532a26dcf86a5091921c551bfdf9b3302678 WatchSource:0}: Error finding container cf3c2812b35653aa6aec3f9de074532a26dcf86a5091921c551bfdf9b3302678: Status 404 returned error can't find the container with id cf3c2812b35653aa6aec3f9de074532a26dcf86a5091921c551bfdf9b3302678
Oct 12 06:00:41 crc kubenswrapper[4930]: I1012 06:00:41.429840 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3697520-ef2b-4b11-80a2-f47ca1de765b","Type":"ContainerStarted","Data":"3bedaa06bc81039767a716adef320cae4b890d4bd9b853e55e53feb8c7641bd5"}
Oct 12 06:00:41 crc kubenswrapper[4930]: I1012 06:00:41.430196 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3697520-ef2b-4b11-80a2-f47ca1de765b","Type":"ContainerStarted","Data":"eb1d987cda95b9e4fe9da735477a54fdfaa201a263d695bcdb741473889ca596"}
Oct 12 06:00:41 crc kubenswrapper[4930]: I1012 06:00:41.431476 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9054-account-create-rr9h4" event={"ID":"74e2cfe9-e552-4e89-b0dc-7af425310799","Type":"ContainerStarted","Data":"cf3c2812b35653aa6aec3f9de074532a26dcf86a5091921c551bfdf9b3302678"}
Oct 12 06:00:41 crc kubenswrapper[4930]: I1012 06:00:41.678344 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-57c6-account-create-26nks"]
Oct 12 06:00:41 crc kubenswrapper[4930]: I1012 06:00:41.840564 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0a55-account-create-jjl4h"]
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.300148 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.453446 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3697520-ef2b-4b11-80a2-f47ca1de765b","Type":"ContainerStarted","Data":"71a020f0d422ccae493d5d7258303e1c02d534e2b30ff6a2d4495a7ee6827f60"}
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.453498 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3697520-ef2b-4b11-80a2-f47ca1de765b","Type":"ContainerStarted","Data":"e7e1352593f0bfadce55c4d20a57e8bfb3263418a6531a8536a4a1bd984af0bf"}
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.456525 4930 generic.go:334] "Generic (PLEG): container finished" podID="59f4bae4-1a84-449a-be72-e735294116e6" containerID="d3276e810538259cb3294c616a6c0101e64806aa3e935bc18563a7a3f2b3591c" exitCode=0
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.456580 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"59f4bae4-1a84-449a-be72-e735294116e6","Type":"ContainerDied","Data":"d3276e810538259cb3294c616a6c0101e64806aa3e935bc18563a7a3f2b3591c"}
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.456592 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.456612 4930 scope.go:117] "RemoveContainer" containerID="d3276e810538259cb3294c616a6c0101e64806aa3e935bc18563a7a3f2b3591c"
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.456597 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"59f4bae4-1a84-449a-be72-e735294116e6","Type":"ContainerDied","Data":"11fe9aabaaefc071d24b067674968d325a6fcf6e69509ca6f1ddb3473f542555"}
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.459186 4930 generic.go:334] "Generic (PLEG): container finished" podID="e74a93bc-5b73-4e2b-815a-14398881caf5" containerID="50d375193956d516aab058383a6effec3937384885e7a547a37888276b68e52f" exitCode=0
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.459240 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0a55-account-create-jjl4h" event={"ID":"e74a93bc-5b73-4e2b-815a-14398881caf5","Type":"ContainerDied","Data":"50d375193956d516aab058383a6effec3937384885e7a547a37888276b68e52f"}
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.459256 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0a55-account-create-jjl4h" event={"ID":"e74a93bc-5b73-4e2b-815a-14398881caf5","Type":"ContainerStarted","Data":"bffac75ae8ad83c37828fa7273c472a756e12cd86c9228d9316e80f8523dfc9a"}
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.460901 4930 generic.go:334] "Generic (PLEG): container finished" podID="656956b7-bd5b-4cab-9db0-53f9776ee51e" containerID="ecf0ed29a10a370d3b75b2d69566a505fc269b4320c45de8f39f778246bbe12e" exitCode=0
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.461083 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-57c6-account-create-26nks" event={"ID":"656956b7-bd5b-4cab-9db0-53f9776ee51e","Type":"ContainerDied","Data":"ecf0ed29a10a370d3b75b2d69566a505fc269b4320c45de8f39f778246bbe12e"}
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.461179 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-57c6-account-create-26nks" event={"ID":"656956b7-bd5b-4cab-9db0-53f9776ee51e","Type":"ContainerStarted","Data":"eeef6c61d84cdfb42a6609f618451c1b49f8f0de46865714b0d69565c0d7423b"}
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.463693 4930 generic.go:334] "Generic (PLEG): container finished" podID="74e2cfe9-e552-4e89-b0dc-7af425310799" containerID="4db42ddbeca5bb8a612141328b7ed44d1925c02c297a7927e3d3ef3583ea9ce7" exitCode=0
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.463729 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9054-account-create-rr9h4" event={"ID":"74e2cfe9-e552-4e89-b0dc-7af425310799","Type":"ContainerDied","Data":"4db42ddbeca5bb8a612141328b7ed44d1925c02c297a7927e3d3ef3583ea9ce7"}
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.467979 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f4bae4-1a84-449a-be72-e735294116e6-logs\") pod \"59f4bae4-1a84-449a-be72-e735294116e6\" (UID: \"59f4bae4-1a84-449a-be72-e735294116e6\") "
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.468221 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/59f4bae4-1a84-449a-be72-e735294116e6-custom-prometheus-ca\") pod \"59f4bae4-1a84-449a-be72-e735294116e6\" (UID: \"59f4bae4-1a84-449a-be72-e735294116e6\") "
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.468365 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f4bae4-1a84-449a-be72-e735294116e6-combined-ca-bundle\") pod \"59f4bae4-1a84-449a-be72-e735294116e6\" (UID: \"59f4bae4-1a84-449a-be72-e735294116e6\") "
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.468509 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f4bae4-1a84-449a-be72-e735294116e6-config-data\") pod \"59f4bae4-1a84-449a-be72-e735294116e6\" (UID: \"59f4bae4-1a84-449a-be72-e735294116e6\") "
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.468588 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59f4bae4-1a84-449a-be72-e735294116e6-logs" (OuterVolumeSpecName: "logs") pod "59f4bae4-1a84-449a-be72-e735294116e6" (UID: "59f4bae4-1a84-449a-be72-e735294116e6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.468766 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgjbx\" (UniqueName: \"kubernetes.io/projected/59f4bae4-1a84-449a-be72-e735294116e6-kube-api-access-qgjbx\") pod \"59f4bae4-1a84-449a-be72-e735294116e6\" (UID: \"59f4bae4-1a84-449a-be72-e735294116e6\") "
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.469453 4930 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f4bae4-1a84-449a-be72-e735294116e6-logs\") on node \"crc\" DevicePath \"\""
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.476371 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59f4bae4-1a84-449a-be72-e735294116e6-kube-api-access-qgjbx" (OuterVolumeSpecName: "kube-api-access-qgjbx") pod "59f4bae4-1a84-449a-be72-e735294116e6" (UID: "59f4bae4-1a84-449a-be72-e735294116e6"). InnerVolumeSpecName "kube-api-access-qgjbx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.499938 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f4bae4-1a84-449a-be72-e735294116e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59f4bae4-1a84-449a-be72-e735294116e6" (UID: "59f4bae4-1a84-449a-be72-e735294116e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.524781 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f4bae4-1a84-449a-be72-e735294116e6-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "59f4bae4-1a84-449a-be72-e735294116e6" (UID: "59f4bae4-1a84-449a-be72-e735294116e6"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.542343 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f4bae4-1a84-449a-be72-e735294116e6-config-data" (OuterVolumeSpecName: "config-data") pod "59f4bae4-1a84-449a-be72-e735294116e6" (UID: "59f4bae4-1a84-449a-be72-e735294116e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.571318 4930 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/59f4bae4-1a84-449a-be72-e735294116e6-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.571357 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f4bae4-1a84-449a-be72-e735294116e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.571371 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f4bae4-1a84-449a-be72-e735294116e6-config-data\") on node \"crc\" DevicePath \"\""
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.571386 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgjbx\" (UniqueName: \"kubernetes.io/projected/59f4bae4-1a84-449a-be72-e735294116e6-kube-api-access-qgjbx\") on node \"crc\" DevicePath \"\""
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.617447 4930 scope.go:117] "RemoveContainer" containerID="08f49b3d51ac716588f5237987a07b5659675e827f1272074de03ba6918727cd"
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.656015 4930 scope.go:117] "RemoveContainer" containerID="d3276e810538259cb3294c616a6c0101e64806aa3e935bc18563a7a3f2b3591c"
Oct 12 06:00:42 crc kubenswrapper[4930]: E1012 06:00:42.656504 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3276e810538259cb3294c616a6c0101e64806aa3e935bc18563a7a3f2b3591c\": container with ID starting with d3276e810538259cb3294c616a6c0101e64806aa3e935bc18563a7a3f2b3591c not found: ID does not exist" containerID="d3276e810538259cb3294c616a6c0101e64806aa3e935bc18563a7a3f2b3591c"
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.656543 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3276e810538259cb3294c616a6c0101e64806aa3e935bc18563a7a3f2b3591c"} err="failed to get container status \"d3276e810538259cb3294c616a6c0101e64806aa3e935bc18563a7a3f2b3591c\": rpc error: code = NotFound desc = could not find container \"d3276e810538259cb3294c616a6c0101e64806aa3e935bc18563a7a3f2b3591c\": container with ID starting with d3276e810538259cb3294c616a6c0101e64806aa3e935bc18563a7a3f2b3591c not found: ID does not exist"
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.656569 4930 scope.go:117] "RemoveContainer" containerID="08f49b3d51ac716588f5237987a07b5659675e827f1272074de03ba6918727cd"
Oct 12 06:00:42 crc kubenswrapper[4930]: E1012 06:00:42.656929 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08f49b3d51ac716588f5237987a07b5659675e827f1272074de03ba6918727cd\": container with ID starting with 08f49b3d51ac716588f5237987a07b5659675e827f1272074de03ba6918727cd not found: ID does not exist" containerID="08f49b3d51ac716588f5237987a07b5659675e827f1272074de03ba6918727cd"
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.656967 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08f49b3d51ac716588f5237987a07b5659675e827f1272074de03ba6918727cd"} err="failed to get container status \"08f49b3d51ac716588f5237987a07b5659675e827f1272074de03ba6918727cd\": rpc error: code = NotFound desc = could not find container \"08f49b3d51ac716588f5237987a07b5659675e827f1272074de03ba6918727cd\": container with ID starting with 08f49b3d51ac716588f5237987a07b5659675e827f1272074de03ba6918727cd not found: ID does not exist"
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.804777 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"]
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.818839 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"]
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.833622 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"]
Oct 12 06:00:42 crc kubenswrapper[4930]: E1012 06:00:42.834083 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f4bae4-1a84-449a-be72-e735294116e6" containerName="watcher-decision-engine"
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.834101 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f4bae4-1a84-449a-be72-e735294116e6" containerName="watcher-decision-engine"
Oct 12 06:00:42 crc kubenswrapper[4930]: E1012 06:00:42.834114 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f4bae4-1a84-449a-be72-e735294116e6" containerName="watcher-decision-engine"
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.834121 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f4bae4-1a84-449a-be72-e735294116e6" containerName="watcher-decision-engine"
Oct 12 06:00:42 crc kubenswrapper[4930]: E1012 06:00:42.834132 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f4bae4-1a84-449a-be72-e735294116e6" containerName="watcher-decision-engine"
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.834138 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f4bae4-1a84-449a-be72-e735294116e6" containerName="watcher-decision-engine"
Oct 12 06:00:42 crc kubenswrapper[4930]: E1012 06:00:42.834158 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f4bae4-1a84-449a-be72-e735294116e6" containerName="watcher-decision-engine"
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.834165 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f4bae4-1a84-449a-be72-e735294116e6" containerName="watcher-decision-engine"
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.834334 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="59f4bae4-1a84-449a-be72-e735294116e6" containerName="watcher-decision-engine"
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.834344 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="59f4bae4-1a84-449a-be72-e735294116e6" containerName="watcher-decision-engine"
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.835026 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.837360 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data"
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.842709 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.983711 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/220ebc1c-6f2b-4beb-8a34-339ba62a484f-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"220ebc1c-6f2b-4beb-8a34-339ba62a484f\") " pod="openstack/watcher-decision-engine-0"
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.983798 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/220ebc1c-6f2b-4beb-8a34-339ba62a484f-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"220ebc1c-6f2b-4beb-8a34-339ba62a484f\") " pod="openstack/watcher-decision-engine-0"
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.983843 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t8z8\" (UniqueName: \"kubernetes.io/projected/220ebc1c-6f2b-4beb-8a34-339ba62a484f-kube-api-access-9t8z8\") pod \"watcher-decision-engine-0\" (UID: \"220ebc1c-6f2b-4beb-8a34-339ba62a484f\") " pod="openstack/watcher-decision-engine-0"
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.983877 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/220ebc1c-6f2b-4beb-8a34-339ba62a484f-logs\") pod \"watcher-decision-engine-0\" (UID: \"220ebc1c-6f2b-4beb-8a34-339ba62a484f\") " pod="openstack/watcher-decision-engine-0"
Oct 12 06:00:42 crc kubenswrapper[4930]: I1012 06:00:42.983913 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/220ebc1c-6f2b-4beb-8a34-339ba62a484f-config-data\") pod \"watcher-decision-engine-0\" (UID: \"220ebc1c-6f2b-4beb-8a34-339ba62a484f\") " pod="openstack/watcher-decision-engine-0"
Oct 12 06:00:43 crc kubenswrapper[4930]: I1012 06:00:43.086034 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/220ebc1c-6f2b-4beb-8a34-339ba62a484f-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"220ebc1c-6f2b-4beb-8a34-339ba62a484f\") " pod="openstack/watcher-decision-engine-0"
Oct 12 06:00:43 crc kubenswrapper[4930]: I1012 06:00:43.086128 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/220ebc1c-6f2b-4beb-8a34-339ba62a484f-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"220ebc1c-6f2b-4beb-8a34-339ba62a484f\") " pod="openstack/watcher-decision-engine-0"
Oct 12 06:00:43 crc kubenswrapper[4930]: I1012 06:00:43.086157 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t8z8\" (UniqueName: \"kubernetes.io/projected/220ebc1c-6f2b-4beb-8a34-339ba62a484f-kube-api-access-9t8z8\") pod \"watcher-decision-engine-0\" (UID: \"220ebc1c-6f2b-4beb-8a34-339ba62a484f\") " pod="openstack/watcher-decision-engine-0"
Oct 12 06:00:43 crc kubenswrapper[4930]: I1012 06:00:43.086194 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/220ebc1c-6f2b-4beb-8a34-339ba62a484f-logs\") pod \"watcher-decision-engine-0\" (UID: \"220ebc1c-6f2b-4beb-8a34-339ba62a484f\") " pod="openstack/watcher-decision-engine-0"
Oct 12 06:00:43 crc kubenswrapper[4930]: I1012 06:00:43.086627 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/220ebc1c-6f2b-4beb-8a34-339ba62a484f-config-data\") pod \"watcher-decision-engine-0\" (UID: \"220ebc1c-6f2b-4beb-8a34-339ba62a484f\") " pod="openstack/watcher-decision-engine-0"
Oct 12 06:00:43 crc kubenswrapper[4930]: I1012 06:00:43.086768 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/220ebc1c-6f2b-4beb-8a34-339ba62a484f-logs\") pod \"watcher-decision-engine-0\" (UID: \"220ebc1c-6f2b-4beb-8a34-339ba62a484f\") " pod="openstack/watcher-decision-engine-0"
Oct 12 06:00:43 crc kubenswrapper[4930]: I1012 06:00:43.090955 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/220ebc1c-6f2b-4beb-8a34-339ba62a484f-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"220ebc1c-6f2b-4beb-8a34-339ba62a484f\") " pod="openstack/watcher-decision-engine-0"
Oct 12 06:00:43 crc kubenswrapper[4930]: I1012 06:00:43.093297 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/220ebc1c-6f2b-4beb-8a34-339ba62a484f-config-data\") pod \"watcher-decision-engine-0\" (UID: \"220ebc1c-6f2b-4beb-8a34-339ba62a484f\") " pod="openstack/watcher-decision-engine-0"
Oct 12 06:00:43 crc kubenswrapper[4930]: I1012 06:00:43.093898 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/220ebc1c-6f2b-4beb-8a34-339ba62a484f-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"220ebc1c-6f2b-4beb-8a34-339ba62a484f\") " pod="openstack/watcher-decision-engine-0"
Oct 12 06:00:43 crc kubenswrapper[4930]: I1012 06:00:43.109059 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t8z8\" (UniqueName: \"kubernetes.io/projected/220ebc1c-6f2b-4beb-8a34-339ba62a484f-kube-api-access-9t8z8\") pod \"watcher-decision-engine-0\" (UID: \"220ebc1c-6f2b-4beb-8a34-339ba62a484f\") " pod="openstack/watcher-decision-engine-0"
Oct 12 06:00:43 crc kubenswrapper[4930]: I1012 06:00:43.168576 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Oct 12 06:00:43 crc kubenswrapper[4930]: I1012 06:00:43.688282 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Oct 12 06:00:43 crc kubenswrapper[4930]: W1012 06:00:43.694062 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod220ebc1c_6f2b_4beb_8a34_339ba62a484f.slice/crio-0146ff2aa291712837dd221409098636695ff79287dc6b6017235c1b74a03907 WatchSource:0}: Error finding container 0146ff2aa291712837dd221409098636695ff79287dc6b6017235c1b74a03907: Status 404 returned error can't find the container with id 0146ff2aa291712837dd221409098636695ff79287dc6b6017235c1b74a03907
Oct 12 06:00:43 crc kubenswrapper[4930]: I1012 06:00:43.956811 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0a55-account-create-jjl4h"
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.011881 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9054-account-create-rr9h4"
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.045694 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-57c6-account-create-26nks"
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.105141 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmlvh\" (UniqueName: \"kubernetes.io/projected/656956b7-bd5b-4cab-9db0-53f9776ee51e-kube-api-access-hmlvh\") pod \"656956b7-bd5b-4cab-9db0-53f9776ee51e\" (UID: \"656956b7-bd5b-4cab-9db0-53f9776ee51e\") "
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.105258 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f87h\" (UniqueName: \"kubernetes.io/projected/74e2cfe9-e552-4e89-b0dc-7af425310799-kube-api-access-2f87h\") pod \"74e2cfe9-e552-4e89-b0dc-7af425310799\" (UID: \"74e2cfe9-e552-4e89-b0dc-7af425310799\") "
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.105372 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86tcw\" (UniqueName: \"kubernetes.io/projected/e74a93bc-5b73-4e2b-815a-14398881caf5-kube-api-access-86tcw\") pod \"e74a93bc-5b73-4e2b-815a-14398881caf5\" (UID: \"e74a93bc-5b73-4e2b-815a-14398881caf5\") "
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.110675 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74e2cfe9-e552-4e89-b0dc-7af425310799-kube-api-access-2f87h" (OuterVolumeSpecName: "kube-api-access-2f87h") pod "74e2cfe9-e552-4e89-b0dc-7af425310799" (UID: "74e2cfe9-e552-4e89-b0dc-7af425310799"). InnerVolumeSpecName "kube-api-access-2f87h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.110689 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/656956b7-bd5b-4cab-9db0-53f9776ee51e-kube-api-access-hmlvh" (OuterVolumeSpecName: "kube-api-access-hmlvh") pod "656956b7-bd5b-4cab-9db0-53f9776ee51e" (UID: "656956b7-bd5b-4cab-9db0-53f9776ee51e"). InnerVolumeSpecName "kube-api-access-hmlvh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.110719 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e74a93bc-5b73-4e2b-815a-14398881caf5-kube-api-access-86tcw" (OuterVolumeSpecName: "kube-api-access-86tcw") pod "e74a93bc-5b73-4e2b-815a-14398881caf5" (UID: "e74a93bc-5b73-4e2b-815a-14398881caf5"). InnerVolumeSpecName "kube-api-access-86tcw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.149825 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59f4bae4-1a84-449a-be72-e735294116e6" path="/var/lib/kubelet/pods/59f4bae4-1a84-449a-be72-e735294116e6/volumes"
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.208051 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmlvh\" (UniqueName: \"kubernetes.io/projected/656956b7-bd5b-4cab-9db0-53f9776ee51e-kube-api-access-hmlvh\") on node \"crc\" DevicePath \"\""
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.208096 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f87h\" (UniqueName: \"kubernetes.io/projected/74e2cfe9-e552-4e89-b0dc-7af425310799-kube-api-access-2f87h\") on node \"crc\" DevicePath \"\""
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.208119 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86tcw\" (UniqueName: \"kubernetes.io/projected/e74a93bc-5b73-4e2b-815a-14398881caf5-kube-api-access-86tcw\") on node \"crc\" DevicePath \"\""
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.497824 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9054-account-create-rr9h4" event={"ID":"74e2cfe9-e552-4e89-b0dc-7af425310799","Type":"ContainerDied","Data":"cf3c2812b35653aa6aec3f9de074532a26dcf86a5091921c551bfdf9b3302678"}
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.497885 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf3c2812b35653aa6aec3f9de074532a26dcf86a5091921c551bfdf9b3302678"
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.497854 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9054-account-create-rr9h4"
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.501391 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3697520-ef2b-4b11-80a2-f47ca1de765b","Type":"ContainerStarted","Data":"b25fa8c77be0c182869e6f29dbe64ba2bac901c9d516c2602f81daa149e606a0"}
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.501588 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.503647 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"220ebc1c-6f2b-4beb-8a34-339ba62a484f","Type":"ContainerStarted","Data":"47ea2aba82b068a402a17ad622f8a2a7be7da4652dd622ca2414b42b5620165e"}
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.503695 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"220ebc1c-6f2b-4beb-8a34-339ba62a484f","Type":"ContainerStarted","Data":"0146ff2aa291712837dd221409098636695ff79287dc6b6017235c1b74a03907"}
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.506145 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0a55-account-create-jjl4h"
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.506147 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0a55-account-create-jjl4h" event={"ID":"e74a93bc-5b73-4e2b-815a-14398881caf5","Type":"ContainerDied","Data":"bffac75ae8ad83c37828fa7273c472a756e12cd86c9228d9316e80f8523dfc9a"}
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.506266 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bffac75ae8ad83c37828fa7273c472a756e12cd86c9228d9316e80f8523dfc9a"
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.509433 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-57c6-account-create-26nks" event={"ID":"656956b7-bd5b-4cab-9db0-53f9776ee51e","Type":"ContainerDied","Data":"eeef6c61d84cdfb42a6609f618451c1b49f8f0de46865714b0d69565c0d7423b"}
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.509462 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eeef6c61d84cdfb42a6609f618451c1b49f8f0de46865714b0d69565c0d7423b"
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.509532 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-57c6-account-create-26nks"
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.523606 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.643491887 podStartE2EDuration="5.523589065s" podCreationTimestamp="2025-10-12 06:00:39 +0000 UTC" firstStartedPulling="2025-10-12 06:00:40.459999781 +0000 UTC m=+1173.002101546" lastFinishedPulling="2025-10-12 06:00:43.340096959 +0000 UTC m=+1175.882198724" observedRunningTime="2025-10-12 06:00:44.523200575 +0000 UTC m=+1177.065302350" watchObservedRunningTime="2025-10-12 06:00:44.523589065 +0000 UTC m=+1177.065690840"
Oct 12 06:00:44 crc kubenswrapper[4930]: I1012 06:00:44.566412 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.56638821 podStartE2EDuration="2.56638821s" podCreationTimestamp="2025-10-12 06:00:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:00:44.560148985 +0000 UTC m=+1177.102250780" watchObservedRunningTime="2025-10-12 06:00:44.56638821 +0000 UTC m=+1177.108489985"
Oct 12 06:00:45 crc kubenswrapper[4930]: I1012 06:00:45.932826 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mqxdr"]
Oct 12 06:00:45 crc kubenswrapper[4930]: E1012 06:00:45.933866 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e74a93bc-5b73-4e2b-815a-14398881caf5" containerName="mariadb-account-create"
Oct 12 06:00:45 crc kubenswrapper[4930]: I1012 06:00:45.933889 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="e74a93bc-5b73-4e2b-815a-14398881caf5" containerName="mariadb-account-create"
Oct 12 06:00:45 crc kubenswrapper[4930]: E1012 06:00:45.933927 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="656956b7-bd5b-4cab-9db0-53f9776ee51e" containerName="mariadb-account-create"
Oct 12 06:00:45 crc kubenswrapper[4930]: I1012 06:00:45.933943 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="656956b7-bd5b-4cab-9db0-53f9776ee51e" containerName="mariadb-account-create"
Oct 12 06:00:45 crc
kubenswrapper[4930]: E1012 06:00:45.933961 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e2cfe9-e552-4e89-b0dc-7af425310799" containerName="mariadb-account-create" Oct 12 06:00:45 crc kubenswrapper[4930]: I1012 06:00:45.933973 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e2cfe9-e552-4e89-b0dc-7af425310799" containerName="mariadb-account-create" Oct 12 06:00:45 crc kubenswrapper[4930]: I1012 06:00:45.934309 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="74e2cfe9-e552-4e89-b0dc-7af425310799" containerName="mariadb-account-create" Oct 12 06:00:45 crc kubenswrapper[4930]: I1012 06:00:45.934353 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="59f4bae4-1a84-449a-be72-e735294116e6" containerName="watcher-decision-engine" Oct 12 06:00:45 crc kubenswrapper[4930]: I1012 06:00:45.934370 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="59f4bae4-1a84-449a-be72-e735294116e6" containerName="watcher-decision-engine" Oct 12 06:00:45 crc kubenswrapper[4930]: I1012 06:00:45.934397 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="656956b7-bd5b-4cab-9db0-53f9776ee51e" containerName="mariadb-account-create" Oct 12 06:00:45 crc kubenswrapper[4930]: I1012 06:00:45.934422 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="e74a93bc-5b73-4e2b-815a-14398881caf5" containerName="mariadb-account-create" Oct 12 06:00:45 crc kubenswrapper[4930]: I1012 06:00:45.935620 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mqxdr" Oct 12 06:00:45 crc kubenswrapper[4930]: I1012 06:00:45.937643 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kprw9" Oct 12 06:00:45 crc kubenswrapper[4930]: I1012 06:00:45.937756 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 12 06:00:45 crc kubenswrapper[4930]: I1012 06:00:45.937907 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 12 06:00:45 crc kubenswrapper[4930]: I1012 06:00:45.966816 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mqxdr"] Oct 12 06:00:46 crc kubenswrapper[4930]: I1012 06:00:46.050004 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d0cf3ce-d82f-43e3-9542-6cca53e19b42-scripts\") pod \"nova-cell0-conductor-db-sync-mqxdr\" (UID: \"9d0cf3ce-d82f-43e3-9542-6cca53e19b42\") " pod="openstack/nova-cell0-conductor-db-sync-mqxdr" Oct 12 06:00:46 crc kubenswrapper[4930]: I1012 06:00:46.050073 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhq6d\" (UniqueName: \"kubernetes.io/projected/9d0cf3ce-d82f-43e3-9542-6cca53e19b42-kube-api-access-rhq6d\") pod \"nova-cell0-conductor-db-sync-mqxdr\" (UID: \"9d0cf3ce-d82f-43e3-9542-6cca53e19b42\") " pod="openstack/nova-cell0-conductor-db-sync-mqxdr" Oct 12 06:00:46 crc kubenswrapper[4930]: I1012 06:00:46.050152 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d0cf3ce-d82f-43e3-9542-6cca53e19b42-config-data\") pod \"nova-cell0-conductor-db-sync-mqxdr\" (UID: \"9d0cf3ce-d82f-43e3-9542-6cca53e19b42\") " 
pod="openstack/nova-cell0-conductor-db-sync-mqxdr" Oct 12 06:00:46 crc kubenswrapper[4930]: I1012 06:00:46.050308 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d0cf3ce-d82f-43e3-9542-6cca53e19b42-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mqxdr\" (UID: \"9d0cf3ce-d82f-43e3-9542-6cca53e19b42\") " pod="openstack/nova-cell0-conductor-db-sync-mqxdr" Oct 12 06:00:46 crc kubenswrapper[4930]: I1012 06:00:46.152177 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d0cf3ce-d82f-43e3-9542-6cca53e19b42-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mqxdr\" (UID: \"9d0cf3ce-d82f-43e3-9542-6cca53e19b42\") " pod="openstack/nova-cell0-conductor-db-sync-mqxdr" Oct 12 06:00:46 crc kubenswrapper[4930]: I1012 06:00:46.152311 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d0cf3ce-d82f-43e3-9542-6cca53e19b42-scripts\") pod \"nova-cell0-conductor-db-sync-mqxdr\" (UID: \"9d0cf3ce-d82f-43e3-9542-6cca53e19b42\") " pod="openstack/nova-cell0-conductor-db-sync-mqxdr" Oct 12 06:00:46 crc kubenswrapper[4930]: I1012 06:00:46.152334 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhq6d\" (UniqueName: \"kubernetes.io/projected/9d0cf3ce-d82f-43e3-9542-6cca53e19b42-kube-api-access-rhq6d\") pod \"nova-cell0-conductor-db-sync-mqxdr\" (UID: \"9d0cf3ce-d82f-43e3-9542-6cca53e19b42\") " pod="openstack/nova-cell0-conductor-db-sync-mqxdr" Oct 12 06:00:46 crc kubenswrapper[4930]: I1012 06:00:46.152370 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d0cf3ce-d82f-43e3-9542-6cca53e19b42-config-data\") pod \"nova-cell0-conductor-db-sync-mqxdr\" (UID: \"9d0cf3ce-d82f-43e3-9542-6cca53e19b42\") " pod="openstack/nova-cell0-conductor-db-sync-mqxdr" Oct 12 06:00:46 crc kubenswrapper[4930]: I1012 06:00:46.157567 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d0cf3ce-d82f-43e3-9542-6cca53e19b42-config-data\") pod \"nova-cell0-conductor-db-sync-mqxdr\" (UID: \"9d0cf3ce-d82f-43e3-9542-6cca53e19b42\") " pod="openstack/nova-cell0-conductor-db-sync-mqxdr" Oct 12 06:00:46 crc kubenswrapper[4930]: I1012 06:00:46.158967 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d0cf3ce-d82f-43e3-9542-6cca53e19b42-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mqxdr\" (UID: \"9d0cf3ce-d82f-43e3-9542-6cca53e19b42\") " pod="openstack/nova-cell0-conductor-db-sync-mqxdr" Oct 12 06:00:46 crc kubenswrapper[4930]: I1012 06:00:46.166325 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d0cf3ce-d82f-43e3-9542-6cca53e19b42-scripts\") pod \"nova-cell0-conductor-db-sync-mqxdr\" (UID: \"9d0cf3ce-d82f-43e3-9542-6cca53e19b42\") " pod="openstack/nova-cell0-conductor-db-sync-mqxdr" Oct 12 06:00:46 crc kubenswrapper[4930]: I1012 06:00:46.179483 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhq6d\" (UniqueName: \"kubernetes.io/projected/9d0cf3ce-d82f-43e3-9542-6cca53e19b42-kube-api-access-rhq6d\") pod \"nova-cell0-conductor-db-sync-mqxdr\" (UID: 
\"9d0cf3ce-d82f-43e3-9542-6cca53e19b42\") " pod="openstack/nova-cell0-conductor-db-sync-mqxdr" Oct 12 06:00:46 crc kubenswrapper[4930]: I1012 06:00:46.272489 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mqxdr" Oct 12 06:00:46 crc kubenswrapper[4930]: I1012 06:00:46.751344 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mqxdr"] Oct 12 06:00:47 crc kubenswrapper[4930]: I1012 06:00:47.552982 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mqxdr" event={"ID":"9d0cf3ce-d82f-43e3-9542-6cca53e19b42","Type":"ContainerStarted","Data":"29612554ef331cf11e7e26159cbe95bfedc8ebd25bc6f9284bc8ee0e16f153b3"} Oct 12 06:00:53 crc kubenswrapper[4930]: I1012 06:00:53.169325 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 12 06:00:53 crc kubenswrapper[4930]: I1012 06:00:53.224708 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 12 06:00:53 crc kubenswrapper[4930]: I1012 06:00:53.630543 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 12 06:00:53 crc kubenswrapper[4930]: I1012 06:00:53.680919 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Oct 12 06:00:58 crc kubenswrapper[4930]: I1012 06:00:58.692332 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mqxdr" event={"ID":"9d0cf3ce-d82f-43e3-9542-6cca53e19b42","Type":"ContainerStarted","Data":"16b5bce420b969768bdbae5362224fabb73c667affdefc4eb5fa1d2883c7c259"} Oct 12 06:00:58 crc kubenswrapper[4930]: I1012 06:00:58.724231 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-mqxdr" podStartSLOduration=2.865421868 podStartE2EDuration="13.724201275s" podCreationTimestamp="2025-10-12 06:00:45 +0000 UTC" firstStartedPulling="2025-10-12 06:00:46.755244868 +0000 UTC m=+1179.297346633" lastFinishedPulling="2025-10-12 06:00:57.614024275 +0000 UTC m=+1190.156126040" observedRunningTime="2025-10-12 06:00:58.722411221 +0000 UTC m=+1191.264513026" watchObservedRunningTime="2025-10-12 06:00:58.724201275 +0000 UTC m=+1191.266303070" Oct 12 06:01:00 crc kubenswrapper[4930]: I1012 06:01:00.152802 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29337481-hzx9c"] Oct 12 06:01:00 crc kubenswrapper[4930]: I1012 06:01:00.154353 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29337481-hzx9c" Oct 12 06:01:00 crc kubenswrapper[4930]: I1012 06:01:00.172237 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29337481-hzx9c"] Oct 12 06:01:00 crc kubenswrapper[4930]: I1012 06:01:00.272962 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f8e371c3-204c-4a74-8c6f-49c25f6b7e90-fernet-keys\") pod \"keystone-cron-29337481-hzx9c\" (UID: \"f8e371c3-204c-4a74-8c6f-49c25f6b7e90\") " pod="openstack/keystone-cron-29337481-hzx9c" Oct 12 06:01:00 crc kubenswrapper[4930]: I1012 06:01:00.273058 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e371c3-204c-4a74-8c6f-49c25f6b7e90-config-data\") pod \"keystone-cron-29337481-hzx9c\" (UID: \"f8e371c3-204c-4a74-8c6f-49c25f6b7e90\") " pod="openstack/keystone-cron-29337481-hzx9c" Oct 12 06:01:00 crc kubenswrapper[4930]: I1012 06:01:00.273086 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e371c3-204c-4a74-8c6f-49c25f6b7e90-combined-ca-bundle\") pod \"keystone-cron-29337481-hzx9c\" (UID: \"f8e371c3-204c-4a74-8c6f-49c25f6b7e90\") " pod="openstack/keystone-cron-29337481-hzx9c" Oct 12 06:01:00 crc kubenswrapper[4930]: I1012 06:01:00.273301 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p4s7\" (UniqueName: \"kubernetes.io/projected/f8e371c3-204c-4a74-8c6f-49c25f6b7e90-kube-api-access-4p4s7\") pod \"keystone-cron-29337481-hzx9c\" (UID: \"f8e371c3-204c-4a74-8c6f-49c25f6b7e90\") " pod="openstack/keystone-cron-29337481-hzx9c" Oct 12 06:01:00 crc kubenswrapper[4930]: I1012 06:01:00.375660 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e371c3-204c-4a74-8c6f-49c25f6b7e90-config-data\") pod \"keystone-cron-29337481-hzx9c\" (UID: \"f8e371c3-204c-4a74-8c6f-49c25f6b7e90\") " pod="openstack/keystone-cron-29337481-hzx9c" Oct 12 06:01:00 crc kubenswrapper[4930]: I1012 06:01:00.375965 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e371c3-204c-4a74-8c6f-49c25f6b7e90-combined-ca-bundle\") pod \"keystone-cron-29337481-hzx9c\" (UID: \"f8e371c3-204c-4a74-8c6f-49c25f6b7e90\") " pod="openstack/keystone-cron-29337481-hzx9c" Oct 12 06:01:00 crc kubenswrapper[4930]: I1012 06:01:00.376223 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p4s7\" (UniqueName: \"kubernetes.io/projected/f8e371c3-204c-4a74-8c6f-49c25f6b7e90-kube-api-access-4p4s7\") pod \"keystone-cron-29337481-hzx9c\" (UID: \"f8e371c3-204c-4a74-8c6f-49c25f6b7e90\") " pod="openstack/keystone-cron-29337481-hzx9c" Oct 12 06:01:00 crc kubenswrapper[4930]: I1012 06:01:00.376370 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f8e371c3-204c-4a74-8c6f-49c25f6b7e90-fernet-keys\") pod \"keystone-cron-29337481-hzx9c\" (UID: \"f8e371c3-204c-4a74-8c6f-49c25f6b7e90\") " pod="openstack/keystone-cron-29337481-hzx9c" Oct 12 06:01:00 crc kubenswrapper[4930]: I1012 06:01:00.384829 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e371c3-204c-4a74-8c6f-49c25f6b7e90-combined-ca-bundle\") pod \"keystone-cron-29337481-hzx9c\" (UID: \"f8e371c3-204c-4a74-8c6f-49c25f6b7e90\") " pod="openstack/keystone-cron-29337481-hzx9c" Oct 12 06:01:00 crc kubenswrapper[4930]: I1012 06:01:00.386688 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e371c3-204c-4a74-8c6f-49c25f6b7e90-config-data\") pod \"keystone-cron-29337481-hzx9c\" (UID: \"f8e371c3-204c-4a74-8c6f-49c25f6b7e90\") " pod="openstack/keystone-cron-29337481-hzx9c" Oct 12 06:01:00 crc kubenswrapper[4930]: I1012 06:01:00.404888 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f8e371c3-204c-4a74-8c6f-49c25f6b7e90-fernet-keys\") pod \"keystone-cron-29337481-hzx9c\" (UID: \"f8e371c3-204c-4a74-8c6f-49c25f6b7e90\") " pod="openstack/keystone-cron-29337481-hzx9c" Oct 12 06:01:00 crc kubenswrapper[4930]: I1012 06:01:00.409672 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p4s7\" (UniqueName: \"kubernetes.io/projected/f8e371c3-204c-4a74-8c6f-49c25f6b7e90-kube-api-access-4p4s7\") pod \"keystone-cron-29337481-hzx9c\" (UID: \"f8e371c3-204c-4a74-8c6f-49c25f6b7e90\") " pod="openstack/keystone-cron-29337481-hzx9c" Oct 12 06:01:00 crc kubenswrapper[4930]: I1012 06:01:00.490285 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29337481-hzx9c" Oct 12 06:01:01 crc kubenswrapper[4930]: I1012 06:01:01.053979 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29337481-hzx9c"] Oct 12 06:01:01 crc kubenswrapper[4930]: W1012 06:01:01.055923 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8e371c3_204c_4a74_8c6f_49c25f6b7e90.slice/crio-d9be2603931df8edfdbe55e22e931be5d56520a43c6b7683d6fae4f736547a03 WatchSource:0}: Error finding container d9be2603931df8edfdbe55e22e931be5d56520a43c6b7683d6fae4f736547a03: Status 404 returned error can't find the container with id d9be2603931df8edfdbe55e22e931be5d56520a43c6b7683d6fae4f736547a03 Oct 12 06:01:01 crc kubenswrapper[4930]: I1012 06:01:01.733552 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29337481-hzx9c" event={"ID":"f8e371c3-204c-4a74-8c6f-49c25f6b7e90","Type":"ContainerStarted","Data":"27bedf46476e237ef085c49bd622ba131c5e412988542e66db2d587af6de525d"} Oct 12 06:01:01 crc kubenswrapper[4930]: I1012 06:01:01.734004 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29337481-hzx9c" event={"ID":"f8e371c3-204c-4a74-8c6f-49c25f6b7e90","Type":"ContainerStarted","Data":"d9be2603931df8edfdbe55e22e931be5d56520a43c6b7683d6fae4f736547a03"} Oct 12 06:01:01 crc kubenswrapper[4930]: I1012 06:01:01.759879 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29337481-hzx9c" podStartSLOduration=1.759850575 podStartE2EDuration="1.759850575s" podCreationTimestamp="2025-10-12 06:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:01:01.75199559 +0000 UTC m=+1194.294097395" watchObservedRunningTime="2025-10-12 06:01:01.759850575 +0000 UTC m=+1194.301952380" Oct 12 06:01:03 crc kubenswrapper[4930]: I1012 06:01:03.670087 4930 
patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:01:03 crc kubenswrapper[4930]: I1012 06:01:03.670472 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:01:03 crc kubenswrapper[4930]: I1012 06:01:03.670531 4930 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 06:01:03 crc kubenswrapper[4930]: I1012 06:01:03.671576 4930 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c01e0f78e06c76804a67ffb0c83af238f0fea06c1b96b28458581d18668d1cf0"} pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 06:01:03 crc kubenswrapper[4930]: I1012 06:01:03.671655 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" containerID="cri-o://c01e0f78e06c76804a67ffb0c83af238f0fea06c1b96b28458581d18668d1cf0" gracePeriod=600 Oct 12 06:01:04 crc kubenswrapper[4930]: I1012 06:01:04.798906 4930 generic.go:334] "Generic (PLEG): container finished" podID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerID="c01e0f78e06c76804a67ffb0c83af238f0fea06c1b96b28458581d18668d1cf0" exitCode=0 Oct 12 06:01:04 crc kubenswrapper[4930]: I1012 06:01:04.798975 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerDied","Data":"c01e0f78e06c76804a67ffb0c83af238f0fea06c1b96b28458581d18668d1cf0"} Oct 12 06:01:04 crc kubenswrapper[4930]: I1012 06:01:04.799430 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerStarted","Data":"a3725e0633e74d0677c6bbc6f3b93966f5f5bd1dac0a945c6898b98315d866e3"} Oct 12 06:01:04 crc kubenswrapper[4930]: I1012 06:01:04.799481 4930 scope.go:117] "RemoveContainer" containerID="86436bfc7a3b225084b0677c9406a12b80ec7ace76e26bb6e4b679b1e7256578" Oct 12 06:01:04 crc kubenswrapper[4930]: I1012 06:01:04.802283 4930 generic.go:334] "Generic (PLEG): container finished" podID="f8e371c3-204c-4a74-8c6f-49c25f6b7e90" containerID="27bedf46476e237ef085c49bd622ba131c5e412988542e66db2d587af6de525d" exitCode=0 Oct 12 06:01:04 crc kubenswrapper[4930]: I1012 06:01:04.802368 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29337481-hzx9c" event={"ID":"f8e371c3-204c-4a74-8c6f-49c25f6b7e90","Type":"ContainerDied","Data":"27bedf46476e237ef085c49bd622ba131c5e412988542e66db2d587af6de525d"} Oct 12 06:01:06 crc kubenswrapper[4930]: I1012 06:01:06.233831 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29337481-hzx9c" Oct 12 06:01:06 crc kubenswrapper[4930]: I1012 06:01:06.298584 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e371c3-204c-4a74-8c6f-49c25f6b7e90-config-data\") pod \"f8e371c3-204c-4a74-8c6f-49c25f6b7e90\" (UID: \"f8e371c3-204c-4a74-8c6f-49c25f6b7e90\") " Oct 12 06:01:06 crc kubenswrapper[4930]: I1012 06:01:06.298694 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p4s7\" (UniqueName: \"kubernetes.io/projected/f8e371c3-204c-4a74-8c6f-49c25f6b7e90-kube-api-access-4p4s7\") pod \"f8e371c3-204c-4a74-8c6f-49c25f6b7e90\" (UID: \"f8e371c3-204c-4a74-8c6f-49c25f6b7e90\") " Oct 12 06:01:06 crc kubenswrapper[4930]: I1012 06:01:06.298748 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e371c3-204c-4a74-8c6f-49c25f6b7e90-combined-ca-bundle\") pod \"f8e371c3-204c-4a74-8c6f-49c25f6b7e90\" (UID: \"f8e371c3-204c-4a74-8c6f-49c25f6b7e90\") " Oct 12 06:01:06 crc kubenswrapper[4930]: I1012 06:01:06.298806 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f8e371c3-204c-4a74-8c6f-49c25f6b7e90-fernet-keys\") pod \"f8e371c3-204c-4a74-8c6f-49c25f6b7e90\" (UID: \"f8e371c3-204c-4a74-8c6f-49c25f6b7e90\") " Oct 12 06:01:06 crc kubenswrapper[4930]: I1012 06:01:06.306307 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e371c3-204c-4a74-8c6f-49c25f6b7e90-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f8e371c3-204c-4a74-8c6f-49c25f6b7e90" (UID: "f8e371c3-204c-4a74-8c6f-49c25f6b7e90"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:01:06 crc kubenswrapper[4930]: I1012 06:01:06.307241 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8e371c3-204c-4a74-8c6f-49c25f6b7e90-kube-api-access-4p4s7" (OuterVolumeSpecName: "kube-api-access-4p4s7") pod "f8e371c3-204c-4a74-8c6f-49c25f6b7e90" (UID: "f8e371c3-204c-4a74-8c6f-49c25f6b7e90"). InnerVolumeSpecName "kube-api-access-4p4s7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:01:06 crc kubenswrapper[4930]: I1012 06:01:06.329755 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e371c3-204c-4a74-8c6f-49c25f6b7e90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8e371c3-204c-4a74-8c6f-49c25f6b7e90" (UID: "f8e371c3-204c-4a74-8c6f-49c25f6b7e90"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:01:06 crc kubenswrapper[4930]: I1012 06:01:06.368922 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e371c3-204c-4a74-8c6f-49c25f6b7e90-config-data" (OuterVolumeSpecName: "config-data") pod "f8e371c3-204c-4a74-8c6f-49c25f6b7e90" (UID: "f8e371c3-204c-4a74-8c6f-49c25f6b7e90"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:01:06 crc kubenswrapper[4930]: I1012 06:01:06.400748 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e371c3-204c-4a74-8c6f-49c25f6b7e90-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:06 crc kubenswrapper[4930]: I1012 06:01:06.400781 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p4s7\" (UniqueName: \"kubernetes.io/projected/f8e371c3-204c-4a74-8c6f-49c25f6b7e90-kube-api-access-4p4s7\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:06 crc kubenswrapper[4930]: I1012 06:01:06.400793 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e371c3-204c-4a74-8c6f-49c25f6b7e90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:06 crc kubenswrapper[4930]: I1012 06:01:06.400802 4930 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f8e371c3-204c-4a74-8c6f-49c25f6b7e90-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:06 crc kubenswrapper[4930]: I1012 06:01:06.840620 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29337481-hzx9c" event={"ID":"f8e371c3-204c-4a74-8c6f-49c25f6b7e90","Type":"ContainerDied","Data":"d9be2603931df8edfdbe55e22e931be5d56520a43c6b7683d6fae4f736547a03"} Oct 12 06:01:06 crc kubenswrapper[4930]: I1012 06:01:06.841063 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9be2603931df8edfdbe55e22e931be5d56520a43c6b7683d6fae4f736547a03" Oct 12 06:01:06 crc kubenswrapper[4930]: I1012 06:01:06.840712 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29337481-hzx9c" Oct 12 06:01:09 crc kubenswrapper[4930]: E1012 06:01:09.737311 4930 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d0cf3ce_d82f_43e3_9542_6cca53e19b42.slice/crio-16b5bce420b969768bdbae5362224fabb73c667affdefc4eb5fa1d2883c7c259.scope\": RecentStats: unable to find data in memory cache]" Oct 12 06:01:09 crc kubenswrapper[4930]: I1012 06:01:09.894471 4930 generic.go:334] "Generic (PLEG): container finished" podID="9d0cf3ce-d82f-43e3-9542-6cca53e19b42" containerID="16b5bce420b969768bdbae5362224fabb73c667affdefc4eb5fa1d2883c7c259" exitCode=0 Oct 12 06:01:09 crc kubenswrapper[4930]: I1012 06:01:09.894565 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mqxdr" event={"ID":"9d0cf3ce-d82f-43e3-9542-6cca53e19b42","Type":"ContainerDied","Data":"16b5bce420b969768bdbae5362224fabb73c667affdefc4eb5fa1d2883c7c259"} Oct 12 06:01:09 crc kubenswrapper[4930]: I1012 06:01:09.936060 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 12 06:01:11 crc kubenswrapper[4930]: I1012 06:01:11.381141 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mqxdr" Oct 12 06:01:11 crc kubenswrapper[4930]: I1012 06:01:11.510627 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d0cf3ce-d82f-43e3-9542-6cca53e19b42-combined-ca-bundle\") pod \"9d0cf3ce-d82f-43e3-9542-6cca53e19b42\" (UID: \"9d0cf3ce-d82f-43e3-9542-6cca53e19b42\") " Oct 12 06:01:11 crc kubenswrapper[4930]: I1012 06:01:11.510709 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d0cf3ce-d82f-43e3-9542-6cca53e19b42-config-data\") pod \"9d0cf3ce-d82f-43e3-9542-6cca53e19b42\" (UID: \"9d0cf3ce-d82f-43e3-9542-6cca53e19b42\") " Oct 12 06:01:11 crc kubenswrapper[4930]: I1012 06:01:11.510814 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhq6d\" (UniqueName: \"kubernetes.io/projected/9d0cf3ce-d82f-43e3-9542-6cca53e19b42-kube-api-access-rhq6d\") pod \"9d0cf3ce-d82f-43e3-9542-6cca53e19b42\" (UID: \"9d0cf3ce-d82f-43e3-9542-6cca53e19b42\") " Oct 12 06:01:11 crc kubenswrapper[4930]: I1012 06:01:11.511043 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d0cf3ce-d82f-43e3-9542-6cca53e19b42-scripts\") pod \"9d0cf3ce-d82f-43e3-9542-6cca53e19b42\" (UID: \"9d0cf3ce-d82f-43e3-9542-6cca53e19b42\") " Oct 12 06:01:11 crc kubenswrapper[4930]: I1012 06:01:11.517657 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d0cf3ce-d82f-43e3-9542-6cca53e19b42-scripts" (OuterVolumeSpecName: "scripts") pod "9d0cf3ce-d82f-43e3-9542-6cca53e19b42" (UID: "9d0cf3ce-d82f-43e3-9542-6cca53e19b42"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:01:11 crc kubenswrapper[4930]: I1012 06:01:11.517838 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d0cf3ce-d82f-43e3-9542-6cca53e19b42-kube-api-access-rhq6d" (OuterVolumeSpecName: "kube-api-access-rhq6d") pod "9d0cf3ce-d82f-43e3-9542-6cca53e19b42" (UID: "9d0cf3ce-d82f-43e3-9542-6cca53e19b42"). InnerVolumeSpecName "kube-api-access-rhq6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:01:11 crc kubenswrapper[4930]: I1012 06:01:11.564891 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d0cf3ce-d82f-43e3-9542-6cca53e19b42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d0cf3ce-d82f-43e3-9542-6cca53e19b42" (UID: "9d0cf3ce-d82f-43e3-9542-6cca53e19b42"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:01:11 crc kubenswrapper[4930]: I1012 06:01:11.568164 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d0cf3ce-d82f-43e3-9542-6cca53e19b42-config-data" (OuterVolumeSpecName: "config-data") pod "9d0cf3ce-d82f-43e3-9542-6cca53e19b42" (UID: "9d0cf3ce-d82f-43e3-9542-6cca53e19b42"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:01:11 crc kubenswrapper[4930]: I1012 06:01:11.613924 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhq6d\" (UniqueName: \"kubernetes.io/projected/9d0cf3ce-d82f-43e3-9542-6cca53e19b42-kube-api-access-rhq6d\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:11 crc kubenswrapper[4930]: I1012 06:01:11.613963 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d0cf3ce-d82f-43e3-9542-6cca53e19b42-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:11 crc kubenswrapper[4930]: I1012 06:01:11.613980 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d0cf3ce-d82f-43e3-9542-6cca53e19b42-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:11 crc kubenswrapper[4930]: I1012 06:01:11.613991 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d0cf3ce-d82f-43e3-9542-6cca53e19b42-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:11 crc kubenswrapper[4930]: I1012 06:01:11.962957 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mqxdr" event={"ID":"9d0cf3ce-d82f-43e3-9542-6cca53e19b42","Type":"ContainerDied","Data":"29612554ef331cf11e7e26159cbe95bfedc8ebd25bc6f9284bc8ee0e16f153b3"} Oct 12 06:01:11 crc kubenswrapper[4930]: I1012 06:01:11.963228 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29612554ef331cf11e7e26159cbe95bfedc8ebd25bc6f9284bc8ee0e16f153b3" Oct 12 06:01:11 crc kubenswrapper[4930]: I1012 06:01:11.963021 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mqxdr" Oct 12 06:01:12 crc kubenswrapper[4930]: I1012 06:01:12.007856 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 12 06:01:12 crc kubenswrapper[4930]: E1012 06:01:12.008465 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d0cf3ce-d82f-43e3-9542-6cca53e19b42" containerName="nova-cell0-conductor-db-sync" Oct 12 06:01:12 crc kubenswrapper[4930]: I1012 06:01:12.008549 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d0cf3ce-d82f-43e3-9542-6cca53e19b42" containerName="nova-cell0-conductor-db-sync" Oct 12 06:01:12 crc kubenswrapper[4930]: E1012 06:01:12.008646 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e371c3-204c-4a74-8c6f-49c25f6b7e90" containerName="keystone-cron" Oct 12 06:01:12 crc kubenswrapper[4930]: I1012 06:01:12.008708 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e371c3-204c-4a74-8c6f-49c25f6b7e90" containerName="keystone-cron" Oct 12 06:01:12 crc kubenswrapper[4930]: I1012 06:01:12.008967 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d0cf3ce-d82f-43e3-9542-6cca53e19b42" containerName="nova-cell0-conductor-db-sync" Oct 12 06:01:12 crc kubenswrapper[4930]: I1012 06:01:12.009787 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e371c3-204c-4a74-8c6f-49c25f6b7e90" containerName="keystone-cron" Oct 12 06:01:12 crc kubenswrapper[4930]: I1012 06:01:12.010639 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 12 06:01:12 crc kubenswrapper[4930]: I1012 06:01:12.014460 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 12 06:01:12 crc kubenswrapper[4930]: I1012 06:01:12.015707 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kprw9" Oct 12 06:01:12 crc kubenswrapper[4930]: I1012 06:01:12.029629 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 12 06:01:12 crc kubenswrapper[4930]: I1012 06:01:12.123242 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt5nw\" (UniqueName: \"kubernetes.io/projected/534efb5a-d958-48db-9d8d-1a49091be4de-kube-api-access-pt5nw\") pod \"nova-cell0-conductor-0\" (UID: \"534efb5a-d958-48db-9d8d-1a49091be4de\") " pod="openstack/nova-cell0-conductor-0" Oct 12 06:01:12 crc kubenswrapper[4930]: I1012 06:01:12.123374 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534efb5a-d958-48db-9d8d-1a49091be4de-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"534efb5a-d958-48db-9d8d-1a49091be4de\") " pod="openstack/nova-cell0-conductor-0" Oct 12 06:01:12 crc kubenswrapper[4930]: I1012 06:01:12.123401 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/534efb5a-d958-48db-9d8d-1a49091be4de-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"534efb5a-d958-48db-9d8d-1a49091be4de\") " pod="openstack/nova-cell0-conductor-0" Oct 12 06:01:12 crc kubenswrapper[4930]: I1012 06:01:12.225388 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534efb5a-d958-48db-9d8d-1a49091be4de-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"534efb5a-d958-48db-9d8d-1a49091be4de\") " pod="openstack/nova-cell0-conductor-0" Oct 12 06:01:12 crc kubenswrapper[4930]: I1012 06:01:12.225485 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/534efb5a-d958-48db-9d8d-1a49091be4de-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"534efb5a-d958-48db-9d8d-1a49091be4de\") " pod="openstack/nova-cell0-conductor-0" Oct 12 06:01:12 crc kubenswrapper[4930]: I1012 06:01:12.226696 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt5nw\" (UniqueName: \"kubernetes.io/projected/534efb5a-d958-48db-9d8d-1a49091be4de-kube-api-access-pt5nw\") pod \"nova-cell0-conductor-0\" (UID: \"534efb5a-d958-48db-9d8d-1a49091be4de\") " pod="openstack/nova-cell0-conductor-0" Oct 12 06:01:12 crc kubenswrapper[4930]: I1012 06:01:12.230989 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/534efb5a-d958-48db-9d8d-1a49091be4de-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"534efb5a-d958-48db-9d8d-1a49091be4de\") " pod="openstack/nova-cell0-conductor-0" Oct 12 06:01:12 crc kubenswrapper[4930]: I1012 06:01:12.231377 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534efb5a-d958-48db-9d8d-1a49091be4de-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"534efb5a-d958-48db-9d8d-1a49091be4de\") " pod="openstack/nova-cell0-conductor-0" Oct 12 06:01:12 crc kubenswrapper[4930]: I1012 06:01:12.245468 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt5nw\" (UniqueName: \"kubernetes.io/projected/534efb5a-d958-48db-9d8d-1a49091be4de-kube-api-access-pt5nw\") pod \"nova-cell0-conductor-0\" (UID: \"534efb5a-d958-48db-9d8d-1a49091be4de\") " pod="openstack/nova-cell0-conductor-0" Oct 12 06:01:12 crc kubenswrapper[4930]: I1012 06:01:12.329684 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 12 06:01:12 crc kubenswrapper[4930]: I1012 06:01:12.857368 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 12 06:01:12 crc kubenswrapper[4930]: I1012 06:01:12.976938 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"534efb5a-d958-48db-9d8d-1a49091be4de","Type":"ContainerStarted","Data":"a07b7540b19437276d1bc05f9243c7156639bf7eb2ffd12498f2b1e1a611279a"} Oct 12 06:01:13 crc kubenswrapper[4930]: I1012 06:01:13.866654 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 12 06:01:13 crc kubenswrapper[4930]: I1012 06:01:13.867972 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="86e4fb08-5a1f-4f1e-8cf0-7ef59386d66b" containerName="kube-state-metrics" containerID="cri-o://3fa6d184fb5c0aa4eade4a55080e02d334c5e40b8c240dea9baf23110864fd79" gracePeriod=30 Oct 12 06:01:13 crc kubenswrapper[4930]: I1012 06:01:13.990565 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"534efb5a-d958-48db-9d8d-1a49091be4de","Type":"ContainerStarted","Data":"f434583be3735b19dbd72c2bcd0ad50d46637a735fa8d28c1cf55c775ed4167a"} Oct 12 06:01:13 crc kubenswrapper[4930]: I1012 06:01:13.991854 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 12 06:01:14 crc kubenswrapper[4930]: I1012 06:01:14.037642 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=3.037623632 podStartE2EDuration="3.037623632s" podCreationTimestamp="2025-10-12 06:01:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:01:14.007345538 +0000 UTC m=+1206.549447323" watchObservedRunningTime="2025-10-12 06:01:14.037623632 +0000 UTC m=+1206.579725397" Oct 12 06:01:14 crc kubenswrapper[4930]: I1012 06:01:14.444466 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 12 06:01:14 crc kubenswrapper[4930]: I1012 06:01:14.604915 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87rq6\" (UniqueName: \"kubernetes.io/projected/86e4fb08-5a1f-4f1e-8cf0-7ef59386d66b-kube-api-access-87rq6\") pod \"86e4fb08-5a1f-4f1e-8cf0-7ef59386d66b\" (UID: \"86e4fb08-5a1f-4f1e-8cf0-7ef59386d66b\") " Oct 12 06:01:14 crc kubenswrapper[4930]: I1012 06:01:14.610401 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86e4fb08-5a1f-4f1e-8cf0-7ef59386d66b-kube-api-access-87rq6" (OuterVolumeSpecName: "kube-api-access-87rq6") pod "86e4fb08-5a1f-4f1e-8cf0-7ef59386d66b" (UID: "86e4fb08-5a1f-4f1e-8cf0-7ef59386d66b"). InnerVolumeSpecName "kube-api-access-87rq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:01:14 crc kubenswrapper[4930]: I1012 06:01:14.706858 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87rq6\" (UniqueName: \"kubernetes.io/projected/86e4fb08-5a1f-4f1e-8cf0-7ef59386d66b-kube-api-access-87rq6\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.009781 4930 generic.go:334] "Generic (PLEG): container finished" podID="86e4fb08-5a1f-4f1e-8cf0-7ef59386d66b" containerID="3fa6d184fb5c0aa4eade4a55080e02d334c5e40b8c240dea9baf23110864fd79" exitCode=2 Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.009881 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"86e4fb08-5a1f-4f1e-8cf0-7ef59386d66b","Type":"ContainerDied","Data":"3fa6d184fb5c0aa4eade4a55080e02d334c5e40b8c240dea9baf23110864fd79"} Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.009934 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"86e4fb08-5a1f-4f1e-8cf0-7ef59386d66b","Type":"ContainerDied","Data":"3f900e0e86e6104fce7c6e34d04508a5e51f581c9c394ebd4e4884c908ec1ac2"} Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.009956 4930 scope.go:117] "RemoveContainer" containerID="3fa6d184fb5c0aa4eade4a55080e02d334c5e40b8c240dea9baf23110864fd79" Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.010272 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.047930 4930 scope.go:117] "RemoveContainer" containerID="3fa6d184fb5c0aa4eade4a55080e02d334c5e40b8c240dea9baf23110864fd79" Oct 12 06:01:15 crc kubenswrapper[4930]: E1012 06:01:15.048361 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fa6d184fb5c0aa4eade4a55080e02d334c5e40b8c240dea9baf23110864fd79\": container with ID starting with 3fa6d184fb5c0aa4eade4a55080e02d334c5e40b8c240dea9baf23110864fd79 not found: ID does not exist" containerID="3fa6d184fb5c0aa4eade4a55080e02d334c5e40b8c240dea9baf23110864fd79" Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.048406 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fa6d184fb5c0aa4eade4a55080e02d334c5e40b8c240dea9baf23110864fd79"} err="failed to get container status \"3fa6d184fb5c0aa4eade4a55080e02d334c5e40b8c240dea9baf23110864fd79\": rpc error: code = NotFound desc = could not find container \"3fa6d184fb5c0aa4eade4a55080e02d334c5e40b8c240dea9baf23110864fd79\": container with ID starting with 3fa6d184fb5c0aa4eade4a55080e02d334c5e40b8c240dea9baf23110864fd79 not found: ID does not exist" Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.063889 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.074247 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.087939 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 12 06:01:15 crc kubenswrapper[4930]: E1012 06:01:15.088556 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86e4fb08-5a1f-4f1e-8cf0-7ef59386d66b" containerName="kube-state-metrics" Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.088583 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="86e4fb08-5a1f-4f1e-8cf0-7ef59386d66b" containerName="kube-state-metrics" Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.088915 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="86e4fb08-5a1f-4f1e-8cf0-7ef59386d66b" containerName="kube-state-metrics" Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.089886 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.092333 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.092658 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.099240 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.216284 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a5cf183-2e6a-408d-9baa-2f43f7b7b354-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8a5cf183-2e6a-408d-9baa-2f43f7b7b354\") " pod="openstack/kube-state-metrics-0" Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.216658 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8a5cf183-2e6a-408d-9baa-2f43f7b7b354-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8a5cf183-2e6a-408d-9baa-2f43f7b7b354\") " pod="openstack/kube-state-metrics-0" Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.216714 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwn4v\" (UniqueName: \"kubernetes.io/projected/8a5cf183-2e6a-408d-9baa-2f43f7b7b354-kube-api-access-hwn4v\") pod \"kube-state-metrics-0\" (UID: \"8a5cf183-2e6a-408d-9baa-2f43f7b7b354\") " pod="openstack/kube-state-metrics-0" Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.217484 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5cf183-2e6a-408d-9baa-2f43f7b7b354-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8a5cf183-2e6a-408d-9baa-2f43f7b7b354\") " pod="openstack/kube-state-metrics-0" Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.319976 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5cf183-2e6a-408d-9baa-2f43f7b7b354-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8a5cf183-2e6a-408d-9baa-2f43f7b7b354\") " pod="openstack/kube-state-metrics-0" Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.320125 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a5cf183-2e6a-408d-9baa-2f43f7b7b354-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8a5cf183-2e6a-408d-9baa-2f43f7b7b354\") " pod="openstack/kube-state-metrics-0" Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.320195 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8a5cf183-2e6a-408d-9baa-2f43f7b7b354-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8a5cf183-2e6a-408d-9baa-2f43f7b7b354\") " pod="openstack/kube-state-metrics-0" Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.320213 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwn4v\" 
(UniqueName: \"kubernetes.io/projected/8a5cf183-2e6a-408d-9baa-2f43f7b7b354-kube-api-access-hwn4v\") pod \"kube-state-metrics-0\" (UID: \"8a5cf183-2e6a-408d-9baa-2f43f7b7b354\") " pod="openstack/kube-state-metrics-0" Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.325242 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8a5cf183-2e6a-408d-9baa-2f43f7b7b354-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8a5cf183-2e6a-408d-9baa-2f43f7b7b354\") " pod="openstack/kube-state-metrics-0" Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.325926 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5cf183-2e6a-408d-9baa-2f43f7b7b354-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8a5cf183-2e6a-408d-9baa-2f43f7b7b354\") " pod="openstack/kube-state-metrics-0" Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.334715 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a5cf183-2e6a-408d-9baa-2f43f7b7b354-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8a5cf183-2e6a-408d-9baa-2f43f7b7b354\") " pod="openstack/kube-state-metrics-0" Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.349160 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwn4v\" (UniqueName: \"kubernetes.io/projected/8a5cf183-2e6a-408d-9baa-2f43f7b7b354-kube-api-access-hwn4v\") pod \"kube-state-metrics-0\" (UID: \"8a5cf183-2e6a-408d-9baa-2f43f7b7b354\") " pod="openstack/kube-state-metrics-0" Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.417565 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.761632 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.762168 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a3697520-ef2b-4b11-80a2-f47ca1de765b" containerName="ceilometer-central-agent" containerID="cri-o://3bedaa06bc81039767a716adef320cae4b890d4bd9b853e55e53feb8c7641bd5" gracePeriod=30 Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.762270 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a3697520-ef2b-4b11-80a2-f47ca1de765b" containerName="sg-core" containerID="cri-o://e7e1352593f0bfadce55c4d20a57e8bfb3263418a6531a8536a4a1bd984af0bf" gracePeriod=30 Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.762276 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a3697520-ef2b-4b11-80a2-f47ca1de765b" containerName="ceilometer-notification-agent" containerID="cri-o://71a020f0d422ccae493d5d7258303e1c02d534e2b30ff6a2d4495a7ee6827f60" gracePeriod=30 Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.762362 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a3697520-ef2b-4b11-80a2-f47ca1de765b" containerName="proxy-httpd" containerID="cri-o://b25fa8c77be0c182869e6f29dbe64ba2bac901c9d516c2602f81daa149e606a0" gracePeriod=30 Oct 12 06:01:15 crc kubenswrapper[4930]: I1012 06:01:15.956043 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 12 06:01:15 crc kubenswrapper[4930]: W1012 06:01:15.963487 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a5cf183_2e6a_408d_9baa_2f43f7b7b354.slice/crio-4c0e5f99dd63540a0d91111ebed1d0966cc5a446212ca33db10627b85358b3d9 WatchSource:0}: Error finding container 4c0e5f99dd63540a0d91111ebed1d0966cc5a446212ca33db10627b85358b3d9: Status 404 returned error can't find the container with id 4c0e5f99dd63540a0d91111ebed1d0966cc5a446212ca33db10627b85358b3d9 Oct 12 06:01:16 crc kubenswrapper[4930]: I1012 06:01:16.023413 4930 generic.go:334] "Generic (PLEG): container finished" podID="a3697520-ef2b-4b11-80a2-f47ca1de765b" containerID="b25fa8c77be0c182869e6f29dbe64ba2bac901c9d516c2602f81daa149e606a0" exitCode=0 Oct 12 06:01:16 crc kubenswrapper[4930]: I1012 06:01:16.023442 4930 generic.go:334] "Generic (PLEG): container finished" podID="a3697520-ef2b-4b11-80a2-f47ca1de765b" containerID="e7e1352593f0bfadce55c4d20a57e8bfb3263418a6531a8536a4a1bd984af0bf" exitCode=2 Oct 12 06:01:16 crc kubenswrapper[4930]: I1012 06:01:16.023493 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3697520-ef2b-4b11-80a2-f47ca1de765b","Type":"ContainerDied","Data":"b25fa8c77be0c182869e6f29dbe64ba2bac901c9d516c2602f81daa149e606a0"} Oct 12 06:01:16 crc kubenswrapper[4930]: I1012 06:01:16.023550 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3697520-ef2b-4b11-80a2-f47ca1de765b","Type":"ContainerDied","Data":"e7e1352593f0bfadce55c4d20a57e8bfb3263418a6531a8536a4a1bd984af0bf"} Oct 12 06:01:16 crc kubenswrapper[4930]: I1012 06:01:16.026463 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/kube-state-metrics-0" event={"ID":"8a5cf183-2e6a-408d-9baa-2f43f7b7b354","Type":"ContainerStarted","Data":"4c0e5f99dd63540a0d91111ebed1d0966cc5a446212ca33db10627b85358b3d9"} Oct 12 06:01:16 crc kubenswrapper[4930]: I1012 06:01:16.146225 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86e4fb08-5a1f-4f1e-8cf0-7ef59386d66b" path="/var/lib/kubelet/pods/86e4fb08-5a1f-4f1e-8cf0-7ef59386d66b/volumes" Oct 12 06:01:17 crc kubenswrapper[4930]: I1012 06:01:17.036852 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8a5cf183-2e6a-408d-9baa-2f43f7b7b354","Type":"ContainerStarted","Data":"f018821d496eea5e862adbd0c3cdfb483c74dc4eef31cdd70dfb0f4b6ecd9101"} Oct 12 06:01:17 crc kubenswrapper[4930]: I1012 06:01:17.037291 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 12 06:01:17 crc kubenswrapper[4930]: I1012 06:01:17.039459 4930 generic.go:334] "Generic (PLEG): container finished" podID="a3697520-ef2b-4b11-80a2-f47ca1de765b" containerID="3bedaa06bc81039767a716adef320cae4b890d4bd9b853e55e53feb8c7641bd5" exitCode=0 Oct 12 06:01:17 crc kubenswrapper[4930]: I1012 06:01:17.039502 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3697520-ef2b-4b11-80a2-f47ca1de765b","Type":"ContainerDied","Data":"3bedaa06bc81039767a716adef320cae4b890d4bd9b853e55e53feb8c7641bd5"} Oct 12 06:01:17 crc kubenswrapper[4930]: I1012 06:01:17.056373 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.674524404 podStartE2EDuration="2.056350671s" podCreationTimestamp="2025-10-12 06:01:15 +0000 UTC" firstStartedPulling="2025-10-12 06:01:15.965532842 +0000 UTC m=+1208.507634607" lastFinishedPulling="2025-10-12 06:01:16.347359109 +0000 UTC m=+1208.889460874" observedRunningTime="2025-10-12 06:01:17.056036263 +0000 UTC m=+1209.598138028" watchObservedRunningTime="2025-10-12 06:01:17.056350671 +0000 UTC m=+1209.598452436" Oct 12 06:01:22 crc kubenswrapper[4930]: I1012 06:01:22.381923 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 12 06:01:22 crc kubenswrapper[4930]: I1012 06:01:22.942229 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-fzfdt"] Oct 12 06:01:22 crc kubenswrapper[4930]: I1012 06:01:22.943553 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fzfdt" Oct 12 06:01:22 crc kubenswrapper[4930]: I1012 06:01:22.950585 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 12 06:01:22 crc kubenswrapper[4930]: I1012 06:01:22.950898 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 12 06:01:22 crc kubenswrapper[4930]: I1012 06:01:22.951622 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-fzfdt"] Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.006866 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/022414f2-80b4-4bc1-81e9-df49ebbaae8e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fzfdt\" (UID: \"022414f2-80b4-4bc1-81e9-df49ebbaae8e\") " pod="openstack/nova-cell0-cell-mapping-fzfdt" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.007192 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/022414f2-80b4-4bc1-81e9-df49ebbaae8e-config-data\") pod \"nova-cell0-cell-mapping-fzfdt\" (UID: \"022414f2-80b4-4bc1-81e9-df49ebbaae8e\") " pod="openstack/nova-cell0-cell-mapping-fzfdt" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.007226 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/022414f2-80b4-4bc1-81e9-df49ebbaae8e-scripts\") pod \"nova-cell0-cell-mapping-fzfdt\" (UID: \"022414f2-80b4-4bc1-81e9-df49ebbaae8e\") " pod="openstack/nova-cell0-cell-mapping-fzfdt" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.007250 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22zgt\" (UniqueName: \"kubernetes.io/projected/022414f2-80b4-4bc1-81e9-df49ebbaae8e-kube-api-access-22zgt\") pod \"nova-cell0-cell-mapping-fzfdt\" (UID: \"022414f2-80b4-4bc1-81e9-df49ebbaae8e\") " pod="openstack/nova-cell0-cell-mapping-fzfdt" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.108865 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/022414f2-80b4-4bc1-81e9-df49ebbaae8e-config-data\") pod \"nova-cell0-cell-mapping-fzfdt\" (UID: \"022414f2-80b4-4bc1-81e9-df49ebbaae8e\") " pod="openstack/nova-cell0-cell-mapping-fzfdt" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.109121 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/022414f2-80b4-4bc1-81e9-df49ebbaae8e-scripts\") pod \"nova-cell0-cell-mapping-fzfdt\" (UID: \"022414f2-80b4-4bc1-81e9-df49ebbaae8e\") " pod="openstack/nova-cell0-cell-mapping-fzfdt" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.109248 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22zgt\" (UniqueName: \"kubernetes.io/projected/022414f2-80b4-4bc1-81e9-df49ebbaae8e-kube-api-access-22zgt\") pod \"nova-cell0-cell-mapping-fzfdt\" (UID: \"022414f2-80b4-4bc1-81e9-df49ebbaae8e\") " pod="openstack/nova-cell0-cell-mapping-fzfdt" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.109412 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/022414f2-80b4-4bc1-81e9-df49ebbaae8e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fzfdt\" (UID: \"022414f2-80b4-4bc1-81e9-df49ebbaae8e\") " pod="openstack/nova-cell0-cell-mapping-fzfdt" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.121263 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/022414f2-80b4-4bc1-81e9-df49ebbaae8e-scripts\") pod \"nova-cell0-cell-mapping-fzfdt\" (UID: \"022414f2-80b4-4bc1-81e9-df49ebbaae8e\") " pod="openstack/nova-cell0-cell-mapping-fzfdt" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.121825 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/022414f2-80b4-4bc1-81e9-df49ebbaae8e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fzfdt\" (UID: \"022414f2-80b4-4bc1-81e9-df49ebbaae8e\") " pod="openstack/nova-cell0-cell-mapping-fzfdt" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.135688 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/022414f2-80b4-4bc1-81e9-df49ebbaae8e-config-data\") pod \"nova-cell0-cell-mapping-fzfdt\" (UID: \"022414f2-80b4-4bc1-81e9-df49ebbaae8e\") " pod="openstack/nova-cell0-cell-mapping-fzfdt" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.301336 4930 generic.go:334] "Generic (PLEG): container finished" podID="a3697520-ef2b-4b11-80a2-f47ca1de765b" containerID="71a020f0d422ccae493d5d7258303e1c02d534e2b30ff6a2d4495a7ee6827f60" exitCode=0 Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.301380 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3697520-ef2b-4b11-80a2-f47ca1de765b","Type":"ContainerDied","Data":"71a020f0d422ccae493d5d7258303e1c02d534e2b30ff6a2d4495a7ee6827f60"} Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.333155 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.334580 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.359996 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.417595 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69fb1eeb-e151-4db3-951e-473de5e1b0a9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"69fb1eeb-e151-4db3-951e-473de5e1b0a9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.417641 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxzrq\" (UniqueName: \"kubernetes.io/projected/69fb1eeb-e151-4db3-951e-473de5e1b0a9-kube-api-access-kxzrq\") pod \"nova-cell1-novncproxy-0\" (UID: \"69fb1eeb-e151-4db3-951e-473de5e1b0a9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.417692 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69fb1eeb-e151-4db3-951e-473de5e1b0a9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"69fb1eeb-e151-4db3-951e-473de5e1b0a9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.437406 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22zgt\" (UniqueName: \"kubernetes.io/projected/022414f2-80b4-4bc1-81e9-df49ebbaae8e-kube-api-access-22zgt\") pod \"nova-cell0-cell-mapping-fzfdt\" (UID: \"022414f2-80b4-4bc1-81e9-df49ebbaae8e\") " pod="openstack/nova-cell0-cell-mapping-fzfdt" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.449476 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.452893 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.510632 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 12 06:01:23 crc kubenswrapper[4930]: E1012 06:01:23.511147 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3697520-ef2b-4b11-80a2-f47ca1de765b" containerName="ceilometer-central-agent" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.511165 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3697520-ef2b-4b11-80a2-f47ca1de765b" containerName="ceilometer-central-agent" Oct 12 06:01:23 crc kubenswrapper[4930]: E1012 06:01:23.511174 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3697520-ef2b-4b11-80a2-f47ca1de765b" containerName="sg-core" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.511197 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3697520-ef2b-4b11-80a2-f47ca1de765b" containerName="sg-core" Oct 12 06:01:23 crc kubenswrapper[4930]: E1012 06:01:23.511214 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3697520-ef2b-4b11-80a2-f47ca1de765b" containerName="proxy-httpd" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.511220 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3697520-ef2b-4b11-80a2-f47ca1de765b" containerName="proxy-httpd" Oct 12 06:01:23 crc kubenswrapper[4930]: E1012 06:01:23.511227 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3697520-ef2b-4b11-80a2-f47ca1de765b" containerName="ceilometer-notification-agent" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.511233 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3697520-ef2b-4b11-80a2-f47ca1de765b" containerName="ceilometer-notification-agent" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.511413 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3697520-ef2b-4b11-80a2-f47ca1de765b" containerName="ceilometer-central-agent" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.511433 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3697520-ef2b-4b11-80a2-f47ca1de765b" containerName="sg-core" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.511440 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3697520-ef2b-4b11-80a2-f47ca1de765b" containerName="proxy-httpd" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.511450 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3697520-ef2b-4b11-80a2-f47ca1de765b" containerName="ceilometer-notification-agent" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.512485 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.519600 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3697520-ef2b-4b11-80a2-f47ca1de765b-sg-core-conf-yaml\") pod \"a3697520-ef2b-4b11-80a2-f47ca1de765b\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.519672 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3697520-ef2b-4b11-80a2-f47ca1de765b-scripts\") pod \"a3697520-ef2b-4b11-80a2-f47ca1de765b\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.519730 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3697520-ef2b-4b11-80a2-f47ca1de765b-log-httpd\") pod \"a3697520-ef2b-4b11-80a2-f47ca1de765b\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.519802 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3697520-ef2b-4b11-80a2-f47ca1de765b-run-httpd\") pod \"a3697520-ef2b-4b11-80a2-f47ca1de765b\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.519859 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3697520-ef2b-4b11-80a2-f47ca1de765b-combined-ca-bundle\") pod \"a3697520-ef2b-4b11-80a2-f47ca1de765b\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.519900 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thmdt\" (UniqueName: \"kubernetes.io/projected/a3697520-ef2b-4b11-80a2-f47ca1de765b-kube-api-access-thmdt\") pod \"a3697520-ef2b-4b11-80a2-f47ca1de765b\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.519993 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3697520-ef2b-4b11-80a2-f47ca1de765b-config-data\") pod \"a3697520-ef2b-4b11-80a2-f47ca1de765b\" (UID: \"a3697520-ef2b-4b11-80a2-f47ca1de765b\") " Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.520214 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69fb1eeb-e151-4db3-951e-473de5e1b0a9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"69fb1eeb-e151-4db3-951e-473de5e1b0a9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.520341 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69fb1eeb-e151-4db3-951e-473de5e1b0a9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"69fb1eeb-e151-4db3-951e-473de5e1b0a9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.520362 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxzrq\" (UniqueName: \"kubernetes.io/projected/69fb1eeb-e151-4db3-951e-473de5e1b0a9-kube-api-access-kxzrq\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"69fb1eeb-e151-4db3-951e-473de5e1b0a9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.523661 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.525313 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3697520-ef2b-4b11-80a2-f47ca1de765b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a3697520-ef2b-4b11-80a2-f47ca1de765b" (UID: "a3697520-ef2b-4b11-80a2-f47ca1de765b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.527979 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.532253 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3697520-ef2b-4b11-80a2-f47ca1de765b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a3697520-ef2b-4b11-80a2-f47ca1de765b" (UID: "a3697520-ef2b-4b11-80a2-f47ca1de765b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.534932 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.535263 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69fb1eeb-e151-4db3-951e-473de5e1b0a9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"69fb1eeb-e151-4db3-951e-473de5e1b0a9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.545385 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69fb1eeb-e151-4db3-951e-473de5e1b0a9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"69fb1eeb-e151-4db3-951e-473de5e1b0a9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.545523 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3697520-ef2b-4b11-80a2-f47ca1de765b-scripts" (OuterVolumeSpecName: "scripts") pod "a3697520-ef2b-4b11-80a2-f47ca1de765b" (UID: "a3697520-ef2b-4b11-80a2-f47ca1de765b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.545918 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3697520-ef2b-4b11-80a2-f47ca1de765b-kube-api-access-thmdt" (OuterVolumeSpecName: "kube-api-access-thmdt") pod "a3697520-ef2b-4b11-80a2-f47ca1de765b" (UID: "a3697520-ef2b-4b11-80a2-f47ca1de765b"). InnerVolumeSpecName "kube-api-access-thmdt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.559597 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.565327 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.577010 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxzrq\" (UniqueName: \"kubernetes.io/projected/69fb1eeb-e151-4db3-951e-473de5e1b0a9-kube-api-access-kxzrq\") pod \"nova-cell1-novncproxy-0\" (UID: \"69fb1eeb-e151-4db3-951e-473de5e1b0a9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.577086 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.586229 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fzfdt" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.615763 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3697520-ef2b-4b11-80a2-f47ca1de765b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a3697520-ef2b-4b11-80a2-f47ca1de765b" (UID: "a3697520-ef2b-4b11-80a2-f47ca1de765b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.622183 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e95b56f-80e3-44ae-b28b-f5cca7559e36-config-data\") pod \"nova-api-0\" (UID: \"3e95b56f-80e3-44ae-b28b-f5cca7559e36\") " pod="openstack/nova-api-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.622553 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zr5q\" (UniqueName: \"kubernetes.io/projected/4bdf4447-1912-4e75-b0bd-f32a424abb42-kube-api-access-4zr5q\") pod \"nova-metadata-0\" (UID: \"4bdf4447-1912-4e75-b0bd-f32a424abb42\") " pod="openstack/nova-metadata-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.622696 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bdf4447-1912-4e75-b0bd-f32a424abb42-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4bdf4447-1912-4e75-b0bd-f32a424abb42\") " pod="openstack/nova-metadata-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.622833 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bdf4447-1912-4e75-b0bd-f32a424abb42-config-data\") pod \"nova-metadata-0\" (UID: \"4bdf4447-1912-4e75-b0bd-f32a424abb42\") " pod="openstack/nova-metadata-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.622983 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e95b56f-80e3-44ae-b28b-f5cca7559e36-logs\") pod \"nova-api-0\" (UID: \"3e95b56f-80e3-44ae-b28b-f5cca7559e36\") " pod="openstack/nova-api-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.623091 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-8s68t\" (UniqueName: \"kubernetes.io/projected/3e95b56f-80e3-44ae-b28b-f5cca7559e36-kube-api-access-8s68t\") pod \"nova-api-0\" (UID: \"3e95b56f-80e3-44ae-b28b-f5cca7559e36\") " pod="openstack/nova-api-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.623245 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bdf4447-1912-4e75-b0bd-f32a424abb42-logs\") pod \"nova-metadata-0\" (UID: \"4bdf4447-1912-4e75-b0bd-f32a424abb42\") " pod="openstack/nova-metadata-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.623372 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e95b56f-80e3-44ae-b28b-f5cca7559e36-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3e95b56f-80e3-44ae-b28b-f5cca7559e36\") " pod="openstack/nova-api-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.623501 4930 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3697520-ef2b-4b11-80a2-f47ca1de765b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.623588 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3697520-ef2b-4b11-80a2-f47ca1de765b-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.623691 4930 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3697520-ef2b-4b11-80a2-f47ca1de765b-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.623778 4930 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3697520-ef2b-4b11-80a2-f47ca1de765b-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.623847 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thmdt\" (UniqueName: \"kubernetes.io/projected/a3697520-ef2b-4b11-80a2-f47ca1de765b-kube-api-access-thmdt\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.700387 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3697520-ef2b-4b11-80a2-f47ca1de765b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3697520-ef2b-4b11-80a2-f47ca1de765b" (UID: "a3697520-ef2b-4b11-80a2-f47ca1de765b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.702785 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-844fc57f6f-7cfwk"] Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.704459 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.727055 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e95b56f-80e3-44ae-b28b-f5cca7559e36-logs\") pod \"nova-api-0\" (UID: \"3e95b56f-80e3-44ae-b28b-f5cca7559e36\") " pod="openstack/nova-api-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.727108 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s68t\" (UniqueName: \"kubernetes.io/projected/3e95b56f-80e3-44ae-b28b-f5cca7559e36-kube-api-access-8s68t\") pod \"nova-api-0\" (UID: \"3e95b56f-80e3-44ae-b28b-f5cca7559e36\") " pod="openstack/nova-api-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.727125 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bdf4447-1912-4e75-b0bd-f32a424abb42-logs\") pod \"nova-metadata-0\" (UID: \"4bdf4447-1912-4e75-b0bd-f32a424abb42\") " pod="openstack/nova-metadata-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.727165 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e95b56f-80e3-44ae-b28b-f5cca7559e36-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3e95b56f-80e3-44ae-b28b-f5cca7559e36\") " pod="openstack/nova-api-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.727190 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e95b56f-80e3-44ae-b28b-f5cca7559e36-config-data\") pod \"nova-api-0\" (UID: \"3e95b56f-80e3-44ae-b28b-f5cca7559e36\") " pod="openstack/nova-api-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.727214 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zr5q\" (UniqueName: \"kubernetes.io/projected/4bdf4447-1912-4e75-b0bd-f32a424abb42-kube-api-access-4zr5q\") pod \"nova-metadata-0\" (UID: \"4bdf4447-1912-4e75-b0bd-f32a424abb42\") " pod="openstack/nova-metadata-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.728424 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.728610 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bdf4447-1912-4e75-b0bd-f32a424abb42-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4bdf4447-1912-4e75-b0bd-f32a424abb42\") " pod="openstack/nova-metadata-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.728670 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bdf4447-1912-4e75-b0bd-f32a424abb42-config-data\") pod \"nova-metadata-0\" (UID: \"4bdf4447-1912-4e75-b0bd-f32a424abb42\") " pod="openstack/nova-metadata-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.728828 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3697520-ef2b-4b11-80a2-f47ca1de765b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.729912 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.738885 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bdf4447-1912-4e75-b0bd-f32a424abb42-config-data\") pod \"nova-metadata-0\" (UID: \"4bdf4447-1912-4e75-b0bd-f32a424abb42\") " pod="openstack/nova-metadata-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.739261 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e95b56f-80e3-44ae-b28b-f5cca7559e36-logs\") pod \"nova-api-0\" (UID: \"3e95b56f-80e3-44ae-b28b-f5cca7559e36\") " pod="openstack/nova-api-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.739764 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bdf4447-1912-4e75-b0bd-f32a424abb42-logs\") pod \"nova-metadata-0\" (UID: \"4bdf4447-1912-4e75-b0bd-f32a424abb42\") " pod="openstack/nova-metadata-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.744383 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.751110 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.756767 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e95b56f-80e3-44ae-b28b-f5cca7559e36-config-data\") pod \"nova-api-0\" (UID: \"3e95b56f-80e3-44ae-b28b-f5cca7559e36\") " pod="openstack/nova-api-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.767482 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-844fc57f6f-7cfwk"] Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.772582 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bdf4447-1912-4e75-b0bd-f32a424abb42-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4bdf4447-1912-4e75-b0bd-f32a424abb42\") " pod="openstack/nova-metadata-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.779595 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s68t\" (UniqueName: \"kubernetes.io/projected/3e95b56f-80e3-44ae-b28b-f5cca7559e36-kube-api-access-8s68t\") pod \"nova-api-0\" (UID: \"3e95b56f-80e3-44ae-b28b-f5cca7559e36\") " pod="openstack/nova-api-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.780292 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e95b56f-80e3-44ae-b28b-f5cca7559e36-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3e95b56f-80e3-44ae-b28b-f5cca7559e36\") " pod="openstack/nova-api-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.785180 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zr5q\" (UniqueName: \"kubernetes.io/projected/4bdf4447-1912-4e75-b0bd-f32a424abb42-kube-api-access-4zr5q\") pod \"nova-metadata-0\" (UID: \"4bdf4447-1912-4e75-b0bd-f32a424abb42\") " pod="openstack/nova-metadata-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.800279 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 
06:01:23.833931 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-ovsdbserver-sb\") pod \"dnsmasq-dns-844fc57f6f-7cfwk\" (UID: \"159b117d-1ad6-4bb3-a748-3946f54ca207\") " pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.834172 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-config\") pod \"dnsmasq-dns-844fc57f6f-7cfwk\" (UID: \"159b117d-1ad6-4bb3-a748-3946f54ca207\") " pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.834213 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-dns-svc\") pod \"dnsmasq-dns-844fc57f6f-7cfwk\" (UID: \"159b117d-1ad6-4bb3-a748-3946f54ca207\") " pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.834239 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8mjl\" (UniqueName: \"kubernetes.io/projected/159b117d-1ad6-4bb3-a748-3946f54ca207-kube-api-access-r8mjl\") pod \"dnsmasq-dns-844fc57f6f-7cfwk\" (UID: \"159b117d-1ad6-4bb3-a748-3946f54ca207\") " pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.834263 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn46c\" (UniqueName: \"kubernetes.io/projected/eb27a965-e24a-4e4a-8577-5abb8c38de00-kube-api-access-gn46c\") pod \"nova-scheduler-0\" (UID: \"eb27a965-e24a-4e4a-8577-5abb8c38de00\") " pod="openstack/nova-scheduler-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.834297 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-dns-swift-storage-0\") pod \"dnsmasq-dns-844fc57f6f-7cfwk\" (UID: \"159b117d-1ad6-4bb3-a748-3946f54ca207\") " pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.834349 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-ovsdbserver-nb\") pod \"dnsmasq-dns-844fc57f6f-7cfwk\" (UID: \"159b117d-1ad6-4bb3-a748-3946f54ca207\") " pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.834397 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb27a965-e24a-4e4a-8577-5abb8c38de00-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"eb27a965-e24a-4e4a-8577-5abb8c38de00\") " pod="openstack/nova-scheduler-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.834436 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb27a965-e24a-4e4a-8577-5abb8c38de00-config-data\") pod \"nova-scheduler-0\" (UID: \"eb27a965-e24a-4e4a-8577-5abb8c38de00\") " 
pod="openstack/nova-scheduler-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.922967 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.923996 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3697520-ef2b-4b11-80a2-f47ca1de765b-config-data" (OuterVolumeSpecName: "config-data") pod "a3697520-ef2b-4b11-80a2-f47ca1de765b" (UID: "a3697520-ef2b-4b11-80a2-f47ca1de765b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.928559 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.976446 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-ovsdbserver-nb\") pod \"dnsmasq-dns-844fc57f6f-7cfwk\" (UID: \"159b117d-1ad6-4bb3-a748-3946f54ca207\") " pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.976706 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb27a965-e24a-4e4a-8577-5abb8c38de00-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"eb27a965-e24a-4e4a-8577-5abb8c38de00\") " pod="openstack/nova-scheduler-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.976814 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb27a965-e24a-4e4a-8577-5abb8c38de00-config-data\") pod \"nova-scheduler-0\" (UID: \"eb27a965-e24a-4e4a-8577-5abb8c38de00\") " pod="openstack/nova-scheduler-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.976923 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-ovsdbserver-sb\") pod \"dnsmasq-dns-844fc57f6f-7cfwk\" (UID: \"159b117d-1ad6-4bb3-a748-3946f54ca207\") " pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.977014 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-config\") pod \"dnsmasq-dns-844fc57f6f-7cfwk\" (UID: \"159b117d-1ad6-4bb3-a748-3946f54ca207\") " pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.977108 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-dns-svc\") pod \"dnsmasq-dns-844fc57f6f-7cfwk\" (UID: \"159b117d-1ad6-4bb3-a748-3946f54ca207\") " pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.977190 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8mjl\" (UniqueName: \"kubernetes.io/projected/159b117d-1ad6-4bb3-a748-3946f54ca207-kube-api-access-r8mjl\") pod \"dnsmasq-dns-844fc57f6f-7cfwk\" (UID: \"159b117d-1ad6-4bb3-a748-3946f54ca207\") " pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.977269 4930 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gn46c\" (UniqueName: \"kubernetes.io/projected/eb27a965-e24a-4e4a-8577-5abb8c38de00-kube-api-access-gn46c\") pod \"nova-scheduler-0\" (UID: \"eb27a965-e24a-4e4a-8577-5abb8c38de00\") " pod="openstack/nova-scheduler-0" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.977359 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-dns-swift-storage-0\") pod \"dnsmasq-dns-844fc57f6f-7cfwk\" (UID: \"159b117d-1ad6-4bb3-a748-3946f54ca207\") " pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.977495 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3697520-ef2b-4b11-80a2-f47ca1de765b-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.978667 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-dns-swift-storage-0\") pod \"dnsmasq-dns-844fc57f6f-7cfwk\" (UID: \"159b117d-1ad6-4bb3-a748-3946f54ca207\") " pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.980141 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-config\") pod \"dnsmasq-dns-844fc57f6f-7cfwk\" (UID: \"159b117d-1ad6-4bb3-a748-3946f54ca207\") " pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.981090 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-ovsdbserver-nb\") pod \"dnsmasq-dns-844fc57f6f-7cfwk\" (UID: \"159b117d-1ad6-4bb3-a748-3946f54ca207\") " pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.981772 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-dns-svc\") pod \"dnsmasq-dns-844fc57f6f-7cfwk\" (UID: \"159b117d-1ad6-4bb3-a748-3946f54ca207\") " pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" Oct 12 06:01:23 crc kubenswrapper[4930]: I1012 06:01:23.983963 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-ovsdbserver-sb\") pod \"dnsmasq-dns-844fc57f6f-7cfwk\" (UID: \"159b117d-1ad6-4bb3-a748-3946f54ca207\") " pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.001706 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb27a965-e24a-4e4a-8577-5abb8c38de00-config-data\") pod \"nova-scheduler-0\" (UID: \"eb27a965-e24a-4e4a-8577-5abb8c38de00\") " pod="openstack/nova-scheduler-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.017167 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8mjl\" (UniqueName: \"kubernetes.io/projected/159b117d-1ad6-4bb3-a748-3946f54ca207-kube-api-access-r8mjl\") pod \"dnsmasq-dns-844fc57f6f-7cfwk\" (UID: \"159b117d-1ad6-4bb3-a748-3946f54ca207\") " 
pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.017384 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn46c\" (UniqueName: \"kubernetes.io/projected/eb27a965-e24a-4e4a-8577-5abb8c38de00-kube-api-access-gn46c\") pod \"nova-scheduler-0\" (UID: \"eb27a965-e24a-4e4a-8577-5abb8c38de00\") " pod="openstack/nova-scheduler-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.027704 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb27a965-e24a-4e4a-8577-5abb8c38de00-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"eb27a965-e24a-4e4a-8577-5abb8c38de00\") " pod="openstack/nova-scheduler-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.072218 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.093357 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.294267 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-fzfdt"] Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.341996 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3697520-ef2b-4b11-80a2-f47ca1de765b","Type":"ContainerDied","Data":"eb1d987cda95b9e4fe9da735477a54fdfaa201a263d695bcdb741473889ca596"} Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.342314 4930 scope.go:117] "RemoveContainer" containerID="b25fa8c77be0c182869e6f29dbe64ba2bac901c9d516c2602f81daa149e606a0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.342475 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.392612 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.406066 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.428436 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.431951 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.436964 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.437148 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.444843 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.463992 4930 scope.go:117] "RemoveContainer" containerID="e7e1352593f0bfadce55c4d20a57e8bfb3263418a6531a8536a4a1bd984af0bf" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.492998 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-scripts\") pod \"ceilometer-0\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.493058 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.493081 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.493114 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c04de04-91cf-4e56-85be-6a95e92f0d73-run-httpd\") pod \"ceilometer-0\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.493157 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.493174 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c04de04-91cf-4e56-85be-6a95e92f0d73-log-httpd\") pod \"ceilometer-0\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.493217 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-config-data\") pod \"ceilometer-0\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.493246 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvzq9\" (UniqueName: 
\"kubernetes.io/projected/2c04de04-91cf-4e56-85be-6a95e92f0d73-kube-api-access-mvzq9\") pod \"ceilometer-0\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.494032 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.570909 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.572643 4930 scope.go:117] "RemoveContainer" containerID="71a020f0d422ccae493d5d7258303e1c02d534e2b30ff6a2d4495a7ee6827f60" Oct 12 06:01:24 crc kubenswrapper[4930]: W1012 06:01:24.594908 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69fb1eeb_e151_4db3_951e_473de5e1b0a9.slice/crio-71967ec97c8bbf32106f045002b326194efebc1b63418504a1702b9f8145385b WatchSource:0}: Error finding container 71967ec97c8bbf32106f045002b326194efebc1b63418504a1702b9f8145385b: Status 404 returned error can't find the container with id 71967ec97c8bbf32106f045002b326194efebc1b63418504a1702b9f8145385b Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.595826 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvzq9\" (UniqueName: \"kubernetes.io/projected/2c04de04-91cf-4e56-85be-6a95e92f0d73-kube-api-access-mvzq9\") pod \"ceilometer-0\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.595909 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-scripts\") pod \"ceilometer-0\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.595949 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.595978 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.596009 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c04de04-91cf-4e56-85be-6a95e92f0d73-run-httpd\") pod \"ceilometer-0\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.596054 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.596073 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2c04de04-91cf-4e56-85be-6a95e92f0d73-log-httpd\") pod \"ceilometer-0\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.596112 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-config-data\") pod \"ceilometer-0\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.613093 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c04de04-91cf-4e56-85be-6a95e92f0d73-log-httpd\") pod \"ceilometer-0\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.617052 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c04de04-91cf-4e56-85be-6a95e92f0d73-run-httpd\") pod \"ceilometer-0\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.617915 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.620002 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.620325 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-config-data\") pod \"ceilometer-0\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.622272 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.622810 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-scripts\") pod \"ceilometer-0\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.641278 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvzq9\" (UniqueName: \"kubernetes.io/projected/2c04de04-91cf-4e56-85be-6a95e92f0d73-kube-api-access-mvzq9\") pod \"ceilometer-0\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.656726 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.658894 4930 scope.go:117] "RemoveContainer" 
containerID="3bedaa06bc81039767a716adef320cae4b890d4bd9b853e55e53feb8c7641bd5" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.810151 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.844265 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mqwg4"] Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.858828 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mqwg4"] Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.858938 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mqwg4" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.861288 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.861668 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.902297 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr7vr\" (UniqueName: \"kubernetes.io/projected/25b0cb62-9d52-4ba1-9c00-3cda68c43da8-kube-api-access-kr7vr\") pod \"nova-cell1-conductor-db-sync-mqwg4\" (UID: \"25b0cb62-9d52-4ba1-9c00-3cda68c43da8\") " pod="openstack/nova-cell1-conductor-db-sync-mqwg4" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.902378 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25b0cb62-9d52-4ba1-9c00-3cda68c43da8-config-data\") pod \"nova-cell1-conductor-db-sync-mqwg4\" (UID: \"25b0cb62-9d52-4ba1-9c00-3cda68c43da8\") " pod="openstack/nova-cell1-conductor-db-sync-mqwg4" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.902436 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b0cb62-9d52-4ba1-9c00-3cda68c43da8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mqwg4\" (UID: \"25b0cb62-9d52-4ba1-9c00-3cda68c43da8\") " pod="openstack/nova-cell1-conductor-db-sync-mqwg4" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.902469 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25b0cb62-9d52-4ba1-9c00-3cda68c43da8-scripts\") pod \"nova-cell1-conductor-db-sync-mqwg4\" (UID: \"25b0cb62-9d52-4ba1-9c00-3cda68c43da8\") " pod="openstack/nova-cell1-conductor-db-sync-mqwg4" Oct 12 06:01:24 crc kubenswrapper[4930]: I1012 06:01:24.939678 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 06:01:25 crc kubenswrapper[4930]: I1012 06:01:25.003935 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr7vr\" (UniqueName: \"kubernetes.io/projected/25b0cb62-9d52-4ba1-9c00-3cda68c43da8-kube-api-access-kr7vr\") pod \"nova-cell1-conductor-db-sync-mqwg4\" (UID: \"25b0cb62-9d52-4ba1-9c00-3cda68c43da8\") " pod="openstack/nova-cell1-conductor-db-sync-mqwg4" Oct 12 06:01:25 crc kubenswrapper[4930]: I1012 06:01:25.004015 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/25b0cb62-9d52-4ba1-9c00-3cda68c43da8-config-data\") pod \"nova-cell1-conductor-db-sync-mqwg4\" (UID: \"25b0cb62-9d52-4ba1-9c00-3cda68c43da8\") " pod="openstack/nova-cell1-conductor-db-sync-mqwg4" Oct 12 06:01:25 crc kubenswrapper[4930]: I1012 06:01:25.004043 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b0cb62-9d52-4ba1-9c00-3cda68c43da8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mqwg4\" (UID: \"25b0cb62-9d52-4ba1-9c00-3cda68c43da8\") " pod="openstack/nova-cell1-conductor-db-sync-mqwg4" Oct 12 06:01:25 crc kubenswrapper[4930]: I1012 06:01:25.004076 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25b0cb62-9d52-4ba1-9c00-3cda68c43da8-scripts\") pod \"nova-cell1-conductor-db-sync-mqwg4\" (UID: \"25b0cb62-9d52-4ba1-9c00-3cda68c43da8\") " pod="openstack/nova-cell1-conductor-db-sync-mqwg4" Oct 12 06:01:25 crc kubenswrapper[4930]: I1012 06:01:25.005988 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 06:01:25 crc kubenswrapper[4930]: I1012 06:01:25.013765 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25b0cb62-9d52-4ba1-9c00-3cda68c43da8-config-data\") pod \"nova-cell1-conductor-db-sync-mqwg4\" (UID: \"25b0cb62-9d52-4ba1-9c00-3cda68c43da8\") " pod="openstack/nova-cell1-conductor-db-sync-mqwg4" Oct 12 06:01:25 crc kubenswrapper[4930]: I1012 06:01:25.028503 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25b0cb62-9d52-4ba1-9c00-3cda68c43da8-scripts\") pod \"nova-cell1-conductor-db-sync-mqwg4\" (UID: \"25b0cb62-9d52-4ba1-9c00-3cda68c43da8\") " pod="openstack/nova-cell1-conductor-db-sync-mqwg4" Oct 12 06:01:25 crc kubenswrapper[4930]: I1012 06:01:25.043569 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr7vr\" (UniqueName: \"kubernetes.io/projected/25b0cb62-9d52-4ba1-9c00-3cda68c43da8-kube-api-access-kr7vr\") pod \"nova-cell1-conductor-db-sync-mqwg4\" (UID: \"25b0cb62-9d52-4ba1-9c00-3cda68c43da8\") " pod="openstack/nova-cell1-conductor-db-sync-mqwg4" Oct 12 06:01:25 crc kubenswrapper[4930]: I1012 06:01:25.043977 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b0cb62-9d52-4ba1-9c00-3cda68c43da8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mqwg4\" (UID: \"25b0cb62-9d52-4ba1-9c00-3cda68c43da8\") " pod="openstack/nova-cell1-conductor-db-sync-mqwg4" Oct 12 06:01:25 crc kubenswrapper[4930]: I1012 06:01:25.044332 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-844fc57f6f-7cfwk"] Oct 12 06:01:25 crc kubenswrapper[4930]: I1012 06:01:25.069584 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mqwg4" Oct 12 06:01:25 crc kubenswrapper[4930]: I1012 06:01:25.353939 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3e95b56f-80e3-44ae-b28b-f5cca7559e36","Type":"ContainerStarted","Data":"1c48f36556bb35308e7ea8c47fe4ac7d0adb684f67c55272f194eee49b67132b"} Oct 12 06:01:25 crc kubenswrapper[4930]: I1012 06:01:25.364645 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" event={"ID":"159b117d-1ad6-4bb3-a748-3946f54ca207","Type":"ContainerStarted","Data":"ff4cf649b495608a58c66506264ad553c97d977b26353de5d5b1b431201bad9f"} Oct 12 06:01:25 crc kubenswrapper[4930]: I1012 06:01:25.368587 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fzfdt" event={"ID":"022414f2-80b4-4bc1-81e9-df49ebbaae8e","Type":"ContainerStarted","Data":"11587a9e8bb1e90a07706f5bca9e8051fea00fd934ade62a22227ec2378089af"} Oct 12 06:01:25 crc kubenswrapper[4930]: I1012 06:01:25.368623 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fzfdt" event={"ID":"022414f2-80b4-4bc1-81e9-df49ebbaae8e","Type":"ContainerStarted","Data":"726c1183404d2c5b0930f2caabee3ba82d1e3f9f4130dfbdcef130c84474ebba"} Oct 12 06:01:25 crc kubenswrapper[4930]: I1012 06:01:25.390657 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"69fb1eeb-e151-4db3-951e-473de5e1b0a9","Type":"ContainerStarted","Data":"71967ec97c8bbf32106f045002b326194efebc1b63418504a1702b9f8145385b"} Oct 12 06:01:25 crc kubenswrapper[4930]: I1012 06:01:25.396050 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"eb27a965-e24a-4e4a-8577-5abb8c38de00","Type":"ContainerStarted","Data":"f8831add24f3ea797c2e9025a832cdaf8744db232d19f73dcdc44b41f3a587e5"} Oct 12 06:01:25 crc kubenswrapper[4930]: I1012 06:01:25.399835 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4bdf4447-1912-4e75-b0bd-f32a424abb42","Type":"ContainerStarted","Data":"33bf05497795143c9665c42efea77ae4f6e096cf6225469fcb76e239d7fba0d2"} Oct 12 06:01:25 crc kubenswrapper[4930]: I1012 06:01:25.406422 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-fzfdt" podStartSLOduration=3.406397356 podStartE2EDuration="3.406397356s" podCreationTimestamp="2025-10-12 06:01:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:01:25.387463274 +0000 UTC m=+1217.929565039" watchObservedRunningTime="2025-10-12 06:01:25.406397356 +0000 UTC m=+1217.948499121" Oct 12 06:01:25 crc kubenswrapper[4930]: I1012 06:01:25.431890 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 12 06:01:25 crc kubenswrapper[4930]: I1012 06:01:25.513099 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:01:26 crc kubenswrapper[4930]: I1012 06:01:26.166978 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3697520-ef2b-4b11-80a2-f47ca1de765b" path="/var/lib/kubelet/pods/a3697520-ef2b-4b11-80a2-f47ca1de765b/volumes" Oct 12 06:01:26 crc kubenswrapper[4930]: I1012 06:01:26.456041 4930 generic.go:334] "Generic (PLEG): container finished" podID="159b117d-1ad6-4bb3-a748-3946f54ca207" 
containerID="32e7cbedc200dd4f84cfe55abea58ad69a7e33232a477b7d745ae332d8deba01" exitCode=0 Oct 12 06:01:26 crc kubenswrapper[4930]: I1012 06:01:26.457576 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" event={"ID":"159b117d-1ad6-4bb3-a748-3946f54ca207","Type":"ContainerDied","Data":"32e7cbedc200dd4f84cfe55abea58ad69a7e33232a477b7d745ae332d8deba01"} Oct 12 06:01:26 crc kubenswrapper[4930]: I1012 06:01:26.468919 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c04de04-91cf-4e56-85be-6a95e92f0d73","Type":"ContainerStarted","Data":"45c450a08e3387e612e626b73f1913a4bfe0b896d87b1faf8bc9ea2296001eff"} Oct 12 06:01:26 crc kubenswrapper[4930]: I1012 06:01:26.468967 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c04de04-91cf-4e56-85be-6a95e92f0d73","Type":"ContainerStarted","Data":"f5947b343785f8d5d157cc3f27d3076c391d472e1d9f88aa99b43e969fcb5106"} Oct 12 06:01:26 crc kubenswrapper[4930]: I1012 06:01:26.653783 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mqwg4"] Oct 12 06:01:27 crc kubenswrapper[4930]: I1012 06:01:27.411649 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 12 06:01:27 crc kubenswrapper[4930]: I1012 06:01:27.425872 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 06:01:27 crc kubenswrapper[4930]: I1012 06:01:27.482051 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c04de04-91cf-4e56-85be-6a95e92f0d73","Type":"ContainerStarted","Data":"1945b8ffd802b593294d77243e90298b6713eed95da1501888c5fe65d73f1636"} Oct 12 06:01:27 crc kubenswrapper[4930]: I1012 06:01:27.487336 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" event={"ID":"159b117d-1ad6-4bb3-a748-3946f54ca207","Type":"ContainerStarted","Data":"6679c6b163fbeb8e483b159bf04e6c8bd343318b3008456e6fe5ccd838e2f13e"} Oct 12 06:01:27 crc kubenswrapper[4930]: I1012 06:01:27.487525 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" Oct 12 06:01:27 crc kubenswrapper[4930]: I1012 06:01:27.492484 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mqwg4" event={"ID":"25b0cb62-9d52-4ba1-9c00-3cda68c43da8","Type":"ContainerStarted","Data":"7c5c4a121c4d9f3987ec65112d4c46c15bea1770075f362a6949b4fd38d1788f"} Oct 12 06:01:27 crc kubenswrapper[4930]: I1012 06:01:27.492525 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mqwg4" event={"ID":"25b0cb62-9d52-4ba1-9c00-3cda68c43da8","Type":"ContainerStarted","Data":"e9e93e30bf5d10cc71ff961bbba68fdd7eded580c4c3e7aef0e24f8951e1f0d4"} Oct 12 06:01:27 crc kubenswrapper[4930]: I1012 06:01:27.507287 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" podStartSLOduration=4.507269681 podStartE2EDuration="4.507269681s" podCreationTimestamp="2025-10-12 06:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:01:27.50639352 +0000 UTC m=+1220.048495275" watchObservedRunningTime="2025-10-12 06:01:27.507269681 +0000 UTC m=+1220.049371446" Oct 12 06:01:27 crc kubenswrapper[4930]: I1012 06:01:27.530061 4930 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-mqwg4" podStartSLOduration=3.530047089 podStartE2EDuration="3.530047089s" podCreationTimestamp="2025-10-12 06:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:01:27.527766122 +0000 UTC m=+1220.069867887" watchObservedRunningTime="2025-10-12 06:01:27.530047089 +0000 UTC m=+1220.072148854" Oct 12 06:01:30 crc kubenswrapper[4930]: I1012 06:01:30.523459 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"eb27a965-e24a-4e4a-8577-5abb8c38de00","Type":"ContainerStarted","Data":"1d2c46de744b137bc8ae99a8e9a02fb6382db83c9e938d507b984b06eda756b8"} Oct 12 06:01:30 crc kubenswrapper[4930]: I1012 06:01:30.525922 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"69fb1eeb-e151-4db3-951e-473de5e1b0a9","Type":"ContainerStarted","Data":"413e0fc328ed4b1593c4e95b37dc172bf363e4aed76afddc02cb3ba4e8f2a92e"} Oct 12 06:01:30 crc kubenswrapper[4930]: I1012 06:01:30.526045 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="69fb1eeb-e151-4db3-951e-473de5e1b0a9" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://413e0fc328ed4b1593c4e95b37dc172bf363e4aed76afddc02cb3ba4e8f2a92e" gracePeriod=30 Oct 12 06:01:30 crc kubenswrapper[4930]: I1012 06:01:30.530238 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c04de04-91cf-4e56-85be-6a95e92f0d73","Type":"ContainerStarted","Data":"1b5ac694c6e912580320b240c2b12c5ce9262d5cb406ab4cacfb06a6d9403406"} Oct 12 06:01:30 crc kubenswrapper[4930]: I1012 06:01:30.534161 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4bdf4447-1912-4e75-b0bd-f32a424abb42","Type":"ContainerStarted","Data":"c9b504d45dc7e7cf9563e2fd2489a46c6eaa891b6b031ea4056aeb4730fc38e0"} Oct 12 06:01:30 crc kubenswrapper[4930]: I1012 06:01:30.534200 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4bdf4447-1912-4e75-b0bd-f32a424abb42" containerName="nova-metadata-log" containerID="cri-o://4a38ee783056f09234085d2d016cb27e4c30fa7ba34fcf5454176fc1839ae8db" gracePeriod=30 Oct 12 06:01:30 crc kubenswrapper[4930]: I1012 06:01:30.534291 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4bdf4447-1912-4e75-b0bd-f32a424abb42" containerName="nova-metadata-metadata" containerID="cri-o://c9b504d45dc7e7cf9563e2fd2489a46c6eaa891b6b031ea4056aeb4730fc38e0" gracePeriod=30 Oct 12 06:01:30 crc kubenswrapper[4930]: I1012 06:01:30.534208 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4bdf4447-1912-4e75-b0bd-f32a424abb42","Type":"ContainerStarted","Data":"4a38ee783056f09234085d2d016cb27e4c30fa7ba34fcf5454176fc1839ae8db"} Oct 12 06:01:30 crc kubenswrapper[4930]: I1012 06:01:30.538817 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3e95b56f-80e3-44ae-b28b-f5cca7559e36","Type":"ContainerStarted","Data":"76869c73a7da45fa8450435eb5fa0e30782564f5763b06957d4aca277f70d01f"} Oct 12 06:01:30 crc kubenswrapper[4930]: I1012 06:01:30.538858 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"3e95b56f-80e3-44ae-b28b-f5cca7559e36","Type":"ContainerStarted","Data":"4966612a69ce6e520c58e3c08429f711ad9f84f2c5fc5e33faf7901cd983e1a7"} Oct 12 06:01:30 crc kubenswrapper[4930]: I1012 06:01:30.551299 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.118596387 podStartE2EDuration="7.5512847s" podCreationTimestamp="2025-10-12 06:01:23 +0000 UTC" firstStartedPulling="2025-10-12 06:01:25.042802203 +0000 UTC m=+1217.584903968" lastFinishedPulling="2025-10-12 06:01:29.475490516 +0000 UTC m=+1222.017592281" observedRunningTime="2025-10-12 06:01:30.546554042 +0000 UTC m=+1223.088655807" watchObservedRunningTime="2025-10-12 06:01:30.5512847 +0000 UTC m=+1223.093386465" Oct 12 06:01:30 crc kubenswrapper[4930]: I1012 06:01:30.575413 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.712708601 podStartE2EDuration="7.57539683s" podCreationTimestamp="2025-10-12 06:01:23 +0000 UTC" firstStartedPulling="2025-10-12 06:01:24.622034387 +0000 UTC m=+1217.164136152" lastFinishedPulling="2025-10-12 06:01:29.484722616 +0000 UTC m=+1222.026824381" observedRunningTime="2025-10-12 06:01:30.574026426 +0000 UTC m=+1223.116128221" watchObservedRunningTime="2025-10-12 06:01:30.57539683 +0000 UTC m=+1223.117498595" Oct 12 06:01:30 crc kubenswrapper[4930]: I1012 06:01:30.596780 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.078301944 podStartE2EDuration="7.596761122s" podCreationTimestamp="2025-10-12 06:01:23 +0000 UTC" firstStartedPulling="2025-10-12 06:01:24.962066453 +0000 UTC m=+1217.504168218" lastFinishedPulling="2025-10-12 06:01:29.480525631 +0000 UTC m=+1222.022627396" observedRunningTime="2025-10-12 06:01:30.588488186 +0000 UTC m=+1223.130589951" watchObservedRunningTime="2025-10-12 06:01:30.596761122 +0000 UTC m=+1223.138862897" Oct 12 06:01:30 crc kubenswrapper[4930]: I1012 06:01:30.612332 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.826414813 podStartE2EDuration="7.612312699s" podCreationTimestamp="2025-10-12 06:01:23 +0000 UTC" firstStartedPulling="2025-10-12 06:01:24.693989549 +0000 UTC m=+1217.236091314" lastFinishedPulling="2025-10-12 06:01:29.479887435 +0000 UTC m=+1222.021989200" observedRunningTime="2025-10-12 06:01:30.605068039 +0000 UTC m=+1223.147169804" watchObservedRunningTime="2025-10-12 06:01:30.612312699 +0000 UTC m=+1223.154414464" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.149056 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.195139 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zr5q\" (UniqueName: \"kubernetes.io/projected/4bdf4447-1912-4e75-b0bd-f32a424abb42-kube-api-access-4zr5q\") pod \"4bdf4447-1912-4e75-b0bd-f32a424abb42\" (UID: \"4bdf4447-1912-4e75-b0bd-f32a424abb42\") " Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.195190 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bdf4447-1912-4e75-b0bd-f32a424abb42-logs\") pod \"4bdf4447-1912-4e75-b0bd-f32a424abb42\" (UID: \"4bdf4447-1912-4e75-b0bd-f32a424abb42\") " Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.195293 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bdf4447-1912-4e75-b0bd-f32a424abb42-combined-ca-bundle\") pod \"4bdf4447-1912-4e75-b0bd-f32a424abb42\" (UID: \"4bdf4447-1912-4e75-b0bd-f32a424abb42\") " Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.195397 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bdf4447-1912-4e75-b0bd-f32a424abb42-config-data\") pod \"4bdf4447-1912-4e75-b0bd-f32a424abb42\" (UID: \"4bdf4447-1912-4e75-b0bd-f32a424abb42\") " Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.196841 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bdf4447-1912-4e75-b0bd-f32a424abb42-logs" (OuterVolumeSpecName: "logs") pod "4bdf4447-1912-4e75-b0bd-f32a424abb42" (UID: "4bdf4447-1912-4e75-b0bd-f32a424abb42"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.200434 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bdf4447-1912-4e75-b0bd-f32a424abb42-kube-api-access-4zr5q" (OuterVolumeSpecName: "kube-api-access-4zr5q") pod "4bdf4447-1912-4e75-b0bd-f32a424abb42" (UID: "4bdf4447-1912-4e75-b0bd-f32a424abb42"). InnerVolumeSpecName "kube-api-access-4zr5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.223567 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bdf4447-1912-4e75-b0bd-f32a424abb42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bdf4447-1912-4e75-b0bd-f32a424abb42" (UID: "4bdf4447-1912-4e75-b0bd-f32a424abb42"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.224965 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bdf4447-1912-4e75-b0bd-f32a424abb42-config-data" (OuterVolumeSpecName: "config-data") pod "4bdf4447-1912-4e75-b0bd-f32a424abb42" (UID: "4bdf4447-1912-4e75-b0bd-f32a424abb42"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.297406 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bdf4447-1912-4e75-b0bd-f32a424abb42-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.297446 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zr5q\" (UniqueName: \"kubernetes.io/projected/4bdf4447-1912-4e75-b0bd-f32a424abb42-kube-api-access-4zr5q\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.297457 4930 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bdf4447-1912-4e75-b0bd-f32a424abb42-logs\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.297467 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bdf4447-1912-4e75-b0bd-f32a424abb42-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.551953 4930 generic.go:334] "Generic (PLEG): container finished" podID="4bdf4447-1912-4e75-b0bd-f32a424abb42" containerID="c9b504d45dc7e7cf9563e2fd2489a46c6eaa891b6b031ea4056aeb4730fc38e0" exitCode=0 Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.551983 4930 generic.go:334] "Generic (PLEG): container finished" podID="4bdf4447-1912-4e75-b0bd-f32a424abb42" containerID="4a38ee783056f09234085d2d016cb27e4c30fa7ba34fcf5454176fc1839ae8db" exitCode=143 Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.552842 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4bdf4447-1912-4e75-b0bd-f32a424abb42","Type":"ContainerDied","Data":"c9b504d45dc7e7cf9563e2fd2489a46c6eaa891b6b031ea4056aeb4730fc38e0"} Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.552872 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4bdf4447-1912-4e75-b0bd-f32a424abb42","Type":"ContainerDied","Data":"4a38ee783056f09234085d2d016cb27e4c30fa7ba34fcf5454176fc1839ae8db"} Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.552882 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4bdf4447-1912-4e75-b0bd-f32a424abb42","Type":"ContainerDied","Data":"33bf05497795143c9665c42efea77ae4f6e096cf6225469fcb76e239d7fba0d2"} Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.552900 4930 scope.go:117] "RemoveContainer" containerID="c9b504d45dc7e7cf9563e2fd2489a46c6eaa891b6b031ea4056aeb4730fc38e0" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.552845 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.607446 4930 scope.go:117] "RemoveContainer" containerID="4a38ee783056f09234085d2d016cb27e4c30fa7ba34fcf5454176fc1839ae8db" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.625171 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.636483 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.645937 4930 scope.go:117] "RemoveContainer" containerID="c9b504d45dc7e7cf9563e2fd2489a46c6eaa891b6b031ea4056aeb4730fc38e0" Oct 12 06:01:31 crc kubenswrapper[4930]: E1012 06:01:31.647269 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9b504d45dc7e7cf9563e2fd2489a46c6eaa891b6b031ea4056aeb4730fc38e0\": container with ID starting with c9b504d45dc7e7cf9563e2fd2489a46c6eaa891b6b031ea4056aeb4730fc38e0 not found: ID does not exist" containerID="c9b504d45dc7e7cf9563e2fd2489a46c6eaa891b6b031ea4056aeb4730fc38e0" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.647305 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9b504d45dc7e7cf9563e2fd2489a46c6eaa891b6b031ea4056aeb4730fc38e0"} err="failed to get container status \"c9b504d45dc7e7cf9563e2fd2489a46c6eaa891b6b031ea4056aeb4730fc38e0\": rpc error: code = NotFound desc = could not find container \"c9b504d45dc7e7cf9563e2fd2489a46c6eaa891b6b031ea4056aeb4730fc38e0\": container with ID starting with c9b504d45dc7e7cf9563e2fd2489a46c6eaa891b6b031ea4056aeb4730fc38e0 not found: ID does not exist" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.647330 4930 scope.go:117] "RemoveContainer" containerID="4a38ee783056f09234085d2d016cb27e4c30fa7ba34fcf5454176fc1839ae8db" Oct 12 06:01:31 crc kubenswrapper[4930]: E1012 06:01:31.647627 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a38ee783056f09234085d2d016cb27e4c30fa7ba34fcf5454176fc1839ae8db\": container with ID starting with 4a38ee783056f09234085d2d016cb27e4c30fa7ba34fcf5454176fc1839ae8db not found: ID does not exist" containerID="4a38ee783056f09234085d2d016cb27e4c30fa7ba34fcf5454176fc1839ae8db" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.647667 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a38ee783056f09234085d2d016cb27e4c30fa7ba34fcf5454176fc1839ae8db"} err="failed to get container status \"4a38ee783056f09234085d2d016cb27e4c30fa7ba34fcf5454176fc1839ae8db\": rpc error: code = NotFound desc = could not find container \"4a38ee783056f09234085d2d016cb27e4c30fa7ba34fcf5454176fc1839ae8db\": container with ID starting with 4a38ee783056f09234085d2d016cb27e4c30fa7ba34fcf5454176fc1839ae8db not found: ID does not exist" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.647831 4930 scope.go:117] "RemoveContainer" containerID="c9b504d45dc7e7cf9563e2fd2489a46c6eaa891b6b031ea4056aeb4730fc38e0" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.651835 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9b504d45dc7e7cf9563e2fd2489a46c6eaa891b6b031ea4056aeb4730fc38e0"} err="failed to get container status \"c9b504d45dc7e7cf9563e2fd2489a46c6eaa891b6b031ea4056aeb4730fc38e0\": rpc error: code = NotFound 
desc = could not find container \"c9b504d45dc7e7cf9563e2fd2489a46c6eaa891b6b031ea4056aeb4730fc38e0\": container with ID starting with c9b504d45dc7e7cf9563e2fd2489a46c6eaa891b6b031ea4056aeb4730fc38e0 not found: ID does not exist" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.651876 4930 scope.go:117] "RemoveContainer" containerID="4a38ee783056f09234085d2d016cb27e4c30fa7ba34fcf5454176fc1839ae8db" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.653536 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a38ee783056f09234085d2d016cb27e4c30fa7ba34fcf5454176fc1839ae8db"} err="failed to get container status \"4a38ee783056f09234085d2d016cb27e4c30fa7ba34fcf5454176fc1839ae8db\": rpc error: code = NotFound desc = could not find container \"4a38ee783056f09234085d2d016cb27e4c30fa7ba34fcf5454176fc1839ae8db\": container with ID starting with 4a38ee783056f09234085d2d016cb27e4c30fa7ba34fcf5454176fc1839ae8db not found: ID does not exist" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.665429 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 12 06:01:31 crc kubenswrapper[4930]: E1012 06:01:31.666399 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bdf4447-1912-4e75-b0bd-f32a424abb42" containerName="nova-metadata-log" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.666418 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bdf4447-1912-4e75-b0bd-f32a424abb42" containerName="nova-metadata-log" Oct 12 06:01:31 crc kubenswrapper[4930]: E1012 06:01:31.666453 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bdf4447-1912-4e75-b0bd-f32a424abb42" containerName="nova-metadata-metadata" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.666460 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bdf4447-1912-4e75-b0bd-f32a424abb42" containerName="nova-metadata-metadata" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.666642 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bdf4447-1912-4e75-b0bd-f32a424abb42" containerName="nova-metadata-log" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.666664 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bdf4447-1912-4e75-b0bd-f32a424abb42" containerName="nova-metadata-metadata" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.667882 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.672447 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.672721 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.680667 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.806534 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aca4b64e-a9a9-4e11-8bcb-64c930ba919f\") " pod="openstack/nova-metadata-0" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.806654 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"aca4b64e-a9a9-4e11-8bcb-64c930ba919f\") " pod="openstack/nova-metadata-0" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.806756 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qvx5\" (UniqueName: \"kubernetes.io/projected/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-kube-api-access-9qvx5\") pod \"nova-metadata-0\" (UID: \"aca4b64e-a9a9-4e11-8bcb-64c930ba919f\") " pod="openstack/nova-metadata-0" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.806787 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-logs\") pod \"nova-metadata-0\" (UID: \"aca4b64e-a9a9-4e11-8bcb-64c930ba919f\") " pod="openstack/nova-metadata-0" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.806818 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-config-data\") pod \"nova-metadata-0\" (UID: \"aca4b64e-a9a9-4e11-8bcb-64c930ba919f\") " pod="openstack/nova-metadata-0" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.908883 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-logs\") pod \"nova-metadata-0\" (UID: \"aca4b64e-a9a9-4e11-8bcb-64c930ba919f\") " pod="openstack/nova-metadata-0" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.908934 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-config-data\") pod \"nova-metadata-0\" (UID: \"aca4b64e-a9a9-4e11-8bcb-64c930ba919f\") " pod="openstack/nova-metadata-0" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.909039 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aca4b64e-a9a9-4e11-8bcb-64c930ba919f\") " pod="openstack/nova-metadata-0" Oct 12 06:01:31 crc 
kubenswrapper[4930]: I1012 06:01:31.909108 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"aca4b64e-a9a9-4e11-8bcb-64c930ba919f\") " pod="openstack/nova-metadata-0" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.909156 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qvx5\" (UniqueName: \"kubernetes.io/projected/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-kube-api-access-9qvx5\") pod \"nova-metadata-0\" (UID: \"aca4b64e-a9a9-4e11-8bcb-64c930ba919f\") " pod="openstack/nova-metadata-0" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.909474 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-logs\") pod \"nova-metadata-0\" (UID: \"aca4b64e-a9a9-4e11-8bcb-64c930ba919f\") " pod="openstack/nova-metadata-0" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.914358 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-config-data\") pod \"nova-metadata-0\" (UID: \"aca4b64e-a9a9-4e11-8bcb-64c930ba919f\") " pod="openstack/nova-metadata-0" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.916168 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"aca4b64e-a9a9-4e11-8bcb-64c930ba919f\") " pod="openstack/nova-metadata-0" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.927905 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aca4b64e-a9a9-4e11-8bcb-64c930ba919f\") " pod="openstack/nova-metadata-0" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.944040 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qvx5\" (UniqueName: \"kubernetes.io/projected/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-kube-api-access-9qvx5\") pod \"nova-metadata-0\" (UID: \"aca4b64e-a9a9-4e11-8bcb-64c930ba919f\") " pod="openstack/nova-metadata-0" Oct 12 06:01:31 crc kubenswrapper[4930]: I1012 06:01:31.997752 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 06:01:32 crc kubenswrapper[4930]: I1012 06:01:32.163082 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bdf4447-1912-4e75-b0bd-f32a424abb42" path="/var/lib/kubelet/pods/4bdf4447-1912-4e75-b0bd-f32a424abb42/volumes" Oct 12 06:01:32 crc kubenswrapper[4930]: I1012 06:01:32.564518 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c04de04-91cf-4e56-85be-6a95e92f0d73","Type":"ContainerStarted","Data":"7037d31dfb64aca5d14378c2494a85253062b17dd88418591c82c2dc00193a04"} Oct 12 06:01:32 crc kubenswrapper[4930]: I1012 06:01:32.565016 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 12 06:01:32 crc kubenswrapper[4930]: I1012 06:01:32.595097 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.706848224 podStartE2EDuration="8.595081146s" podCreationTimestamp="2025-10-12 06:01:24 +0000 UTC" firstStartedPulling="2025-10-12 06:01:25.523784588 +0000 UTC m=+1218.065886353" lastFinishedPulling="2025-10-12 06:01:31.41201751 +0000 UTC m=+1223.954119275" observedRunningTime="2025-10-12 06:01:32.590922182 +0000 UTC m=+1225.133023947" watchObservedRunningTime="2025-10-12 06:01:32.595081146 +0000 UTC m=+1225.137182911" Oct 12 06:01:32 crc kubenswrapper[4930]: I1012 06:01:32.639058 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 06:01:32 crc kubenswrapper[4930]: W1012 06:01:32.654935 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaca4b64e_a9a9_4e11_8bcb_64c930ba919f.slice/crio-79fdde229bbc8f5dde18a306e5496e4e5741d00e6d4a7795f83fa98c9f21cbe6 WatchSource:0}: Error finding container 79fdde229bbc8f5dde18a306e5496e4e5741d00e6d4a7795f83fa98c9f21cbe6: Status 404 returned error can't find the container with id 79fdde229bbc8f5dde18a306e5496e4e5741d00e6d4a7795f83fa98c9f21cbe6 Oct 12 06:01:33 crc kubenswrapper[4930]: I1012 06:01:33.575486 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aca4b64e-a9a9-4e11-8bcb-64c930ba919f","Type":"ContainerStarted","Data":"2aaf2021b3efb3a6186fcba0b71cda71be06500733a41ce1b2f83e3870ce43af"} Oct 12 06:01:33 crc kubenswrapper[4930]: I1012 06:01:33.575852 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aca4b64e-a9a9-4e11-8bcb-64c930ba919f","Type":"ContainerStarted","Data":"7b0f0184297a4cb5ff895744215361c33387c224a030d62d391a73d247726b19"} Oct 12 06:01:33 crc kubenswrapper[4930]: I1012 06:01:33.575868 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aca4b64e-a9a9-4e11-8bcb-64c930ba919f","Type":"ContainerStarted","Data":"79fdde229bbc8f5dde18a306e5496e4e5741d00e6d4a7795f83fa98c9f21cbe6"} Oct 12 06:01:33 crc kubenswrapper[4930]: I1012 06:01:33.602978 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.60295441 podStartE2EDuration="2.60295441s" podCreationTimestamp="2025-10-12 06:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:01:33.596384656 +0000 UTC m=+1226.138486461" watchObservedRunningTime="2025-10-12 06:01:33.60295441 +0000 UTC m=+1226.145056165" Oct 12 06:01:33 crc 
kubenswrapper[4930]: I1012 06:01:33.751832 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:01:33 crc kubenswrapper[4930]: I1012 06:01:33.924446 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 12 06:01:33 crc kubenswrapper[4930]: I1012 06:01:33.925703 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.074893 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.094180 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.094220 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.160859 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.171946 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75958fc765-rbf85"] Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.172348 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75958fc765-rbf85" podUID="0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3" containerName="dnsmasq-dns" containerID="cri-o://2f1f3c78bf7472b42b43be9c8798a04a99ab6f1b52b82784f82597baf1acf3c0" gracePeriod=10 Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.589193 4930 generic.go:334] "Generic (PLEG): container finished" podID="022414f2-80b4-4bc1-81e9-df49ebbaae8e" containerID="11587a9e8bb1e90a07706f5bca9e8051fea00fd934ade62a22227ec2378089af" exitCode=0 Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.589264 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fzfdt" event={"ID":"022414f2-80b4-4bc1-81e9-df49ebbaae8e","Type":"ContainerDied","Data":"11587a9e8bb1e90a07706f5bca9e8051fea00fd934ade62a22227ec2378089af"} Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.592707 4930 generic.go:334] "Generic (PLEG): container finished" podID="0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3" containerID="2f1f3c78bf7472b42b43be9c8798a04a99ab6f1b52b82784f82597baf1acf3c0" exitCode=0 Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.593569 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75958fc765-rbf85" event={"ID":"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3","Type":"ContainerDied","Data":"2f1f3c78bf7472b42b43be9c8798a04a99ab6f1b52b82784f82597baf1acf3c0"} Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.648719 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.690664 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75958fc765-rbf85" Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.774869 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-ovsdbserver-sb\") pod \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\" (UID: \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\") " Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.774993 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-dns-svc\") pod \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\" (UID: \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\") " Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.775041 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-config\") pod \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\" (UID: \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\") " Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.775079 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd9sh\" (UniqueName: \"kubernetes.io/projected/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-kube-api-access-xd9sh\") pod \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\" (UID: \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\") " Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.775107 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-dns-swift-storage-0\") pod \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\" (UID: \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\") " Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.775139 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-ovsdbserver-nb\") pod \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\" (UID: \"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3\") " Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.807881 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-kube-api-access-xd9sh" (OuterVolumeSpecName: "kube-api-access-xd9sh") pod "0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3" (UID: "0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3"). InnerVolumeSpecName "kube-api-access-xd9sh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.830377 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3" (UID: "0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.854423 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3" (UID: "0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.864110 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-config" (OuterVolumeSpecName: "config") pod "0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3" (UID: "0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.877153 4930 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.877188 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-config\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.877199 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd9sh\" (UniqueName: \"kubernetes.io/projected/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-kube-api-access-xd9sh\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.877209 4930 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.884720 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3" (UID: "0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.889862 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3" (UID: "0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.980615 4930 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:34 crc kubenswrapper[4930]: I1012 06:01:34.980867 4930 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:35 crc kubenswrapper[4930]: I1012 06:01:35.008900 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3e95b56f-80e3-44ae-b28b-f5cca7559e36" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.212:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 12 06:01:35 crc kubenswrapper[4930]: I1012 06:01:35.008919 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3e95b56f-80e3-44ae-b28b-f5cca7559e36" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.212:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 12 06:01:35 crc kubenswrapper[4930]: I1012 06:01:35.602711 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75958fc765-rbf85" event={"ID":"0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3","Type":"ContainerDied","Data":"721ba2e10bbe10d03f20cda1a6030e409e263e5fcd40f16789525d06be336d17"} Oct 12 06:01:35 crc kubenswrapper[4930]: I1012 06:01:35.602756 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75958fc765-rbf85" Oct 12 06:01:35 crc kubenswrapper[4930]: I1012 06:01:35.602790 4930 scope.go:117] "RemoveContainer" containerID="2f1f3c78bf7472b42b43be9c8798a04a99ab6f1b52b82784f82597baf1acf3c0" Oct 12 06:01:35 crc kubenswrapper[4930]: I1012 06:01:35.640546 4930 scope.go:117] "RemoveContainer" containerID="e9bc605ff3a8ee85dab30fa3131dfbafb181071c13d955a23d4dfcb804046adc" Oct 12 06:01:35 crc kubenswrapper[4930]: I1012 06:01:35.647590 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75958fc765-rbf85"] Oct 12 06:01:35 crc kubenswrapper[4930]: I1012 06:01:35.663332 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75958fc765-rbf85"] Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.073708 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fzfdt" Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.100841 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/022414f2-80b4-4bc1-81e9-df49ebbaae8e-config-data\") pod \"022414f2-80b4-4bc1-81e9-df49ebbaae8e\" (UID: \"022414f2-80b4-4bc1-81e9-df49ebbaae8e\") " Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.100981 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22zgt\" (UniqueName: \"kubernetes.io/projected/022414f2-80b4-4bc1-81e9-df49ebbaae8e-kube-api-access-22zgt\") pod \"022414f2-80b4-4bc1-81e9-df49ebbaae8e\" (UID: \"022414f2-80b4-4bc1-81e9-df49ebbaae8e\") " Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.101047 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/022414f2-80b4-4bc1-81e9-df49ebbaae8e-scripts\") pod \"022414f2-80b4-4bc1-81e9-df49ebbaae8e\" (UID: \"022414f2-80b4-4bc1-81e9-df49ebbaae8e\") " Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.101220 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/022414f2-80b4-4bc1-81e9-df49ebbaae8e-combined-ca-bundle\") pod \"022414f2-80b4-4bc1-81e9-df49ebbaae8e\" (UID: \"022414f2-80b4-4bc1-81e9-df49ebbaae8e\") " Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.108090 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/022414f2-80b4-4bc1-81e9-df49ebbaae8e-kube-api-access-22zgt" (OuterVolumeSpecName: "kube-api-access-22zgt") pod "022414f2-80b4-4bc1-81e9-df49ebbaae8e" (UID: "022414f2-80b4-4bc1-81e9-df49ebbaae8e"). InnerVolumeSpecName "kube-api-access-22zgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.120015 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/022414f2-80b4-4bc1-81e9-df49ebbaae8e-scripts" (OuterVolumeSpecName: "scripts") pod "022414f2-80b4-4bc1-81e9-df49ebbaae8e" (UID: "022414f2-80b4-4bc1-81e9-df49ebbaae8e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.152701 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/022414f2-80b4-4bc1-81e9-df49ebbaae8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "022414f2-80b4-4bc1-81e9-df49ebbaae8e" (UID: "022414f2-80b4-4bc1-81e9-df49ebbaae8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.159838 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3" path="/var/lib/kubelet/pods/0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3/volumes" Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.177817 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/022414f2-80b4-4bc1-81e9-df49ebbaae8e-config-data" (OuterVolumeSpecName: "config-data") pod "022414f2-80b4-4bc1-81e9-df49ebbaae8e" (UID: "022414f2-80b4-4bc1-81e9-df49ebbaae8e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.204077 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22zgt\" (UniqueName: \"kubernetes.io/projected/022414f2-80b4-4bc1-81e9-df49ebbaae8e-kube-api-access-22zgt\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.204108 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/022414f2-80b4-4bc1-81e9-df49ebbaae8e-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.204118 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/022414f2-80b4-4bc1-81e9-df49ebbaae8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.204126 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/022414f2-80b4-4bc1-81e9-df49ebbaae8e-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.612353 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fzfdt" event={"ID":"022414f2-80b4-4bc1-81e9-df49ebbaae8e","Type":"ContainerDied","Data":"726c1183404d2c5b0930f2caabee3ba82d1e3f9f4130dfbdcef130c84474ebba"} Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.612399 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="726c1183404d2c5b0930f2caabee3ba82d1e3f9f4130dfbdcef130c84474ebba" Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.612461 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fzfdt" Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.789399 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.789700 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3e95b56f-80e3-44ae-b28b-f5cca7559e36" containerName="nova-api-api" containerID="cri-o://76869c73a7da45fa8450435eb5fa0e30782564f5763b06957d4aca277f70d01f" gracePeriod=30 Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.789872 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3e95b56f-80e3-44ae-b28b-f5cca7559e36" containerName="nova-api-log" containerID="cri-o://4966612a69ce6e520c58e3c08429f711ad9f84f2c5fc5e33faf7901cd983e1a7" gracePeriod=30 Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.806619 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.806903 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="eb27a965-e24a-4e4a-8577-5abb8c38de00" containerName="nova-scheduler-scheduler" containerID="cri-o://1d2c46de744b137bc8ae99a8e9a02fb6382db83c9e938d507b984b06eda756b8" gracePeriod=30 Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.827712 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.829687 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="aca4b64e-a9a9-4e11-8bcb-64c930ba919f" 
containerName="nova-metadata-log" containerID="cri-o://7b0f0184297a4cb5ff895744215361c33387c224a030d62d391a73d247726b19" gracePeriod=30 Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.830155 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="aca4b64e-a9a9-4e11-8bcb-64c930ba919f" containerName="nova-metadata-metadata" containerID="cri-o://2aaf2021b3efb3a6186fcba0b71cda71be06500733a41ce1b2f83e3870ce43af" gracePeriod=30 Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.999588 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 12 06:01:36 crc kubenswrapper[4930]: I1012 06:01:36.999636 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.438218 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.527607 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-combined-ca-bundle\") pod \"aca4b64e-a9a9-4e11-8bcb-64c930ba919f\" (UID: \"aca4b64e-a9a9-4e11-8bcb-64c930ba919f\") " Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.527770 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-logs\") pod \"aca4b64e-a9a9-4e11-8bcb-64c930ba919f\" (UID: \"aca4b64e-a9a9-4e11-8bcb-64c930ba919f\") " Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.527799 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qvx5\" (UniqueName: \"kubernetes.io/projected/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-kube-api-access-9qvx5\") pod \"aca4b64e-a9a9-4e11-8bcb-64c930ba919f\" (UID: \"aca4b64e-a9a9-4e11-8bcb-64c930ba919f\") " Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.527848 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-nova-metadata-tls-certs\") pod \"aca4b64e-a9a9-4e11-8bcb-64c930ba919f\" (UID: \"aca4b64e-a9a9-4e11-8bcb-64c930ba919f\") " Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.527918 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-config-data\") pod \"aca4b64e-a9a9-4e11-8bcb-64c930ba919f\" (UID: \"aca4b64e-a9a9-4e11-8bcb-64c930ba919f\") " Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.529135 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-logs" (OuterVolumeSpecName: "logs") pod "aca4b64e-a9a9-4e11-8bcb-64c930ba919f" (UID: "aca4b64e-a9a9-4e11-8bcb-64c930ba919f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.534025 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-kube-api-access-9qvx5" (OuterVolumeSpecName: "kube-api-access-9qvx5") pod "aca4b64e-a9a9-4e11-8bcb-64c930ba919f" (UID: "aca4b64e-a9a9-4e11-8bcb-64c930ba919f"). 
InnerVolumeSpecName "kube-api-access-9qvx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.574195 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aca4b64e-a9a9-4e11-8bcb-64c930ba919f" (UID: "aca4b64e-a9a9-4e11-8bcb-64c930ba919f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.582547 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-config-data" (OuterVolumeSpecName: "config-data") pod "aca4b64e-a9a9-4e11-8bcb-64c930ba919f" (UID: "aca4b64e-a9a9-4e11-8bcb-64c930ba919f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.610797 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "aca4b64e-a9a9-4e11-8bcb-64c930ba919f" (UID: "aca4b64e-a9a9-4e11-8bcb-64c930ba919f"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.630078 4930 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-logs\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.630120 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qvx5\" (UniqueName: \"kubernetes.io/projected/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-kube-api-access-9qvx5\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.630137 4930 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.630150 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.630162 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca4b64e-a9a9-4e11-8bcb-64c930ba919f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.632161 4930 generic.go:334] "Generic (PLEG): container finished" podID="3e95b56f-80e3-44ae-b28b-f5cca7559e36" containerID="4966612a69ce6e520c58e3c08429f711ad9f84f2c5fc5e33faf7901cd983e1a7" exitCode=143 Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.632220 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3e95b56f-80e3-44ae-b28b-f5cca7559e36","Type":"ContainerDied","Data":"4966612a69ce6e520c58e3c08429f711ad9f84f2c5fc5e33faf7901cd983e1a7"} Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.633947 4930 generic.go:334] "Generic (PLEG): container finished" podID="25b0cb62-9d52-4ba1-9c00-3cda68c43da8" 
containerID="7c5c4a121c4d9f3987ec65112d4c46c15bea1770075f362a6949b4fd38d1788f" exitCode=0 Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.634001 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mqwg4" event={"ID":"25b0cb62-9d52-4ba1-9c00-3cda68c43da8","Type":"ContainerDied","Data":"7c5c4a121c4d9f3987ec65112d4c46c15bea1770075f362a6949b4fd38d1788f"} Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.648185 4930 generic.go:334] "Generic (PLEG): container finished" podID="aca4b64e-a9a9-4e11-8bcb-64c930ba919f" containerID="2aaf2021b3efb3a6186fcba0b71cda71be06500733a41ce1b2f83e3870ce43af" exitCode=0 Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.648216 4930 generic.go:334] "Generic (PLEG): container finished" podID="aca4b64e-a9a9-4e11-8bcb-64c930ba919f" containerID="7b0f0184297a4cb5ff895744215361c33387c224a030d62d391a73d247726b19" exitCode=143 Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.648237 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.648242 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aca4b64e-a9a9-4e11-8bcb-64c930ba919f","Type":"ContainerDied","Data":"2aaf2021b3efb3a6186fcba0b71cda71be06500733a41ce1b2f83e3870ce43af"} Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.648285 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aca4b64e-a9a9-4e11-8bcb-64c930ba919f","Type":"ContainerDied","Data":"7b0f0184297a4cb5ff895744215361c33387c224a030d62d391a73d247726b19"} Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.648297 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aca4b64e-a9a9-4e11-8bcb-64c930ba919f","Type":"ContainerDied","Data":"79fdde229bbc8f5dde18a306e5496e4e5741d00e6d4a7795f83fa98c9f21cbe6"} Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.648314 4930 scope.go:117] "RemoveContainer" containerID="2aaf2021b3efb3a6186fcba0b71cda71be06500733a41ce1b2f83e3870ce43af" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.702688 4930 scope.go:117] "RemoveContainer" containerID="7b0f0184297a4cb5ff895744215361c33387c224a030d62d391a73d247726b19" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.702744 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.710820 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.732691 4930 scope.go:117] "RemoveContainer" containerID="2aaf2021b3efb3a6186fcba0b71cda71be06500733a41ce1b2f83e3870ce43af" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.733078 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 12 06:01:37 crc kubenswrapper[4930]: E1012 06:01:37.733473 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3" containerName="init" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.733488 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3" containerName="init" Oct 12 06:01:37 crc kubenswrapper[4930]: E1012 06:01:37.733505 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="022414f2-80b4-4bc1-81e9-df49ebbaae8e" 
containerName="nova-manage" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.733512 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="022414f2-80b4-4bc1-81e9-df49ebbaae8e" containerName="nova-manage" Oct 12 06:01:37 crc kubenswrapper[4930]: E1012 06:01:37.733534 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3" containerName="dnsmasq-dns" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.733539 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3" containerName="dnsmasq-dns" Oct 12 06:01:37 crc kubenswrapper[4930]: E1012 06:01:37.733557 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aca4b64e-a9a9-4e11-8bcb-64c930ba919f" containerName="nova-metadata-metadata" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.733563 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="aca4b64e-a9a9-4e11-8bcb-64c930ba919f" containerName="nova-metadata-metadata" Oct 12 06:01:37 crc kubenswrapper[4930]: E1012 06:01:37.733624 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aca4b64e-a9a9-4e11-8bcb-64c930ba919f" containerName="nova-metadata-log" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.733633 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="aca4b64e-a9a9-4e11-8bcb-64c930ba919f" containerName="nova-metadata-log" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.734080 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="aca4b64e-a9a9-4e11-8bcb-64c930ba919f" containerName="nova-metadata-log" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.734109 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="022414f2-80b4-4bc1-81e9-df49ebbaae8e" containerName="nova-manage" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.734170 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d4012ea-ef03-4e8b-8c56-8fbb4b33efd3" containerName="dnsmasq-dns" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.734186 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="aca4b64e-a9a9-4e11-8bcb-64c930ba919f" containerName="nova-metadata-metadata" Oct 12 06:01:37 crc kubenswrapper[4930]: E1012 06:01:37.734860 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2aaf2021b3efb3a6186fcba0b71cda71be06500733a41ce1b2f83e3870ce43af\": container with ID starting with 2aaf2021b3efb3a6186fcba0b71cda71be06500733a41ce1b2f83e3870ce43af not found: ID does not exist" containerID="2aaf2021b3efb3a6186fcba0b71cda71be06500733a41ce1b2f83e3870ce43af" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.734994 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aaf2021b3efb3a6186fcba0b71cda71be06500733a41ce1b2f83e3870ce43af"} err="failed to get container status \"2aaf2021b3efb3a6186fcba0b71cda71be06500733a41ce1b2f83e3870ce43af\": rpc error: code = NotFound desc = could not find container \"2aaf2021b3efb3a6186fcba0b71cda71be06500733a41ce1b2f83e3870ce43af\": container with ID starting with 2aaf2021b3efb3a6186fcba0b71cda71be06500733a41ce1b2f83e3870ce43af not found: ID does not exist" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.735112 4930 scope.go:117] "RemoveContainer" containerID="7b0f0184297a4cb5ff895744215361c33387c224a030d62d391a73d247726b19" Oct 12 06:01:37 crc kubenswrapper[4930]: E1012 06:01:37.735573 4930 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b0f0184297a4cb5ff895744215361c33387c224a030d62d391a73d247726b19\": container with ID starting with 7b0f0184297a4cb5ff895744215361c33387c224a030d62d391a73d247726b19 not found: ID does not exist" containerID="7b0f0184297a4cb5ff895744215361c33387c224a030d62d391a73d247726b19" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.735600 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b0f0184297a4cb5ff895744215361c33387c224a030d62d391a73d247726b19"} err="failed to get container status \"7b0f0184297a4cb5ff895744215361c33387c224a030d62d391a73d247726b19\": rpc error: code = NotFound desc = could not find container \"7b0f0184297a4cb5ff895744215361c33387c224a030d62d391a73d247726b19\": container with ID starting with 7b0f0184297a4cb5ff895744215361c33387c224a030d62d391a73d247726b19 not found: ID does not exist" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.735614 4930 scope.go:117] "RemoveContainer" containerID="2aaf2021b3efb3a6186fcba0b71cda71be06500733a41ce1b2f83e3870ce43af" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.736019 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aaf2021b3efb3a6186fcba0b71cda71be06500733a41ce1b2f83e3870ce43af"} err="failed to get container status \"2aaf2021b3efb3a6186fcba0b71cda71be06500733a41ce1b2f83e3870ce43af\": rpc error: code = NotFound desc = could not find container \"2aaf2021b3efb3a6186fcba0b71cda71be06500733a41ce1b2f83e3870ce43af\": container with ID starting with 2aaf2021b3efb3a6186fcba0b71cda71be06500733a41ce1b2f83e3870ce43af not found: ID does not exist" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.736035 4930 scope.go:117] "RemoveContainer" containerID="7b0f0184297a4cb5ff895744215361c33387c224a030d62d391a73d247726b19" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.736439 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b0f0184297a4cb5ff895744215361c33387c224a030d62d391a73d247726b19"} err="failed to get container status \"7b0f0184297a4cb5ff895744215361c33387c224a030d62d391a73d247726b19\": rpc error: code = NotFound desc = could not find container \"7b0f0184297a4cb5ff895744215361c33387c224a030d62d391a73d247726b19\": container with ID starting with 7b0f0184297a4cb5ff895744215361c33387c224a030d62d391a73d247726b19 not found: ID does not exist" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.737768 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.743232 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.743470 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.767811 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.834306 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r4lh\" (UniqueName: \"kubernetes.io/projected/0fb6258b-6691-4a25-b22c-ad2a5a03b167-kube-api-access-4r4lh\") pod \"nova-metadata-0\" (UID: \"0fb6258b-6691-4a25-b22c-ad2a5a03b167\") " pod="openstack/nova-metadata-0" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.834452 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb6258b-6691-4a25-b22c-ad2a5a03b167-config-data\") pod \"nova-metadata-0\" (UID: \"0fb6258b-6691-4a25-b22c-ad2a5a03b167\") " pod="openstack/nova-metadata-0" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.834478 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb6258b-6691-4a25-b22c-ad2a5a03b167-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0fb6258b-6691-4a25-b22c-ad2a5a03b167\") " pod="openstack/nova-metadata-0" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.834530 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fb6258b-6691-4a25-b22c-ad2a5a03b167-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0fb6258b-6691-4a25-b22c-ad2a5a03b167\") " pod="openstack/nova-metadata-0" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.834556 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fb6258b-6691-4a25-b22c-ad2a5a03b167-logs\") pod \"nova-metadata-0\" (UID: \"0fb6258b-6691-4a25-b22c-ad2a5a03b167\") " pod="openstack/nova-metadata-0" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.937033 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb6258b-6691-4a25-b22c-ad2a5a03b167-config-data\") pod \"nova-metadata-0\" (UID: \"0fb6258b-6691-4a25-b22c-ad2a5a03b167\") " pod="openstack/nova-metadata-0" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.937149 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb6258b-6691-4a25-b22c-ad2a5a03b167-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0fb6258b-6691-4a25-b22c-ad2a5a03b167\") " pod="openstack/nova-metadata-0" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.937310 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fb6258b-6691-4a25-b22c-ad2a5a03b167-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0fb6258b-6691-4a25-b22c-ad2a5a03b167\") " 
pod="openstack/nova-metadata-0" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.937380 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fb6258b-6691-4a25-b22c-ad2a5a03b167-logs\") pod \"nova-metadata-0\" (UID: \"0fb6258b-6691-4a25-b22c-ad2a5a03b167\") " pod="openstack/nova-metadata-0" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.937429 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r4lh\" (UniqueName: \"kubernetes.io/projected/0fb6258b-6691-4a25-b22c-ad2a5a03b167-kube-api-access-4r4lh\") pod \"nova-metadata-0\" (UID: \"0fb6258b-6691-4a25-b22c-ad2a5a03b167\") " pod="openstack/nova-metadata-0" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.938505 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fb6258b-6691-4a25-b22c-ad2a5a03b167-logs\") pod \"nova-metadata-0\" (UID: \"0fb6258b-6691-4a25-b22c-ad2a5a03b167\") " pod="openstack/nova-metadata-0" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.942345 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fb6258b-6691-4a25-b22c-ad2a5a03b167-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0fb6258b-6691-4a25-b22c-ad2a5a03b167\") " pod="openstack/nova-metadata-0" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.942396 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb6258b-6691-4a25-b22c-ad2a5a03b167-config-data\") pod \"nova-metadata-0\" (UID: \"0fb6258b-6691-4a25-b22c-ad2a5a03b167\") " pod="openstack/nova-metadata-0" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.951531 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb6258b-6691-4a25-b22c-ad2a5a03b167-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0fb6258b-6691-4a25-b22c-ad2a5a03b167\") " pod="openstack/nova-metadata-0" Oct 12 06:01:37 crc kubenswrapper[4930]: I1012 06:01:37.960690 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r4lh\" (UniqueName: \"kubernetes.io/projected/0fb6258b-6691-4a25-b22c-ad2a5a03b167-kube-api-access-4r4lh\") pod \"nova-metadata-0\" (UID: \"0fb6258b-6691-4a25-b22c-ad2a5a03b167\") " pod="openstack/nova-metadata-0" Oct 12 06:01:38 crc kubenswrapper[4930]: I1012 06:01:38.080622 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 06:01:38 crc kubenswrapper[4930]: I1012 06:01:38.156487 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aca4b64e-a9a9-4e11-8bcb-64c930ba919f" path="/var/lib/kubelet/pods/aca4b64e-a9a9-4e11-8bcb-64c930ba919f/volumes" Oct 12 06:01:38 crc kubenswrapper[4930]: I1012 06:01:38.615649 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 06:01:38 crc kubenswrapper[4930]: I1012 06:01:38.683009 4930 generic.go:334] "Generic (PLEG): container finished" podID="eb27a965-e24a-4e4a-8577-5abb8c38de00" containerID="1d2c46de744b137bc8ae99a8e9a02fb6382db83c9e938d507b984b06eda756b8" exitCode=0 Oct 12 06:01:38 crc kubenswrapper[4930]: I1012 06:01:38.683090 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"eb27a965-e24a-4e4a-8577-5abb8c38de00","Type":"ContainerDied","Data":"1d2c46de744b137bc8ae99a8e9a02fb6382db83c9e938d507b984b06eda756b8"} Oct 12 06:01:38 crc kubenswrapper[4930]: I1012 06:01:38.686578 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0fb6258b-6691-4a25-b22c-ad2a5a03b167","Type":"ContainerStarted","Data":"cd56a96277fc6b37242b544ce8113391e28dfec41729f651da8e44b68c4e774f"} Oct 12 06:01:38 crc kubenswrapper[4930]: I1012 06:01:38.772689 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 06:01:38 crc kubenswrapper[4930]: I1012 06:01:38.884729 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn46c\" (UniqueName: \"kubernetes.io/projected/eb27a965-e24a-4e4a-8577-5abb8c38de00-kube-api-access-gn46c\") pod \"eb27a965-e24a-4e4a-8577-5abb8c38de00\" (UID: \"eb27a965-e24a-4e4a-8577-5abb8c38de00\") " Oct 12 06:01:38 crc kubenswrapper[4930]: I1012 06:01:38.884789 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb27a965-e24a-4e4a-8577-5abb8c38de00-combined-ca-bundle\") pod \"eb27a965-e24a-4e4a-8577-5abb8c38de00\" (UID: \"eb27a965-e24a-4e4a-8577-5abb8c38de00\") " Oct 12 06:01:38 crc kubenswrapper[4930]: I1012 06:01:38.884922 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb27a965-e24a-4e4a-8577-5abb8c38de00-config-data\") pod \"eb27a965-e24a-4e4a-8577-5abb8c38de00\" (UID: \"eb27a965-e24a-4e4a-8577-5abb8c38de00\") " Oct 12 06:01:38 crc kubenswrapper[4930]: I1012 06:01:38.958210 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb27a965-e24a-4e4a-8577-5abb8c38de00-kube-api-access-gn46c" (OuterVolumeSpecName: "kube-api-access-gn46c") pod "eb27a965-e24a-4e4a-8577-5abb8c38de00" (UID: "eb27a965-e24a-4e4a-8577-5abb8c38de00"). InnerVolumeSpecName "kube-api-access-gn46c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:01:38 crc kubenswrapper[4930]: I1012 06:01:38.987648 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn46c\" (UniqueName: \"kubernetes.io/projected/eb27a965-e24a-4e4a-8577-5abb8c38de00-kube-api-access-gn46c\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.042073 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb27a965-e24a-4e4a-8577-5abb8c38de00-config-data" (OuterVolumeSpecName: "config-data") pod "eb27a965-e24a-4e4a-8577-5abb8c38de00" (UID: "eb27a965-e24a-4e4a-8577-5abb8c38de00"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.047255 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mqwg4" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.050464 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb27a965-e24a-4e4a-8577-5abb8c38de00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb27a965-e24a-4e4a-8577-5abb8c38de00" (UID: "eb27a965-e24a-4e4a-8577-5abb8c38de00"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.088363 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b0cb62-9d52-4ba1-9c00-3cda68c43da8-combined-ca-bundle\") pod \"25b0cb62-9d52-4ba1-9c00-3cda68c43da8\" (UID: \"25b0cb62-9d52-4ba1-9c00-3cda68c43da8\") " Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.088481 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25b0cb62-9d52-4ba1-9c00-3cda68c43da8-config-data\") pod \"25b0cb62-9d52-4ba1-9c00-3cda68c43da8\" (UID: \"25b0cb62-9d52-4ba1-9c00-3cda68c43da8\") " Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.088547 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr7vr\" (UniqueName: \"kubernetes.io/projected/25b0cb62-9d52-4ba1-9c00-3cda68c43da8-kube-api-access-kr7vr\") pod \"25b0cb62-9d52-4ba1-9c00-3cda68c43da8\" (UID: \"25b0cb62-9d52-4ba1-9c00-3cda68c43da8\") " Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.088576 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25b0cb62-9d52-4ba1-9c00-3cda68c43da8-scripts\") pod \"25b0cb62-9d52-4ba1-9c00-3cda68c43da8\" (UID: \"25b0cb62-9d52-4ba1-9c00-3cda68c43da8\") " Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.089117 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb27a965-e24a-4e4a-8577-5abb8c38de00-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.089134 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb27a965-e24a-4e4a-8577-5abb8c38de00-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.093061 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/25b0cb62-9d52-4ba1-9c00-3cda68c43da8-scripts" (OuterVolumeSpecName: "scripts") pod "25b0cb62-9d52-4ba1-9c00-3cda68c43da8" (UID: "25b0cb62-9d52-4ba1-9c00-3cda68c43da8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.093899 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25b0cb62-9d52-4ba1-9c00-3cda68c43da8-kube-api-access-kr7vr" (OuterVolumeSpecName: "kube-api-access-kr7vr") pod "25b0cb62-9d52-4ba1-9c00-3cda68c43da8" (UID: "25b0cb62-9d52-4ba1-9c00-3cda68c43da8"). InnerVolumeSpecName "kube-api-access-kr7vr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.127870 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25b0cb62-9d52-4ba1-9c00-3cda68c43da8-config-data" (OuterVolumeSpecName: "config-data") pod "25b0cb62-9d52-4ba1-9c00-3cda68c43da8" (UID: "25b0cb62-9d52-4ba1-9c00-3cda68c43da8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.136698 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25b0cb62-9d52-4ba1-9c00-3cda68c43da8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25b0cb62-9d52-4ba1-9c00-3cda68c43da8" (UID: "25b0cb62-9d52-4ba1-9c00-3cda68c43da8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.190335 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b0cb62-9d52-4ba1-9c00-3cda68c43da8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.190519 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25b0cb62-9d52-4ba1-9c00-3cda68c43da8-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.190529 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr7vr\" (UniqueName: \"kubernetes.io/projected/25b0cb62-9d52-4ba1-9c00-3cda68c43da8-kube-api-access-kr7vr\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.190539 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25b0cb62-9d52-4ba1-9c00-3cda68c43da8-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.721310 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0fb6258b-6691-4a25-b22c-ad2a5a03b167","Type":"ContainerStarted","Data":"c829134520a8f961827096f58e320507bb1e104b0d830f6775dc1b17997cb132"} Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.721362 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0fb6258b-6691-4a25-b22c-ad2a5a03b167","Type":"ContainerStarted","Data":"3f46f37403747a9f51f09e17860efbd5c1d75c8431788d8e2976f9ca29614b40"} Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.735986 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mqwg4" 
event={"ID":"25b0cb62-9d52-4ba1-9c00-3cda68c43da8","Type":"ContainerDied","Data":"e9e93e30bf5d10cc71ff961bbba68fdd7eded580c4c3e7aef0e24f8951e1f0d4"} Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.736036 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9e93e30bf5d10cc71ff961bbba68fdd7eded580c4c3e7aef0e24f8951e1f0d4" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.736141 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mqwg4" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.747574 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"eb27a965-e24a-4e4a-8577-5abb8c38de00","Type":"ContainerDied","Data":"f8831add24f3ea797c2e9025a832cdaf8744db232d19f73dcdc44b41f3a587e5"} Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.747626 4930 scope.go:117] "RemoveContainer" containerID="1d2c46de744b137bc8ae99a8e9a02fb6382db83c9e938d507b984b06eda756b8" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.747770 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.751424 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 12 06:01:39 crc kubenswrapper[4930]: E1012 06:01:39.751921 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25b0cb62-9d52-4ba1-9c00-3cda68c43da8" containerName="nova-cell1-conductor-db-sync" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.751940 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b0cb62-9d52-4ba1-9c00-3cda68c43da8" containerName="nova-cell1-conductor-db-sync" Oct 12 06:01:39 crc kubenswrapper[4930]: E1012 06:01:39.751977 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb27a965-e24a-4e4a-8577-5abb8c38de00" containerName="nova-scheduler-scheduler" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.751986 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb27a965-e24a-4e4a-8577-5abb8c38de00" containerName="nova-scheduler-scheduler" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.752179 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="25b0cb62-9d52-4ba1-9c00-3cda68c43da8" containerName="nova-cell1-conductor-db-sync" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.752448 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb27a965-e24a-4e4a-8577-5abb8c38de00" containerName="nova-scheduler-scheduler" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.753345 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.755267 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.777240 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.777208333 podStartE2EDuration="2.777208333s" podCreationTimestamp="2025-10-12 06:01:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:01:39.766356603 +0000 UTC m=+1232.308458368" watchObservedRunningTime="2025-10-12 06:01:39.777208333 +0000 UTC m=+1232.319310118" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.800386 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.905070 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.915083 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6981a83-f891-4520-8602-a51b9132dbfa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e6981a83-f891-4520-8602-a51b9132dbfa\") " pod="openstack/nova-cell1-conductor-0" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.915225 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6981a83-f891-4520-8602-a51b9132dbfa-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e6981a83-f891-4520-8602-a51b9132dbfa\") " pod="openstack/nova-cell1-conductor-0" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.915416 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45nrr\" (UniqueName: \"kubernetes.io/projected/e6981a83-f891-4520-8602-a51b9132dbfa-kube-api-access-45nrr\") pod \"nova-cell1-conductor-0\" (UID: \"e6981a83-f891-4520-8602-a51b9132dbfa\") " pod="openstack/nova-cell1-conductor-0" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.917236 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.930863 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.932581 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.935793 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 12 06:01:39 crc kubenswrapper[4930]: I1012 06:01:39.937068 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.017607 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a446391-253a-4715-bdf8-709b778e76eb-config-data\") pod \"nova-scheduler-0\" (UID: \"7a446391-253a-4715-bdf8-709b778e76eb\") " pod="openstack/nova-scheduler-0" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.017686 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45nrr\" (UniqueName: \"kubernetes.io/projected/e6981a83-f891-4520-8602-a51b9132dbfa-kube-api-access-45nrr\") pod \"nova-cell1-conductor-0\" (UID: \"e6981a83-f891-4520-8602-a51b9132dbfa\") " pod="openstack/nova-cell1-conductor-0" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.017809 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6981a83-f891-4520-8602-a51b9132dbfa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e6981a83-f891-4520-8602-a51b9132dbfa\") " pod="openstack/nova-cell1-conductor-0" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.017849 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a446391-253a-4715-bdf8-709b778e76eb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7a446391-253a-4715-bdf8-709b778e76eb\") " pod="openstack/nova-scheduler-0" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.017879 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsmfc\" (UniqueName: \"kubernetes.io/projected/7a446391-253a-4715-bdf8-709b778e76eb-kube-api-access-jsmfc\") pod \"nova-scheduler-0\" (UID: \"7a446391-253a-4715-bdf8-709b778e76eb\") " pod="openstack/nova-scheduler-0" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.017928 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6981a83-f891-4520-8602-a51b9132dbfa-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e6981a83-f891-4520-8602-a51b9132dbfa\") " pod="openstack/nova-cell1-conductor-0" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.023674 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6981a83-f891-4520-8602-a51b9132dbfa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e6981a83-f891-4520-8602-a51b9132dbfa\") " pod="openstack/nova-cell1-conductor-0" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.039108 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6981a83-f891-4520-8602-a51b9132dbfa-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e6981a83-f891-4520-8602-a51b9132dbfa\") " pod="openstack/nova-cell1-conductor-0" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.052401 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-45nrr\" (UniqueName: \"kubernetes.io/projected/e6981a83-f891-4520-8602-a51b9132dbfa-kube-api-access-45nrr\") pod \"nova-cell1-conductor-0\" (UID: \"e6981a83-f891-4520-8602-a51b9132dbfa\") " pod="openstack/nova-cell1-conductor-0" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.119313 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a446391-253a-4715-bdf8-709b778e76eb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7a446391-253a-4715-bdf8-709b778e76eb\") " pod="openstack/nova-scheduler-0" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.119360 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsmfc\" (UniqueName: \"kubernetes.io/projected/7a446391-253a-4715-bdf8-709b778e76eb-kube-api-access-jsmfc\") pod \"nova-scheduler-0\" (UID: \"7a446391-253a-4715-bdf8-709b778e76eb\") " pod="openstack/nova-scheduler-0" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.119406 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a446391-253a-4715-bdf8-709b778e76eb-config-data\") pod \"nova-scheduler-0\" (UID: \"7a446391-253a-4715-bdf8-709b778e76eb\") " pod="openstack/nova-scheduler-0" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.125250 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a446391-253a-4715-bdf8-709b778e76eb-config-data\") pod \"nova-scheduler-0\" (UID: \"7a446391-253a-4715-bdf8-709b778e76eb\") " pod="openstack/nova-scheduler-0" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.125549 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a446391-253a-4715-bdf8-709b778e76eb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7a446391-253a-4715-bdf8-709b778e76eb\") " pod="openstack/nova-scheduler-0" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.146311 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsmfc\" (UniqueName: \"kubernetes.io/projected/7a446391-253a-4715-bdf8-709b778e76eb-kube-api-access-jsmfc\") pod \"nova-scheduler-0\" (UID: \"7a446391-253a-4715-bdf8-709b778e76eb\") " pod="openstack/nova-scheduler-0" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.146954 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb27a965-e24a-4e4a-8577-5abb8c38de00" path="/var/lib/kubelet/pods/eb27a965-e24a-4e4a-8577-5abb8c38de00/volumes" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.178088 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.184937 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.253164 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.327453 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e95b56f-80e3-44ae-b28b-f5cca7559e36-combined-ca-bundle\") pod \"3e95b56f-80e3-44ae-b28b-f5cca7559e36\" (UID: \"3e95b56f-80e3-44ae-b28b-f5cca7559e36\") " Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.327927 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s68t\" (UniqueName: \"kubernetes.io/projected/3e95b56f-80e3-44ae-b28b-f5cca7559e36-kube-api-access-8s68t\") pod \"3e95b56f-80e3-44ae-b28b-f5cca7559e36\" (UID: \"3e95b56f-80e3-44ae-b28b-f5cca7559e36\") " Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.328022 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e95b56f-80e3-44ae-b28b-f5cca7559e36-config-data\") pod \"3e95b56f-80e3-44ae-b28b-f5cca7559e36\" (UID: \"3e95b56f-80e3-44ae-b28b-f5cca7559e36\") " Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.328126 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e95b56f-80e3-44ae-b28b-f5cca7559e36-logs\") pod \"3e95b56f-80e3-44ae-b28b-f5cca7559e36\" (UID: \"3e95b56f-80e3-44ae-b28b-f5cca7559e36\") " Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.328895 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e95b56f-80e3-44ae-b28b-f5cca7559e36-logs" (OuterVolumeSpecName: "logs") pod "3e95b56f-80e3-44ae-b28b-f5cca7559e36" (UID: "3e95b56f-80e3-44ae-b28b-f5cca7559e36"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.331793 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e95b56f-80e3-44ae-b28b-f5cca7559e36-kube-api-access-8s68t" (OuterVolumeSpecName: "kube-api-access-8s68t") pod "3e95b56f-80e3-44ae-b28b-f5cca7559e36" (UID: "3e95b56f-80e3-44ae-b28b-f5cca7559e36"). InnerVolumeSpecName "kube-api-access-8s68t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.381362 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e95b56f-80e3-44ae-b28b-f5cca7559e36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e95b56f-80e3-44ae-b28b-f5cca7559e36" (UID: "3e95b56f-80e3-44ae-b28b-f5cca7559e36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.392434 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e95b56f-80e3-44ae-b28b-f5cca7559e36-config-data" (OuterVolumeSpecName: "config-data") pod "3e95b56f-80e3-44ae-b28b-f5cca7559e36" (UID: "3e95b56f-80e3-44ae-b28b-f5cca7559e36"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.430272 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s68t\" (UniqueName: \"kubernetes.io/projected/3e95b56f-80e3-44ae-b28b-f5cca7559e36-kube-api-access-8s68t\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.430869 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e95b56f-80e3-44ae-b28b-f5cca7559e36-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.430898 4930 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e95b56f-80e3-44ae-b28b-f5cca7559e36-logs\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.430912 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e95b56f-80e3-44ae-b28b-f5cca7559e36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.656481 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.761618 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e6981a83-f891-4520-8602-a51b9132dbfa","Type":"ContainerStarted","Data":"13c948964fde6c870c3f1bdb4a32928201e7034fbc57936912737213f015aa81"} Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.763577 4930 generic.go:334] "Generic (PLEG): container finished" podID="3e95b56f-80e3-44ae-b28b-f5cca7559e36" containerID="76869c73a7da45fa8450435eb5fa0e30782564f5763b06957d4aca277f70d01f" exitCode=0 Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.763650 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.763681 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3e95b56f-80e3-44ae-b28b-f5cca7559e36","Type":"ContainerDied","Data":"76869c73a7da45fa8450435eb5fa0e30782564f5763b06957d4aca277f70d01f"} Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.763760 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3e95b56f-80e3-44ae-b28b-f5cca7559e36","Type":"ContainerDied","Data":"1c48f36556bb35308e7ea8c47fe4ac7d0adb684f67c55272f194eee49b67132b"} Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.763784 4930 scope.go:117] "RemoveContainer" containerID="76869c73a7da45fa8450435eb5fa0e30782564f5763b06957d4aca277f70d01f" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.811075 4930 scope.go:117] "RemoveContainer" containerID="4966612a69ce6e520c58e3c08429f711ad9f84f2c5fc5e33faf7901cd983e1a7" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.821774 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.855592 4930 scope.go:117] "RemoveContainer" containerID="76869c73a7da45fa8450435eb5fa0e30782564f5763b06957d4aca277f70d01f" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.857342 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 12 06:01:40 crc kubenswrapper[4930]: E1012 06:01:40.874537 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76869c73a7da45fa8450435eb5fa0e30782564f5763b06957d4aca277f70d01f\": container with ID starting with 76869c73a7da45fa8450435eb5fa0e30782564f5763b06957d4aca277f70d01f not found: ID does not exist" containerID="76869c73a7da45fa8450435eb5fa0e30782564f5763b06957d4aca277f70d01f" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.874598 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76869c73a7da45fa8450435eb5fa0e30782564f5763b06957d4aca277f70d01f"} err="failed to get container status \"76869c73a7da45fa8450435eb5fa0e30782564f5763b06957d4aca277f70d01f\": rpc error: code = NotFound desc = could not find container \"76869c73a7da45fa8450435eb5fa0e30782564f5763b06957d4aca277f70d01f\": container with ID starting with 76869c73a7da45fa8450435eb5fa0e30782564f5763b06957d4aca277f70d01f not found: ID does not exist" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.880948 4930 scope.go:117] "RemoveContainer" containerID="4966612a69ce6e520c58e3c08429f711ad9f84f2c5fc5e33faf7901cd983e1a7" Oct 12 06:01:40 crc kubenswrapper[4930]: E1012 06:01:40.881934 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4966612a69ce6e520c58e3c08429f711ad9f84f2c5fc5e33faf7901cd983e1a7\": container with ID starting with 4966612a69ce6e520c58e3c08429f711ad9f84f2c5fc5e33faf7901cd983e1a7 not found: ID does not exist" containerID="4966612a69ce6e520c58e3c08429f711ad9f84f2c5fc5e33faf7901cd983e1a7" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.881990 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4966612a69ce6e520c58e3c08429f711ad9f84f2c5fc5e33faf7901cd983e1a7"} err="failed to get container status \"4966612a69ce6e520c58e3c08429f711ad9f84f2c5fc5e33faf7901cd983e1a7\": rpc error: code = NotFound desc = could 
not find container \"4966612a69ce6e520c58e3c08429f711ad9f84f2c5fc5e33faf7901cd983e1a7\": container with ID starting with 4966612a69ce6e520c58e3c08429f711ad9f84f2c5fc5e33faf7901cd983e1a7 not found: ID does not exist" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.883060 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.899712 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 12 06:01:40 crc kubenswrapper[4930]: E1012 06:01:40.900183 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e95b56f-80e3-44ae-b28b-f5cca7559e36" containerName="nova-api-api" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.900201 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e95b56f-80e3-44ae-b28b-f5cca7559e36" containerName="nova-api-api" Oct 12 06:01:40 crc kubenswrapper[4930]: E1012 06:01:40.900244 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e95b56f-80e3-44ae-b28b-f5cca7559e36" containerName="nova-api-log" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.900252 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e95b56f-80e3-44ae-b28b-f5cca7559e36" containerName="nova-api-log" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.900520 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e95b56f-80e3-44ae-b28b-f5cca7559e36" containerName="nova-api-log" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.900559 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e95b56f-80e3-44ae-b28b-f5cca7559e36" containerName="nova-api-api" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.901955 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.904176 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 12 06:01:40 crc kubenswrapper[4930]: I1012 06:01:40.910876 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 12 06:01:41 crc kubenswrapper[4930]: I1012 06:01:41.046932 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b08013-0d97-4592-b9ab-719b114a7d40-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"35b08013-0d97-4592-b9ab-719b114a7d40\") " pod="openstack/nova-api-0" Oct 12 06:01:41 crc kubenswrapper[4930]: I1012 06:01:41.047020 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcjcb\" (UniqueName: \"kubernetes.io/projected/35b08013-0d97-4592-b9ab-719b114a7d40-kube-api-access-mcjcb\") pod \"nova-api-0\" (UID: \"35b08013-0d97-4592-b9ab-719b114a7d40\") " pod="openstack/nova-api-0" Oct 12 06:01:41 crc kubenswrapper[4930]: I1012 06:01:41.047053 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35b08013-0d97-4592-b9ab-719b114a7d40-config-data\") pod \"nova-api-0\" (UID: \"35b08013-0d97-4592-b9ab-719b114a7d40\") " pod="openstack/nova-api-0" Oct 12 06:01:41 crc kubenswrapper[4930]: I1012 06:01:41.047109 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35b08013-0d97-4592-b9ab-719b114a7d40-logs\") pod \"nova-api-0\" (UID: \"35b08013-0d97-4592-b9ab-719b114a7d40\") " pod="openstack/nova-api-0" Oct 12 06:01:41 crc kubenswrapper[4930]: I1012 06:01:41.148819 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b08013-0d97-4592-b9ab-719b114a7d40-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"35b08013-0d97-4592-b9ab-719b114a7d40\") " pod="openstack/nova-api-0" Oct 12 06:01:41 crc kubenswrapper[4930]: I1012 06:01:41.148974 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcjcb\" (UniqueName: \"kubernetes.io/projected/35b08013-0d97-4592-b9ab-719b114a7d40-kube-api-access-mcjcb\") pod \"nova-api-0\" (UID: \"35b08013-0d97-4592-b9ab-719b114a7d40\") " pod="openstack/nova-api-0" Oct 12 06:01:41 crc kubenswrapper[4930]: I1012 06:01:41.149029 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35b08013-0d97-4592-b9ab-719b114a7d40-config-data\") pod \"nova-api-0\" (UID: \"35b08013-0d97-4592-b9ab-719b114a7d40\") " pod="openstack/nova-api-0" Oct 12 06:01:41 crc kubenswrapper[4930]: I1012 06:01:41.149105 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35b08013-0d97-4592-b9ab-719b114a7d40-logs\") pod \"nova-api-0\" (UID: \"35b08013-0d97-4592-b9ab-719b114a7d40\") " pod="openstack/nova-api-0" Oct 12 06:01:41 crc kubenswrapper[4930]: I1012 06:01:41.149722 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35b08013-0d97-4592-b9ab-719b114a7d40-logs\") pod \"nova-api-0\" (UID: \"35b08013-0d97-4592-b9ab-719b114a7d40\") " 
pod="openstack/nova-api-0" Oct 12 06:01:41 crc kubenswrapper[4930]: I1012 06:01:41.152454 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b08013-0d97-4592-b9ab-719b114a7d40-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"35b08013-0d97-4592-b9ab-719b114a7d40\") " pod="openstack/nova-api-0" Oct 12 06:01:41 crc kubenswrapper[4930]: I1012 06:01:41.153507 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35b08013-0d97-4592-b9ab-719b114a7d40-config-data\") pod \"nova-api-0\" (UID: \"35b08013-0d97-4592-b9ab-719b114a7d40\") " pod="openstack/nova-api-0" Oct 12 06:01:41 crc kubenswrapper[4930]: I1012 06:01:41.170392 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcjcb\" (UniqueName: \"kubernetes.io/projected/35b08013-0d97-4592-b9ab-719b114a7d40-kube-api-access-mcjcb\") pod \"nova-api-0\" (UID: \"35b08013-0d97-4592-b9ab-719b114a7d40\") " pod="openstack/nova-api-0" Oct 12 06:01:41 crc kubenswrapper[4930]: I1012 06:01:41.216297 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 12 06:01:41 crc kubenswrapper[4930]: I1012 06:01:41.775324 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e6981a83-f891-4520-8602-a51b9132dbfa","Type":"ContainerStarted","Data":"96e7cb74f6671896cb683e93f3cc5774ed1573006a36a8d4af638b029f334d42"} Oct 12 06:01:41 crc kubenswrapper[4930]: I1012 06:01:41.775666 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 12 06:01:41 crc kubenswrapper[4930]: I1012 06:01:41.778323 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7a446391-253a-4715-bdf8-709b778e76eb","Type":"ContainerStarted","Data":"d42d56e1d7e3a0415895fd9f006c566a38f260bb3157b826f2b9b73cadc99a04"} Oct 12 06:01:41 crc kubenswrapper[4930]: I1012 06:01:41.778373 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7a446391-253a-4715-bdf8-709b778e76eb","Type":"ContainerStarted","Data":"5e636ee956dcad70db6db0954ab08f45ba462bfce8aeecbe70164c482f57c25f"} Oct 12 06:01:41 crc kubenswrapper[4930]: I1012 06:01:41.797848 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.797819299 podStartE2EDuration="2.797819299s" podCreationTimestamp="2025-10-12 06:01:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:01:41.792302521 +0000 UTC m=+1234.334404326" watchObservedRunningTime="2025-10-12 06:01:41.797819299 +0000 UTC m=+1234.339921094" Oct 12 06:01:41 crc kubenswrapper[4930]: I1012 06:01:41.819179 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.819158049 podStartE2EDuration="2.819158049s" podCreationTimestamp="2025-10-12 06:01:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:01:41.812886883 +0000 UTC m=+1234.354988648" watchObservedRunningTime="2025-10-12 06:01:41.819158049 +0000 UTC m=+1234.361259834" Oct 12 06:01:41 crc kubenswrapper[4930]: I1012 06:01:41.839214 4930 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-api-0"] Oct 12 06:01:42 crc kubenswrapper[4930]: I1012 06:01:42.148044 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e95b56f-80e3-44ae-b28b-f5cca7559e36" path="/var/lib/kubelet/pods/3e95b56f-80e3-44ae-b28b-f5cca7559e36/volumes" Oct 12 06:01:42 crc kubenswrapper[4930]: I1012 06:01:42.799934 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35b08013-0d97-4592-b9ab-719b114a7d40","Type":"ContainerStarted","Data":"d61363c5fd4bf2976d83ab66b183851af2604054624d67858c46fbb89af0e6d8"} Oct 12 06:01:42 crc kubenswrapper[4930]: I1012 06:01:42.800276 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35b08013-0d97-4592-b9ab-719b114a7d40","Type":"ContainerStarted","Data":"9d0d9c8fd3567d311b752575ae1cc2471fc35f9607e5dc17a24ef7061ea4b275"} Oct 12 06:01:42 crc kubenswrapper[4930]: I1012 06:01:42.800295 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35b08013-0d97-4592-b9ab-719b114a7d40","Type":"ContainerStarted","Data":"9dc1ee23efdbbb0d8207f3b6e99aeb6e864160988cdf068da4b0bb171e132ff8"} Oct 12 06:01:42 crc kubenswrapper[4930]: I1012 06:01:42.829964 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.8299458079999997 podStartE2EDuration="2.829945808s" podCreationTimestamp="2025-10-12 06:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:01:42.826902972 +0000 UTC m=+1235.369004757" watchObservedRunningTime="2025-10-12 06:01:42.829945808 +0000 UTC m=+1235.372047583" Oct 12 06:01:43 crc kubenswrapper[4930]: I1012 06:01:43.081653 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 12 06:01:43 crc kubenswrapper[4930]: I1012 06:01:43.081702 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 12 06:01:45 crc kubenswrapper[4930]: I1012 06:01:45.229861 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 12 06:01:45 crc kubenswrapper[4930]: I1012 06:01:45.253802 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 12 06:01:48 crc kubenswrapper[4930]: I1012 06:01:48.081232 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 12 06:01:48 crc kubenswrapper[4930]: I1012 06:01:48.081585 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 12 06:01:49 crc kubenswrapper[4930]: I1012 06:01:49.096863 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0fb6258b-6691-4a25-b22c-ad2a5a03b167" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.218:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 12 06:01:49 crc kubenswrapper[4930]: I1012 06:01:49.096895 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0fb6258b-6691-4a25-b22c-ad2a5a03b167" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.218:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 12 06:01:50 crc kubenswrapper[4930]: I1012 
06:01:50.254092 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 12 06:01:50 crc kubenswrapper[4930]: I1012 06:01:50.316802 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 12 06:01:50 crc kubenswrapper[4930]: I1012 06:01:50.970871 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 12 06:01:51 crc kubenswrapper[4930]: I1012 06:01:51.217424 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 12 06:01:51 crc kubenswrapper[4930]: I1012 06:01:51.217800 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 12 06:01:52 crc kubenswrapper[4930]: I1012 06:01:52.300143 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="35b08013-0d97-4592-b9ab-719b114a7d40" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.221:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 12 06:01:52 crc kubenswrapper[4930]: I1012 06:01:52.300412 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="35b08013-0d97-4592-b9ab-719b114a7d40" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.221:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 12 06:01:54 crc kubenswrapper[4930]: I1012 06:01:54.825231 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 12 06:01:58 crc kubenswrapper[4930]: I1012 06:01:58.086488 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 12 06:01:58 crc kubenswrapper[4930]: I1012 06:01:58.090842 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 12 06:01:58 crc kubenswrapper[4930]: I1012 06:01:58.092598 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 12 06:01:59 crc kubenswrapper[4930]: I1012 06:01:59.028521 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 12 06:02:01 crc kubenswrapper[4930]: I1012 06:02:01.061082 4930 generic.go:334] "Generic (PLEG): container finished" podID="69fb1eeb-e151-4db3-951e-473de5e1b0a9" containerID="413e0fc328ed4b1593c4e95b37dc172bf363e4aed76afddc02cb3ba4e8f2a92e" exitCode=137 Oct 12 06:02:01 crc kubenswrapper[4930]: I1012 06:02:01.061148 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"69fb1eeb-e151-4db3-951e-473de5e1b0a9","Type":"ContainerDied","Data":"413e0fc328ed4b1593c4e95b37dc172bf363e4aed76afddc02cb3ba4e8f2a92e"} Oct 12 06:02:01 crc kubenswrapper[4930]: I1012 06:02:01.065276 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"69fb1eeb-e151-4db3-951e-473de5e1b0a9","Type":"ContainerDied","Data":"71967ec97c8bbf32106f045002b326194efebc1b63418504a1702b9f8145385b"} Oct 12 06:02:01 crc kubenswrapper[4930]: I1012 06:02:01.065294 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71967ec97c8bbf32106f045002b326194efebc1b63418504a1702b9f8145385b" Oct 12 06:02:01 crc kubenswrapper[4930]: I1012 06:02:01.067706 4930 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:02:01 crc kubenswrapper[4930]: I1012 06:02:01.186282 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69fb1eeb-e151-4db3-951e-473de5e1b0a9-combined-ca-bundle\") pod \"69fb1eeb-e151-4db3-951e-473de5e1b0a9\" (UID: \"69fb1eeb-e151-4db3-951e-473de5e1b0a9\") " Oct 12 06:02:01 crc kubenswrapper[4930]: I1012 06:02:01.186514 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69fb1eeb-e151-4db3-951e-473de5e1b0a9-config-data\") pod \"69fb1eeb-e151-4db3-951e-473de5e1b0a9\" (UID: \"69fb1eeb-e151-4db3-951e-473de5e1b0a9\") " Oct 12 06:02:01 crc kubenswrapper[4930]: I1012 06:02:01.186577 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxzrq\" (UniqueName: \"kubernetes.io/projected/69fb1eeb-e151-4db3-951e-473de5e1b0a9-kube-api-access-kxzrq\") pod \"69fb1eeb-e151-4db3-951e-473de5e1b0a9\" (UID: \"69fb1eeb-e151-4db3-951e-473de5e1b0a9\") " Oct 12 06:02:01 crc kubenswrapper[4930]: I1012 06:02:01.196420 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69fb1eeb-e151-4db3-951e-473de5e1b0a9-kube-api-access-kxzrq" (OuterVolumeSpecName: "kube-api-access-kxzrq") pod "69fb1eeb-e151-4db3-951e-473de5e1b0a9" (UID: "69fb1eeb-e151-4db3-951e-473de5e1b0a9"). InnerVolumeSpecName "kube-api-access-kxzrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:02:01 crc kubenswrapper[4930]: I1012 06:02:01.224337 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 12 06:02:01 crc kubenswrapper[4930]: I1012 06:02:01.224873 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 12 06:02:01 crc kubenswrapper[4930]: I1012 06:02:01.234439 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69fb1eeb-e151-4db3-951e-473de5e1b0a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69fb1eeb-e151-4db3-951e-473de5e1b0a9" (UID: "69fb1eeb-e151-4db3-951e-473de5e1b0a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:02:01 crc kubenswrapper[4930]: I1012 06:02:01.251127 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 12 06:02:01 crc kubenswrapper[4930]: I1012 06:02:01.253044 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69fb1eeb-e151-4db3-951e-473de5e1b0a9-config-data" (OuterVolumeSpecName: "config-data") pod "69fb1eeb-e151-4db3-951e-473de5e1b0a9" (UID: "69fb1eeb-e151-4db3-951e-473de5e1b0a9"). InnerVolumeSpecName "config-data". 
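[Annotation] The startup-probe failures logged above ("Client.Timeout exceeded while awaiting headers" against 10.217.0.218:8775 and 10.217.0.221:8774) are the kubelet's HTTP prober giving up after its per-probe timeout, which defaults to 1 second. A minimal Go sketch reproducing the exact error string, assuming only a handler slower than the client timeout:

    package main

    import (
        "fmt"
        "net/http"
        "net/http/httptest"
        "time"
    )

    func main() {
        // A handler that answers more slowly than the prober is willing to
        // wait, like nova-api right after its pod was recreated.
        slow := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
            time.Sleep(2 * time.Second)
        }))
        defer slow.Close()

        client := &http.Client{Timeout: 1 * time.Second} // probe timeoutSeconds: 1
        _, err := client.Get(slow.URL)
        fmt.Println(err)
        // Get "http://127.0.0.1:...": context deadline exceeded
        // (Client.Timeout exceeded while awaiting headers)
    }

The probes flip to status="started" a few entries later once the services finish initializing, so these failures are transient rather than fatal.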
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:02:01 crc kubenswrapper[4930]: I1012 06:02:01.253210 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 12 06:02:01 crc kubenswrapper[4930]: I1012 06:02:01.289023 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69fb1eeb-e151-4db3-951e-473de5e1b0a9-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:01 crc kubenswrapper[4930]: I1012 06:02:01.289402 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxzrq\" (UniqueName: \"kubernetes.io/projected/69fb1eeb-e151-4db3-951e-473de5e1b0a9-kube-api-access-kxzrq\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:01 crc kubenswrapper[4930]: I1012 06:02:01.289416 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69fb1eeb-e151-4db3-951e-473de5e1b0a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.077799 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.078769 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.093376 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.165436 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.185199 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.200288 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 12 06:02:02 crc kubenswrapper[4930]: E1012 06:02:02.200752 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69fb1eeb-e151-4db3-951e-473de5e1b0a9" containerName="nova-cell1-novncproxy-novncproxy" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.200771 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="69fb1eeb-e151-4db3-951e-473de5e1b0a9" containerName="nova-cell1-novncproxy-novncproxy" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.200986 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="69fb1eeb-e151-4db3-951e-473de5e1b0a9" containerName="nova-cell1-novncproxy-novncproxy" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.201766 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.207564 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.207787 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.207846 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.224055 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.309567 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e40e826-773b-46e2-aa5d-d1efe925bf9f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e40e826-773b-46e2-aa5d-d1efe925bf9f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.310017 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e40e826-773b-46e2-aa5d-d1efe925bf9f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e40e826-773b-46e2-aa5d-d1efe925bf9f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.310372 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e40e826-773b-46e2-aa5d-d1efe925bf9f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e40e826-773b-46e2-aa5d-d1efe925bf9f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.310545 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e40e826-773b-46e2-aa5d-d1efe925bf9f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e40e826-773b-46e2-aa5d-d1efe925bf9f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.310582 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lcnd\" (UniqueName: \"kubernetes.io/projected/3e40e826-773b-46e2-aa5d-d1efe925bf9f-kube-api-access-8lcnd\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e40e826-773b-46e2-aa5d-d1efe925bf9f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.343321 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54599d8f7-xv8dr"] Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.345057 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.355116 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54599d8f7-xv8dr"] Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.414228 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e40e826-773b-46e2-aa5d-d1efe925bf9f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e40e826-773b-46e2-aa5d-d1efe925bf9f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.414330 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e40e826-773b-46e2-aa5d-d1efe925bf9f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e40e826-773b-46e2-aa5d-d1efe925bf9f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.414384 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e40e826-773b-46e2-aa5d-d1efe925bf9f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e40e826-773b-46e2-aa5d-d1efe925bf9f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.414407 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lcnd\" (UniqueName: \"kubernetes.io/projected/3e40e826-773b-46e2-aa5d-d1efe925bf9f-kube-api-access-8lcnd\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e40e826-773b-46e2-aa5d-d1efe925bf9f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.414446 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e40e826-773b-46e2-aa5d-d1efe925bf9f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e40e826-773b-46e2-aa5d-d1efe925bf9f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.426161 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e40e826-773b-46e2-aa5d-d1efe925bf9f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e40e826-773b-46e2-aa5d-d1efe925bf9f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.426301 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e40e826-773b-46e2-aa5d-d1efe925bf9f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e40e826-773b-46e2-aa5d-d1efe925bf9f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.426705 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e40e826-773b-46e2-aa5d-d1efe925bf9f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e40e826-773b-46e2-aa5d-d1efe925bf9f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.426786 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e40e826-773b-46e2-aa5d-d1efe925bf9f-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"3e40e826-773b-46e2-aa5d-d1efe925bf9f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.442474 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lcnd\" (UniqueName: \"kubernetes.io/projected/3e40e826-773b-46e2-aa5d-d1efe925bf9f-kube-api-access-8lcnd\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e40e826-773b-46e2-aa5d-d1efe925bf9f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.516003 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-ovsdbserver-nb\") pod \"dnsmasq-dns-54599d8f7-xv8dr\" (UID: \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\") " pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.516068 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-config\") pod \"dnsmasq-dns-54599d8f7-xv8dr\" (UID: \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\") " pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.516088 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gvlh\" (UniqueName: \"kubernetes.io/projected/57ae0cc6-55ee-4117-8d32-e82672b2ea46-kube-api-access-8gvlh\") pod \"dnsmasq-dns-54599d8f7-xv8dr\" (UID: \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\") " pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.516234 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-ovsdbserver-sb\") pod \"dnsmasq-dns-54599d8f7-xv8dr\" (UID: \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\") " pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.516283 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-dns-swift-storage-0\") pod \"dnsmasq-dns-54599d8f7-xv8dr\" (UID: \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\") " pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.516319 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-dns-svc\") pod \"dnsmasq-dns-54599d8f7-xv8dr\" (UID: \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\") " pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.540365 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.617669 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-ovsdbserver-nb\") pod \"dnsmasq-dns-54599d8f7-xv8dr\" (UID: \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\") " pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.617750 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-config\") pod \"dnsmasq-dns-54599d8f7-xv8dr\" (UID: \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\") " pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.617771 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gvlh\" (UniqueName: \"kubernetes.io/projected/57ae0cc6-55ee-4117-8d32-e82672b2ea46-kube-api-access-8gvlh\") pod \"dnsmasq-dns-54599d8f7-xv8dr\" (UID: \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\") " pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.617832 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-ovsdbserver-sb\") pod \"dnsmasq-dns-54599d8f7-xv8dr\" (UID: \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\") " pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.617877 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-dns-swift-storage-0\") pod \"dnsmasq-dns-54599d8f7-xv8dr\" (UID: \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\") " pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.617913 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-dns-svc\") pod \"dnsmasq-dns-54599d8f7-xv8dr\" (UID: \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\") " pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.618720 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-dns-svc\") pod \"dnsmasq-dns-54599d8f7-xv8dr\" (UID: \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\") " pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.618777 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-config\") pod \"dnsmasq-dns-54599d8f7-xv8dr\" (UID: \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\") " pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.618964 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-dns-swift-storage-0\") pod \"dnsmasq-dns-54599d8f7-xv8dr\" (UID: \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\") " pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 
06:02:02.618720 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-ovsdbserver-nb\") pod \"dnsmasq-dns-54599d8f7-xv8dr\" (UID: \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\") " pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.619548 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-ovsdbserver-sb\") pod \"dnsmasq-dns-54599d8f7-xv8dr\" (UID: \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\") " pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.638047 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gvlh\" (UniqueName: \"kubernetes.io/projected/57ae0cc6-55ee-4117-8d32-e82672b2ea46-kube-api-access-8gvlh\") pod \"dnsmasq-dns-54599d8f7-xv8dr\" (UID: \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\") " pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" Oct 12 06:02:02 crc kubenswrapper[4930]: I1012 06:02:02.665996 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" Oct 12 06:02:03 crc kubenswrapper[4930]: W1012 06:02:03.005689 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e40e826_773b_46e2_aa5d_d1efe925bf9f.slice/crio-c0e04069eae1a4822baa279ed1b4e3c6ab7af6946234ddbb859324e8d13b92f1 WatchSource:0}: Error finding container c0e04069eae1a4822baa279ed1b4e3c6ab7af6946234ddbb859324e8d13b92f1: Status 404 returned error can't find the container with id c0e04069eae1a4822baa279ed1b4e3c6ab7af6946234ddbb859324e8d13b92f1 Oct 12 06:02:03 crc kubenswrapper[4930]: I1012 06:02:03.008799 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 12 06:02:03 crc kubenswrapper[4930]: I1012 06:02:03.088637 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3e40e826-773b-46e2-aa5d-d1efe925bf9f","Type":"ContainerStarted","Data":"c0e04069eae1a4822baa279ed1b4e3c6ab7af6946234ddbb859324e8d13b92f1"} Oct 12 06:02:03 crc kubenswrapper[4930]: I1012 06:02:03.175528 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54599d8f7-xv8dr"] Oct 12 06:02:03 crc kubenswrapper[4930]: W1012 06:02:03.183093 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57ae0cc6_55ee_4117_8d32_e82672b2ea46.slice/crio-487a561894a95fc1abee995a6090193bbae294233ea07dd5afece6d49d553e6b WatchSource:0}: Error finding container 487a561894a95fc1abee995a6090193bbae294233ea07dd5afece6d49d553e6b: Status 404 returned error can't find the container with id 487a561894a95fc1abee995a6090193bbae294233ea07dd5afece6d49d553e6b Oct 12 06:02:04 crc kubenswrapper[4930]: I1012 06:02:04.098966 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3e40e826-773b-46e2-aa5d-d1efe925bf9f","Type":"ContainerStarted","Data":"de7577b8048f8a4d833e584f269182f0b109ed29c1cb3fde976094aeb93f17fe"} Oct 12 06:02:04 crc kubenswrapper[4930]: I1012 06:02:04.100645 4930 generic.go:334] "Generic (PLEG): container finished" podID="57ae0cc6-55ee-4117-8d32-e82672b2ea46" 
containerID="014783dc84fc370fcb7fb3501ebc3ae96744624044bc2b495d7cd6520b66b768" exitCode=0 Oct 12 06:02:04 crc kubenswrapper[4930]: I1012 06:02:04.100719 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" event={"ID":"57ae0cc6-55ee-4117-8d32-e82672b2ea46","Type":"ContainerDied","Data":"014783dc84fc370fcb7fb3501ebc3ae96744624044bc2b495d7cd6520b66b768"} Oct 12 06:02:04 crc kubenswrapper[4930]: I1012 06:02:04.100815 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" event={"ID":"57ae0cc6-55ee-4117-8d32-e82672b2ea46","Type":"ContainerStarted","Data":"487a561894a95fc1abee995a6090193bbae294233ea07dd5afece6d49d553e6b"} Oct 12 06:02:04 crc kubenswrapper[4930]: I1012 06:02:04.116531 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.11650947 podStartE2EDuration="2.11650947s" podCreationTimestamp="2025-10-12 06:02:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:02:04.114187412 +0000 UTC m=+1256.656289217" watchObservedRunningTime="2025-10-12 06:02:04.11650947 +0000 UTC m=+1256.658611265" Oct 12 06:02:04 crc kubenswrapper[4930]: I1012 06:02:04.153595 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69fb1eeb-e151-4db3-951e-473de5e1b0a9" path="/var/lib/kubelet/pods/69fb1eeb-e151-4db3-951e-473de5e1b0a9/volumes" Oct 12 06:02:04 crc kubenswrapper[4930]: I1012 06:02:04.392896 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:02:04 crc kubenswrapper[4930]: I1012 06:02:04.393401 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2c04de04-91cf-4e56-85be-6a95e92f0d73" containerName="ceilometer-central-agent" containerID="cri-o://45c450a08e3387e612e626b73f1913a4bfe0b896d87b1faf8bc9ea2296001eff" gracePeriod=30 Oct 12 06:02:04 crc kubenswrapper[4930]: I1012 06:02:04.393772 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2c04de04-91cf-4e56-85be-6a95e92f0d73" containerName="proxy-httpd" containerID="cri-o://7037d31dfb64aca5d14378c2494a85253062b17dd88418591c82c2dc00193a04" gracePeriod=30 Oct 12 06:02:04 crc kubenswrapper[4930]: I1012 06:02:04.393875 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2c04de04-91cf-4e56-85be-6a95e92f0d73" containerName="ceilometer-notification-agent" containerID="cri-o://1945b8ffd802b593294d77243e90298b6713eed95da1501888c5fe65d73f1636" gracePeriod=30 Oct 12 06:02:04 crc kubenswrapper[4930]: I1012 06:02:04.394714 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2c04de04-91cf-4e56-85be-6a95e92f0d73" containerName="sg-core" containerID="cri-o://1b5ac694c6e912580320b240c2b12c5ce9262d5cb406ab4cacfb06a6d9403406" gracePeriod=30 Oct 12 06:02:04 crc kubenswrapper[4930]: I1012 06:02:04.871201 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 12 06:02:05 crc kubenswrapper[4930]: I1012 06:02:05.111519 4930 generic.go:334] "Generic (PLEG): container finished" podID="2c04de04-91cf-4e56-85be-6a95e92f0d73" containerID="7037d31dfb64aca5d14378c2494a85253062b17dd88418591c82c2dc00193a04" exitCode=0 Oct 12 06:02:05 crc kubenswrapper[4930]: I1012 06:02:05.111557 4930 
generic.go:334] "Generic (PLEG): container finished" podID="2c04de04-91cf-4e56-85be-6a95e92f0d73" containerID="1b5ac694c6e912580320b240c2b12c5ce9262d5cb406ab4cacfb06a6d9403406" exitCode=2 Oct 12 06:02:05 crc kubenswrapper[4930]: I1012 06:02:05.111565 4930 generic.go:334] "Generic (PLEG): container finished" podID="2c04de04-91cf-4e56-85be-6a95e92f0d73" containerID="45c450a08e3387e612e626b73f1913a4bfe0b896d87b1faf8bc9ea2296001eff" exitCode=0 Oct 12 06:02:05 crc kubenswrapper[4930]: I1012 06:02:05.111599 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c04de04-91cf-4e56-85be-6a95e92f0d73","Type":"ContainerDied","Data":"7037d31dfb64aca5d14378c2494a85253062b17dd88418591c82c2dc00193a04"} Oct 12 06:02:05 crc kubenswrapper[4930]: I1012 06:02:05.111623 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c04de04-91cf-4e56-85be-6a95e92f0d73","Type":"ContainerDied","Data":"1b5ac694c6e912580320b240c2b12c5ce9262d5cb406ab4cacfb06a6d9403406"} Oct 12 06:02:05 crc kubenswrapper[4930]: I1012 06:02:05.111632 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c04de04-91cf-4e56-85be-6a95e92f0d73","Type":"ContainerDied","Data":"45c450a08e3387e612e626b73f1913a4bfe0b896d87b1faf8bc9ea2296001eff"} Oct 12 06:02:05 crc kubenswrapper[4930]: I1012 06:02:05.113640 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" event={"ID":"57ae0cc6-55ee-4117-8d32-e82672b2ea46","Type":"ContainerStarted","Data":"394b48bb4d59d15ea85d2840ac217d8379fd9c15434e3fb55aafe3a72687a4f8"} Oct 12 06:02:05 crc kubenswrapper[4930]: I1012 06:02:05.113888 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="35b08013-0d97-4592-b9ab-719b114a7d40" containerName="nova-api-log" containerID="cri-o://9d0d9c8fd3567d311b752575ae1cc2471fc35f9607e5dc17a24ef7061ea4b275" gracePeriod=30 Oct 12 06:02:05 crc kubenswrapper[4930]: I1012 06:02:05.114086 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="35b08013-0d97-4592-b9ab-719b114a7d40" containerName="nova-api-api" containerID="cri-o://d61363c5fd4bf2976d83ab66b183851af2604054624d67858c46fbb89af0e6d8" gracePeriod=30 Oct 12 06:02:05 crc kubenswrapper[4930]: I1012 06:02:05.114246 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" Oct 12 06:02:06 crc kubenswrapper[4930]: I1012 06:02:06.126599 4930 generic.go:334] "Generic (PLEG): container finished" podID="35b08013-0d97-4592-b9ab-719b114a7d40" containerID="d61363c5fd4bf2976d83ab66b183851af2604054624d67858c46fbb89af0e6d8" exitCode=0 Oct 12 06:02:06 crc kubenswrapper[4930]: I1012 06:02:06.126926 4930 generic.go:334] "Generic (PLEG): container finished" podID="35b08013-0d97-4592-b9ab-719b114a7d40" containerID="9d0d9c8fd3567d311b752575ae1cc2471fc35f9607e5dc17a24ef7061ea4b275" exitCode=143 Oct 12 06:02:06 crc kubenswrapper[4930]: I1012 06:02:06.126801 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35b08013-0d97-4592-b9ab-719b114a7d40","Type":"ContainerDied","Data":"d61363c5fd4bf2976d83ab66b183851af2604054624d67858c46fbb89af0e6d8"} Oct 12 06:02:06 crc kubenswrapper[4930]: I1012 06:02:06.127194 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"35b08013-0d97-4592-b9ab-719b114a7d40","Type":"ContainerDied","Data":"9d0d9c8fd3567d311b752575ae1cc2471fc35f9607e5dc17a24ef7061ea4b275"} Oct 12 06:02:06 crc kubenswrapper[4930]: I1012 06:02:06.467456 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 12 06:02:06 crc kubenswrapper[4930]: I1012 06:02:06.491458 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" podStartSLOduration=4.4914370869999996 podStartE2EDuration="4.491437087s" podCreationTimestamp="2025-10-12 06:02:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:02:05.147160612 +0000 UTC m=+1257.689262377" watchObservedRunningTime="2025-10-12 06:02:06.491437087 +0000 UTC m=+1259.033538852" Oct 12 06:02:06 crc kubenswrapper[4930]: I1012 06:02:06.616712 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcjcb\" (UniqueName: \"kubernetes.io/projected/35b08013-0d97-4592-b9ab-719b114a7d40-kube-api-access-mcjcb\") pod \"35b08013-0d97-4592-b9ab-719b114a7d40\" (UID: \"35b08013-0d97-4592-b9ab-719b114a7d40\") " Oct 12 06:02:06 crc kubenswrapper[4930]: I1012 06:02:06.616881 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35b08013-0d97-4592-b9ab-719b114a7d40-config-data\") pod \"35b08013-0d97-4592-b9ab-719b114a7d40\" (UID: \"35b08013-0d97-4592-b9ab-719b114a7d40\") " Oct 12 06:02:06 crc kubenswrapper[4930]: I1012 06:02:06.617099 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35b08013-0d97-4592-b9ab-719b114a7d40-logs\") pod \"35b08013-0d97-4592-b9ab-719b114a7d40\" (UID: \"35b08013-0d97-4592-b9ab-719b114a7d40\") " Oct 12 06:02:06 crc kubenswrapper[4930]: I1012 06:02:06.617186 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b08013-0d97-4592-b9ab-719b114a7d40-combined-ca-bundle\") pod \"35b08013-0d97-4592-b9ab-719b114a7d40\" (UID: \"35b08013-0d97-4592-b9ab-719b114a7d40\") " Oct 12 06:02:06 crc kubenswrapper[4930]: I1012 06:02:06.620386 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35b08013-0d97-4592-b9ab-719b114a7d40-logs" (OuterVolumeSpecName: "logs") pod "35b08013-0d97-4592-b9ab-719b114a7d40" (UID: "35b08013-0d97-4592-b9ab-719b114a7d40"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:02:06 crc kubenswrapper[4930]: I1012 06:02:06.625164 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35b08013-0d97-4592-b9ab-719b114a7d40-kube-api-access-mcjcb" (OuterVolumeSpecName: "kube-api-access-mcjcb") pod "35b08013-0d97-4592-b9ab-719b114a7d40" (UID: "35b08013-0d97-4592-b9ab-719b114a7d40"). InnerVolumeSpecName "kube-api-access-mcjcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:02:06 crc kubenswrapper[4930]: I1012 06:02:06.647061 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35b08013-0d97-4592-b9ab-719b114a7d40-config-data" (OuterVolumeSpecName: "config-data") pod "35b08013-0d97-4592-b9ab-719b114a7d40" (UID: "35b08013-0d97-4592-b9ab-719b114a7d40"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:02:06 crc kubenswrapper[4930]: I1012 06:02:06.659288 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35b08013-0d97-4592-b9ab-719b114a7d40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35b08013-0d97-4592-b9ab-719b114a7d40" (UID: "35b08013-0d97-4592-b9ab-719b114a7d40"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:02:06 crc kubenswrapper[4930]: I1012 06:02:06.719945 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35b08013-0d97-4592-b9ab-719b114a7d40-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:06 crc kubenswrapper[4930]: I1012 06:02:06.719986 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcjcb\" (UniqueName: \"kubernetes.io/projected/35b08013-0d97-4592-b9ab-719b114a7d40-kube-api-access-mcjcb\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:06 crc kubenswrapper[4930]: I1012 06:02:06.719998 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35b08013-0d97-4592-b9ab-719b114a7d40-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:06 crc kubenswrapper[4930]: I1012 06:02:06.720008 4930 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35b08013-0d97-4592-b9ab-719b114a7d40-logs\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.142071 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35b08013-0d97-4592-b9ab-719b114a7d40","Type":"ContainerDied","Data":"9dc1ee23efdbbb0d8207f3b6e99aeb6e864160988cdf068da4b0bb171e132ff8"} Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.142124 4930 scope.go:117] "RemoveContainer" containerID="d61363c5fd4bf2976d83ab66b183851af2604054624d67858c46fbb89af0e6d8" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.142178 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.205349 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.225682 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.230549 4930 scope.go:117] "RemoveContainer" containerID="9d0d9c8fd3567d311b752575ae1cc2471fc35f9607e5dc17a24ef7061ea4b275" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.241358 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 12 06:02:07 crc kubenswrapper[4930]: E1012 06:02:07.241832 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35b08013-0d97-4592-b9ab-719b114a7d40" containerName="nova-api-log" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.241852 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="35b08013-0d97-4592-b9ab-719b114a7d40" containerName="nova-api-log" Oct 12 06:02:07 crc kubenswrapper[4930]: E1012 06:02:07.241884 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35b08013-0d97-4592-b9ab-719b114a7d40" containerName="nova-api-api" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.241894 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="35b08013-0d97-4592-b9ab-719b114a7d40" containerName="nova-api-api" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.242137 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="35b08013-0d97-4592-b9ab-719b114a7d40" containerName="nova-api-api" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.242175 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="35b08013-0d97-4592-b9ab-719b114a7d40" containerName="nova-api-log" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.243544 4930 util.go:30] "No sandbox for pod can be found. 
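[Annotation] The DELETE/REMOVE/ADD cycle above shows that "nova-api-0" is not one pod but a succession of them: the name is stable while the UID changes on every recreation (3e95b56f-... to 35b08013-... to 79d95824-... in this log), which is why cpu_manager and memory_manager purge stale state keyed by the old UID. A hedged client-go sketch for observing the current UID, assuming a kubeconfig at the default location with access to this cluster:

    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        pod, err := cs.CoreV1().Pods("openstack").Get(context.TODO(), "nova-api-0", metav1.GetOptions{})
        if err != nil {
            panic(err)
        }
        // Same name, new identity after each delete/recreate.
        fmt.Println(pod.UID)
    }

The new UID also explains the fresh volume GUIDs in the VerifyControllerAttachedVolume/MountVolume entries that follow.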
Need to start a new one" pod="openstack/nova-api-0" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.249929 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.249929 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.250020 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.255314 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.433406 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79d95824-c8fd-4179-8f31-3a922c4b356d-logs\") pod \"nova-api-0\" (UID: \"79d95824-c8fd-4179-8f31-3a922c4b356d\") " pod="openstack/nova-api-0" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.433492 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79d95824-c8fd-4179-8f31-3a922c4b356d-public-tls-certs\") pod \"nova-api-0\" (UID: \"79d95824-c8fd-4179-8f31-3a922c4b356d\") " pod="openstack/nova-api-0" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.433647 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79d95824-c8fd-4179-8f31-3a922c4b356d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"79d95824-c8fd-4179-8f31-3a922c4b356d\") " pod="openstack/nova-api-0" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.433973 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79d95824-c8fd-4179-8f31-3a922c4b356d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"79d95824-c8fd-4179-8f31-3a922c4b356d\") " pod="openstack/nova-api-0" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.434177 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79d95824-c8fd-4179-8f31-3a922c4b356d-config-data\") pod \"nova-api-0\" (UID: \"79d95824-c8fd-4179-8f31-3a922c4b356d\") " pod="openstack/nova-api-0" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.434318 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtpvg\" (UniqueName: \"kubernetes.io/projected/79d95824-c8fd-4179-8f31-3a922c4b356d-kube-api-access-xtpvg\") pod \"nova-api-0\" (UID: \"79d95824-c8fd-4179-8f31-3a922c4b356d\") " pod="openstack/nova-api-0" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.535979 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79d95824-c8fd-4179-8f31-3a922c4b356d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"79d95824-c8fd-4179-8f31-3a922c4b356d\") " pod="openstack/nova-api-0" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.536266 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79d95824-c8fd-4179-8f31-3a922c4b356d-config-data\") pod \"nova-api-0\" (UID: 
\"79d95824-c8fd-4179-8f31-3a922c4b356d\") " pod="openstack/nova-api-0" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.536368 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtpvg\" (UniqueName: \"kubernetes.io/projected/79d95824-c8fd-4179-8f31-3a922c4b356d-kube-api-access-xtpvg\") pod \"nova-api-0\" (UID: \"79d95824-c8fd-4179-8f31-3a922c4b356d\") " pod="openstack/nova-api-0" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.536460 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79d95824-c8fd-4179-8f31-3a922c4b356d-logs\") pod \"nova-api-0\" (UID: \"79d95824-c8fd-4179-8f31-3a922c4b356d\") " pod="openstack/nova-api-0" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.536562 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79d95824-c8fd-4179-8f31-3a922c4b356d-public-tls-certs\") pod \"nova-api-0\" (UID: \"79d95824-c8fd-4179-8f31-3a922c4b356d\") " pod="openstack/nova-api-0" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.536648 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79d95824-c8fd-4179-8f31-3a922c4b356d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"79d95824-c8fd-4179-8f31-3a922c4b356d\") " pod="openstack/nova-api-0" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.537460 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79d95824-c8fd-4179-8f31-3a922c4b356d-logs\") pod \"nova-api-0\" (UID: \"79d95824-c8fd-4179-8f31-3a922c4b356d\") " pod="openstack/nova-api-0" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.541223 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.543596 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79d95824-c8fd-4179-8f31-3a922c4b356d-config-data\") pod \"nova-api-0\" (UID: \"79d95824-c8fd-4179-8f31-3a922c4b356d\") " pod="openstack/nova-api-0" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.546260 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79d95824-c8fd-4179-8f31-3a922c4b356d-public-tls-certs\") pod \"nova-api-0\" (UID: \"79d95824-c8fd-4179-8f31-3a922c4b356d\") " pod="openstack/nova-api-0" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.547079 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79d95824-c8fd-4179-8f31-3a922c4b356d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"79d95824-c8fd-4179-8f31-3a922c4b356d\") " pod="openstack/nova-api-0" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.583686 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtpvg\" (UniqueName: \"kubernetes.io/projected/79d95824-c8fd-4179-8f31-3a922c4b356d-kube-api-access-xtpvg\") pod \"nova-api-0\" (UID: \"79d95824-c8fd-4179-8f31-3a922c4b356d\") " pod="openstack/nova-api-0" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.583844 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/79d95824-c8fd-4179-8f31-3a922c4b356d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"79d95824-c8fd-4179-8f31-3a922c4b356d\") " pod="openstack/nova-api-0" Oct 12 06:02:07 crc kubenswrapper[4930]: I1012 06:02:07.864555 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 12 06:02:08 crc kubenswrapper[4930]: I1012 06:02:08.154980 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35b08013-0d97-4592-b9ab-719b114a7d40" path="/var/lib/kubelet/pods/35b08013-0d97-4592-b9ab-719b114a7d40/volumes" Oct 12 06:02:08 crc kubenswrapper[4930]: I1012 06:02:08.357495 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 12 06:02:09 crc kubenswrapper[4930]: I1012 06:02:09.162630 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79d95824-c8fd-4179-8f31-3a922c4b356d","Type":"ContainerStarted","Data":"f6dcadf865412a7b36211d6460df037460a89656868813ddc1b10e1d24c23320"} Oct 12 06:02:09 crc kubenswrapper[4930]: I1012 06:02:09.163319 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79d95824-c8fd-4179-8f31-3a922c4b356d","Type":"ContainerStarted","Data":"9e9961dac682168f9f9e593d89620460490258b98012e40a3bfcb1e5e1962936"} Oct 12 06:02:09 crc kubenswrapper[4930]: I1012 06:02:09.163337 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79d95824-c8fd-4179-8f31-3a922c4b356d","Type":"ContainerStarted","Data":"5db1540c50d2ad49fd7aacc0b623e3ac217d1e99c2fc7e3a4319665cb8fb4148"} Oct 12 06:02:09 crc kubenswrapper[4930]: I1012 06:02:09.184586 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.184568962 podStartE2EDuration="2.184568962s" podCreationTimestamp="2025-10-12 06:02:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:02:09.180224083 +0000 UTC m=+1261.722325838" watchObservedRunningTime="2025-10-12 06:02:09.184568962 +0000 UTC m=+1261.726670737" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.022566 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.190889 4930 generic.go:334] "Generic (PLEG): container finished" podID="2c04de04-91cf-4e56-85be-6a95e92f0d73" containerID="1945b8ffd802b593294d77243e90298b6713eed95da1501888c5fe65d73f1636" exitCode=0 Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.191000 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c04de04-91cf-4e56-85be-6a95e92f0d73","Type":"ContainerDied","Data":"1945b8ffd802b593294d77243e90298b6713eed95da1501888c5fe65d73f1636"} Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.191041 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c04de04-91cf-4e56-85be-6a95e92f0d73","Type":"ContainerDied","Data":"f5947b343785f8d5d157cc3f27d3076c391d472e1d9f88aa99b43e969fcb5106"} Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.191060 4930 scope.go:117] "RemoveContainer" containerID="7037d31dfb64aca5d14378c2494a85253062b17dd88418591c82c2dc00193a04" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.191079 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.194229 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-sg-core-conf-yaml\") pod \"2c04de04-91cf-4e56-85be-6a95e92f0d73\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.194305 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-config-data\") pod \"2c04de04-91cf-4e56-85be-6a95e92f0d73\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.194346 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-scripts\") pod \"2c04de04-91cf-4e56-85be-6a95e92f0d73\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.195032 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-ceilometer-tls-certs\") pod \"2c04de04-91cf-4e56-85be-6a95e92f0d73\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.195069 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-combined-ca-bundle\") pod \"2c04de04-91cf-4e56-85be-6a95e92f0d73\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.195144 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c04de04-91cf-4e56-85be-6a95e92f0d73-log-httpd\") pod \"2c04de04-91cf-4e56-85be-6a95e92f0d73\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.195176 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c04de04-91cf-4e56-85be-6a95e92f0d73-run-httpd\") pod \"2c04de04-91cf-4e56-85be-6a95e92f0d73\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.195218 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvzq9\" (UniqueName: \"kubernetes.io/projected/2c04de04-91cf-4e56-85be-6a95e92f0d73-kube-api-access-mvzq9\") pod \"2c04de04-91cf-4e56-85be-6a95e92f0d73\" (UID: \"2c04de04-91cf-4e56-85be-6a95e92f0d73\") " Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.196009 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c04de04-91cf-4e56-85be-6a95e92f0d73-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2c04de04-91cf-4e56-85be-6a95e92f0d73" (UID: "2c04de04-91cf-4e56-85be-6a95e92f0d73"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.196223 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c04de04-91cf-4e56-85be-6a95e92f0d73-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2c04de04-91cf-4e56-85be-6a95e92f0d73" (UID: "2c04de04-91cf-4e56-85be-6a95e92f0d73"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.201987 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c04de04-91cf-4e56-85be-6a95e92f0d73-kube-api-access-mvzq9" (OuterVolumeSpecName: "kube-api-access-mvzq9") pod "2c04de04-91cf-4e56-85be-6a95e92f0d73" (UID: "2c04de04-91cf-4e56-85be-6a95e92f0d73"). InnerVolumeSpecName "kube-api-access-mvzq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.201997 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-scripts" (OuterVolumeSpecName: "scripts") pod "2c04de04-91cf-4e56-85be-6a95e92f0d73" (UID: "2c04de04-91cf-4e56-85be-6a95e92f0d73"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.222443 4930 scope.go:117] "RemoveContainer" containerID="1b5ac694c6e912580320b240c2b12c5ce9262d5cb406ab4cacfb06a6d9403406" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.225159 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2c04de04-91cf-4e56-85be-6a95e92f0d73" (UID: "2c04de04-91cf-4e56-85be-6a95e92f0d73"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.257866 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2c04de04-91cf-4e56-85be-6a95e92f0d73" (UID: "2c04de04-91cf-4e56-85be-6a95e92f0d73"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.292681 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c04de04-91cf-4e56-85be-6a95e92f0d73" (UID: "2c04de04-91cf-4e56-85be-6a95e92f0d73"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.298229 4930 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.298264 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.298273 4930 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.298282 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.298292 4930 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c04de04-91cf-4e56-85be-6a95e92f0d73-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.298300 4930 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c04de04-91cf-4e56-85be-6a95e92f0d73-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.298307 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvzq9\" (UniqueName: \"kubernetes.io/projected/2c04de04-91cf-4e56-85be-6a95e92f0d73-kube-api-access-mvzq9\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.349418 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-config-data" (OuterVolumeSpecName: "config-data") pod "2c04de04-91cf-4e56-85be-6a95e92f0d73" (UID: "2c04de04-91cf-4e56-85be-6a95e92f0d73"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.385914 4930 scope.go:117] "RemoveContainer" containerID="1945b8ffd802b593294d77243e90298b6713eed95da1501888c5fe65d73f1636" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.400194 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c04de04-91cf-4e56-85be-6a95e92f0d73-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.410795 4930 scope.go:117] "RemoveContainer" containerID="45c450a08e3387e612e626b73f1913a4bfe0b896d87b1faf8bc9ea2296001eff" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.432926 4930 scope.go:117] "RemoveContainer" containerID="7037d31dfb64aca5d14378c2494a85253062b17dd88418591c82c2dc00193a04" Oct 12 06:02:10 crc kubenswrapper[4930]: E1012 06:02:10.433332 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7037d31dfb64aca5d14378c2494a85253062b17dd88418591c82c2dc00193a04\": container with ID starting with 7037d31dfb64aca5d14378c2494a85253062b17dd88418591c82c2dc00193a04 not found: ID does not exist" containerID="7037d31dfb64aca5d14378c2494a85253062b17dd88418591c82c2dc00193a04" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.433403 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7037d31dfb64aca5d14378c2494a85253062b17dd88418591c82c2dc00193a04"} err="failed to get container status \"7037d31dfb64aca5d14378c2494a85253062b17dd88418591c82c2dc00193a04\": rpc error: code = NotFound desc = could not find container \"7037d31dfb64aca5d14378c2494a85253062b17dd88418591c82c2dc00193a04\": container with ID starting with 7037d31dfb64aca5d14378c2494a85253062b17dd88418591c82c2dc00193a04 not found: ID does not exist" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.433446 4930 scope.go:117] "RemoveContainer" containerID="1b5ac694c6e912580320b240c2b12c5ce9262d5cb406ab4cacfb06a6d9403406" Oct 12 06:02:10 crc kubenswrapper[4930]: E1012 06:02:10.434038 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b5ac694c6e912580320b240c2b12c5ce9262d5cb406ab4cacfb06a6d9403406\": container with ID starting with 1b5ac694c6e912580320b240c2b12c5ce9262d5cb406ab4cacfb06a6d9403406 not found: ID does not exist" containerID="1b5ac694c6e912580320b240c2b12c5ce9262d5cb406ab4cacfb06a6d9403406" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.434085 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b5ac694c6e912580320b240c2b12c5ce9262d5cb406ab4cacfb06a6d9403406"} err="failed to get container status \"1b5ac694c6e912580320b240c2b12c5ce9262d5cb406ab4cacfb06a6d9403406\": rpc error: code = NotFound desc = could not find container \"1b5ac694c6e912580320b240c2b12c5ce9262d5cb406ab4cacfb06a6d9403406\": container with ID starting with 1b5ac694c6e912580320b240c2b12c5ce9262d5cb406ab4cacfb06a6d9403406 not found: ID does not exist" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.434114 4930 scope.go:117] "RemoveContainer" containerID="1945b8ffd802b593294d77243e90298b6713eed95da1501888c5fe65d73f1636" Oct 12 06:02:10 crc kubenswrapper[4930]: E1012 06:02:10.434399 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1945b8ffd802b593294d77243e90298b6713eed95da1501888c5fe65d73f1636\": container with ID starting with 1945b8ffd802b593294d77243e90298b6713eed95da1501888c5fe65d73f1636 not found: ID does not exist" containerID="1945b8ffd802b593294d77243e90298b6713eed95da1501888c5fe65d73f1636" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.434434 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1945b8ffd802b593294d77243e90298b6713eed95da1501888c5fe65d73f1636"} err="failed to get container status \"1945b8ffd802b593294d77243e90298b6713eed95da1501888c5fe65d73f1636\": rpc error: code = NotFound desc = could not find container \"1945b8ffd802b593294d77243e90298b6713eed95da1501888c5fe65d73f1636\": container with ID starting with 1945b8ffd802b593294d77243e90298b6713eed95da1501888c5fe65d73f1636 not found: ID does not exist" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.434457 4930 scope.go:117] "RemoveContainer" containerID="45c450a08e3387e612e626b73f1913a4bfe0b896d87b1faf8bc9ea2296001eff" Oct 12 06:02:10 crc kubenswrapper[4930]: E1012 06:02:10.434675 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45c450a08e3387e612e626b73f1913a4bfe0b896d87b1faf8bc9ea2296001eff\": container with ID starting with 45c450a08e3387e612e626b73f1913a4bfe0b896d87b1faf8bc9ea2296001eff not found: ID does not exist" containerID="45c450a08e3387e612e626b73f1913a4bfe0b896d87b1faf8bc9ea2296001eff" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.434698 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45c450a08e3387e612e626b73f1913a4bfe0b896d87b1faf8bc9ea2296001eff"} err="failed to get container status \"45c450a08e3387e612e626b73f1913a4bfe0b896d87b1faf8bc9ea2296001eff\": rpc error: code = NotFound desc = could not find container \"45c450a08e3387e612e626b73f1913a4bfe0b896d87b1faf8bc9ea2296001eff\": container with ID starting with 45c450a08e3387e612e626b73f1913a4bfe0b896d87b1faf8bc9ea2296001eff not found: ID does not exist" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.529388 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.539989 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.553549 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:02:10 crc kubenswrapper[4930]: E1012 06:02:10.553969 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c04de04-91cf-4e56-85be-6a95e92f0d73" containerName="ceilometer-notification-agent" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.553986 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c04de04-91cf-4e56-85be-6a95e92f0d73" containerName="ceilometer-notification-agent" Oct 12 06:02:10 crc kubenswrapper[4930]: E1012 06:02:10.554002 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c04de04-91cf-4e56-85be-6a95e92f0d73" containerName="ceilometer-central-agent" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.554007 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c04de04-91cf-4e56-85be-6a95e92f0d73" containerName="ceilometer-central-agent" Oct 12 06:02:10 crc kubenswrapper[4930]: E1012 06:02:10.554018 4930 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2c04de04-91cf-4e56-85be-6a95e92f0d73" containerName="sg-core" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.554024 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c04de04-91cf-4e56-85be-6a95e92f0d73" containerName="sg-core" Oct 12 06:02:10 crc kubenswrapper[4930]: E1012 06:02:10.554038 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c04de04-91cf-4e56-85be-6a95e92f0d73" containerName="proxy-httpd" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.554046 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c04de04-91cf-4e56-85be-6a95e92f0d73" containerName="proxy-httpd" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.554219 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c04de04-91cf-4e56-85be-6a95e92f0d73" containerName="sg-core" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.554232 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c04de04-91cf-4e56-85be-6a95e92f0d73" containerName="ceilometer-notification-agent" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.554249 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c04de04-91cf-4e56-85be-6a95e92f0d73" containerName="ceilometer-central-agent" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.554271 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c04de04-91cf-4e56-85be-6a95e92f0d73" containerName="proxy-httpd" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.556237 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.558479 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.558672 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.562065 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.568005 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.705803 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2jvj\" (UniqueName: \"kubernetes.io/projected/1d036de3-1a99-408d-8677-978880f41705-kube-api-access-t2jvj\") pod \"ceilometer-0\" (UID: \"1d036de3-1a99-408d-8677-978880f41705\") " pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.705860 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d036de3-1a99-408d-8677-978880f41705-run-httpd\") pod \"ceilometer-0\" (UID: \"1d036de3-1a99-408d-8677-978880f41705\") " pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.705931 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d036de3-1a99-408d-8677-978880f41705-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1d036de3-1a99-408d-8677-978880f41705\") " pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.705955 4930 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d036de3-1a99-408d-8677-978880f41705-config-data\") pod \"ceilometer-0\" (UID: \"1d036de3-1a99-408d-8677-978880f41705\") " pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.705993 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d036de3-1a99-408d-8677-978880f41705-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1d036de3-1a99-408d-8677-978880f41705\") " pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.706023 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d036de3-1a99-408d-8677-978880f41705-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1d036de3-1a99-408d-8677-978880f41705\") " pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.706053 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d036de3-1a99-408d-8677-978880f41705-log-httpd\") pod \"ceilometer-0\" (UID: \"1d036de3-1a99-408d-8677-978880f41705\") " pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.706084 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d036de3-1a99-408d-8677-978880f41705-scripts\") pod \"ceilometer-0\" (UID: \"1d036de3-1a99-408d-8677-978880f41705\") " pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.808322 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d036de3-1a99-408d-8677-978880f41705-log-httpd\") pod \"ceilometer-0\" (UID: \"1d036de3-1a99-408d-8677-978880f41705\") " pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.808825 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d036de3-1a99-408d-8677-978880f41705-scripts\") pod \"ceilometer-0\" (UID: \"1d036de3-1a99-408d-8677-978880f41705\") " pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.808856 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d036de3-1a99-408d-8677-978880f41705-log-httpd\") pod \"ceilometer-0\" (UID: \"1d036de3-1a99-408d-8677-978880f41705\") " pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.809210 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2jvj\" (UniqueName: \"kubernetes.io/projected/1d036de3-1a99-408d-8677-978880f41705-kube-api-access-t2jvj\") pod \"ceilometer-0\" (UID: \"1d036de3-1a99-408d-8677-978880f41705\") " pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.809396 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d036de3-1a99-408d-8677-978880f41705-run-httpd\") pod \"ceilometer-0\" (UID: \"1d036de3-1a99-408d-8677-978880f41705\") " pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 
06:02:10.809638 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d036de3-1a99-408d-8677-978880f41705-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1d036de3-1a99-408d-8677-978880f41705\") " pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.809818 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d036de3-1a99-408d-8677-978880f41705-config-data\") pod \"ceilometer-0\" (UID: \"1d036de3-1a99-408d-8677-978880f41705\") " pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.809860 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d036de3-1a99-408d-8677-978880f41705-run-httpd\") pod \"ceilometer-0\" (UID: \"1d036de3-1a99-408d-8677-978880f41705\") " pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.810699 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d036de3-1a99-408d-8677-978880f41705-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1d036de3-1a99-408d-8677-978880f41705\") " pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.811020 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d036de3-1a99-408d-8677-978880f41705-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1d036de3-1a99-408d-8677-978880f41705\") " pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.814435 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d036de3-1a99-408d-8677-978880f41705-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1d036de3-1a99-408d-8677-978880f41705\") " pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.815039 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d036de3-1a99-408d-8677-978880f41705-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1d036de3-1a99-408d-8677-978880f41705\") " pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.815525 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d036de3-1a99-408d-8677-978880f41705-scripts\") pod \"ceilometer-0\" (UID: \"1d036de3-1a99-408d-8677-978880f41705\") " pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.816177 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d036de3-1a99-408d-8677-978880f41705-config-data\") pod \"ceilometer-0\" (UID: \"1d036de3-1a99-408d-8677-978880f41705\") " pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.817113 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d036de3-1a99-408d-8677-978880f41705-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1d036de3-1a99-408d-8677-978880f41705\") " pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.828565 4930 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2jvj\" (UniqueName: \"kubernetes.io/projected/1d036de3-1a99-408d-8677-978880f41705-kube-api-access-t2jvj\") pod \"ceilometer-0\" (UID: \"1d036de3-1a99-408d-8677-978880f41705\") " pod="openstack/ceilometer-0" Oct 12 06:02:10 crc kubenswrapper[4930]: I1012 06:02:10.881515 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 12 06:02:11 crc kubenswrapper[4930]: I1012 06:02:11.350694 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 12 06:02:12 crc kubenswrapper[4930]: I1012 06:02:12.149469 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c04de04-91cf-4e56-85be-6a95e92f0d73" path="/var/lib/kubelet/pods/2c04de04-91cf-4e56-85be-6a95e92f0d73/volumes" Oct 12 06:02:12 crc kubenswrapper[4930]: I1012 06:02:12.220057 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d036de3-1a99-408d-8677-978880f41705","Type":"ContainerStarted","Data":"0d86381767722fb734fb255eaf8afd0cdc9e857b371e8c54ef462e577ec5709a"} Oct 12 06:02:12 crc kubenswrapper[4930]: I1012 06:02:12.220125 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d036de3-1a99-408d-8677-978880f41705","Type":"ContainerStarted","Data":"4f696a5077c72d3c36cb89cb77415456d397a5bbacabdab228466238069d32d1"} Oct 12 06:02:12 crc kubenswrapper[4930]: I1012 06:02:12.541240 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:02:12 crc kubenswrapper[4930]: I1012 06:02:12.569897 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:02:12 crc kubenswrapper[4930]: I1012 06:02:12.668001 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" Oct 12 06:02:12 crc kubenswrapper[4930]: I1012 06:02:12.838052 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-844fc57f6f-7cfwk"] Oct 12 06:02:12 crc kubenswrapper[4930]: I1012 06:02:12.839610 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" podUID="159b117d-1ad6-4bb3-a748-3946f54ca207" containerName="dnsmasq-dns" containerID="cri-o://6679c6b163fbeb8e483b159bf04e6c8bd343318b3008456e6fe5ccd838e2f13e" gracePeriod=10 Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.237659 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d036de3-1a99-408d-8677-978880f41705","Type":"ContainerStarted","Data":"515657ce81c0f6c5f7388e27e73b26edbdc5eb25f192e90b21c6be26047223aa"} Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.257811 4930 generic.go:334] "Generic (PLEG): container finished" podID="159b117d-1ad6-4bb3-a748-3946f54ca207" containerID="6679c6b163fbeb8e483b159bf04e6c8bd343318b3008456e6fe5ccd838e2f13e" exitCode=0 Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.257945 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" event={"ID":"159b117d-1ad6-4bb3-a748-3946f54ca207","Type":"ContainerDied","Data":"6679c6b163fbeb8e483b159bf04e6c8bd343318b3008456e6fe5ccd838e2f13e"} Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.283784 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-cell1-novncproxy-0" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.427894 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.431062 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-mnnd5"] Oct 12 06:02:13 crc kubenswrapper[4930]: E1012 06:02:13.431681 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="159b117d-1ad6-4bb3-a748-3946f54ca207" containerName="init" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.431697 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="159b117d-1ad6-4bb3-a748-3946f54ca207" containerName="init" Oct 12 06:02:13 crc kubenswrapper[4930]: E1012 06:02:13.431753 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="159b117d-1ad6-4bb3-a748-3946f54ca207" containerName="dnsmasq-dns" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.431761 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="159b117d-1ad6-4bb3-a748-3946f54ca207" containerName="dnsmasq-dns" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.431963 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="159b117d-1ad6-4bb3-a748-3946f54ca207" containerName="dnsmasq-dns" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.432623 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mnnd5" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.439150 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.439672 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.455248 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mnnd5"] Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.569128 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-dns-swift-storage-0\") pod \"159b117d-1ad6-4bb3-a748-3946f54ca207\" (UID: \"159b117d-1ad6-4bb3-a748-3946f54ca207\") " Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.569204 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-dns-svc\") pod \"159b117d-1ad6-4bb3-a748-3946f54ca207\" (UID: \"159b117d-1ad6-4bb3-a748-3946f54ca207\") " Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.569234 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-ovsdbserver-sb\") pod \"159b117d-1ad6-4bb3-a748-3946f54ca207\" (UID: \"159b117d-1ad6-4bb3-a748-3946f54ca207\") " Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.569299 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8mjl\" (UniqueName: \"kubernetes.io/projected/159b117d-1ad6-4bb3-a748-3946f54ca207-kube-api-access-r8mjl\") pod \"159b117d-1ad6-4bb3-a748-3946f54ca207\" (UID: \"159b117d-1ad6-4bb3-a748-3946f54ca207\") " Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.569336 4930 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-config\") pod \"159b117d-1ad6-4bb3-a748-3946f54ca207\" (UID: \"159b117d-1ad6-4bb3-a748-3946f54ca207\") " Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.569357 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-ovsdbserver-nb\") pod \"159b117d-1ad6-4bb3-a748-3946f54ca207\" (UID: \"159b117d-1ad6-4bb3-a748-3946f54ca207\") " Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.570464 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7444\" (UniqueName: \"kubernetes.io/projected/c2f80483-5b42-4fa4-8013-f2a8fd535c9e-kube-api-access-v7444\") pod \"nova-cell1-cell-mapping-mnnd5\" (UID: \"c2f80483-5b42-4fa4-8013-f2a8fd535c9e\") " pod="openstack/nova-cell1-cell-mapping-mnnd5" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.570518 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2f80483-5b42-4fa4-8013-f2a8fd535c9e-scripts\") pod \"nova-cell1-cell-mapping-mnnd5\" (UID: \"c2f80483-5b42-4fa4-8013-f2a8fd535c9e\") " pod="openstack/nova-cell1-cell-mapping-mnnd5" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.570689 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f80483-5b42-4fa4-8013-f2a8fd535c9e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mnnd5\" (UID: \"c2f80483-5b42-4fa4-8013-f2a8fd535c9e\") " pod="openstack/nova-cell1-cell-mapping-mnnd5" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.570889 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2f80483-5b42-4fa4-8013-f2a8fd535c9e-config-data\") pod \"nova-cell1-cell-mapping-mnnd5\" (UID: \"c2f80483-5b42-4fa4-8013-f2a8fd535c9e\") " pod="openstack/nova-cell1-cell-mapping-mnnd5" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.576878 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/159b117d-1ad6-4bb3-a748-3946f54ca207-kube-api-access-r8mjl" (OuterVolumeSpecName: "kube-api-access-r8mjl") pod "159b117d-1ad6-4bb3-a748-3946f54ca207" (UID: "159b117d-1ad6-4bb3-a748-3946f54ca207"). InnerVolumeSpecName "kube-api-access-r8mjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.620500 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "159b117d-1ad6-4bb3-a748-3946f54ca207" (UID: "159b117d-1ad6-4bb3-a748-3946f54ca207"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.627329 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "159b117d-1ad6-4bb3-a748-3946f54ca207" (UID: "159b117d-1ad6-4bb3-a748-3946f54ca207"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.631184 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "159b117d-1ad6-4bb3-a748-3946f54ca207" (UID: "159b117d-1ad6-4bb3-a748-3946f54ca207"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.639151 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-config" (OuterVolumeSpecName: "config") pod "159b117d-1ad6-4bb3-a748-3946f54ca207" (UID: "159b117d-1ad6-4bb3-a748-3946f54ca207"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.640813 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "159b117d-1ad6-4bb3-a748-3946f54ca207" (UID: "159b117d-1ad6-4bb3-a748-3946f54ca207"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.672369 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2f80483-5b42-4fa4-8013-f2a8fd535c9e-config-data\") pod \"nova-cell1-cell-mapping-mnnd5\" (UID: \"c2f80483-5b42-4fa4-8013-f2a8fd535c9e\") " pod="openstack/nova-cell1-cell-mapping-mnnd5" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.672645 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7444\" (UniqueName: \"kubernetes.io/projected/c2f80483-5b42-4fa4-8013-f2a8fd535c9e-kube-api-access-v7444\") pod \"nova-cell1-cell-mapping-mnnd5\" (UID: \"c2f80483-5b42-4fa4-8013-f2a8fd535c9e\") " pod="openstack/nova-cell1-cell-mapping-mnnd5" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.672679 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2f80483-5b42-4fa4-8013-f2a8fd535c9e-scripts\") pod \"nova-cell1-cell-mapping-mnnd5\" (UID: \"c2f80483-5b42-4fa4-8013-f2a8fd535c9e\") " pod="openstack/nova-cell1-cell-mapping-mnnd5" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.672798 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f80483-5b42-4fa4-8013-f2a8fd535c9e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mnnd5\" (UID: \"c2f80483-5b42-4fa4-8013-f2a8fd535c9e\") " pod="openstack/nova-cell1-cell-mapping-mnnd5" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.672860 4930 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.672877 4930 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.672886 4930 reconciler_common.go:293] "Volume 
detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.672895 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8mjl\" (UniqueName: \"kubernetes.io/projected/159b117d-1ad6-4bb3-a748-3946f54ca207-kube-api-access-r8mjl\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.672906 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-config\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.672914 4930 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/159b117d-1ad6-4bb3-a748-3946f54ca207-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.676722 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2f80483-5b42-4fa4-8013-f2a8fd535c9e-scripts\") pod \"nova-cell1-cell-mapping-mnnd5\" (UID: \"c2f80483-5b42-4fa4-8013-f2a8fd535c9e\") " pod="openstack/nova-cell1-cell-mapping-mnnd5" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.679932 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f80483-5b42-4fa4-8013-f2a8fd535c9e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mnnd5\" (UID: \"c2f80483-5b42-4fa4-8013-f2a8fd535c9e\") " pod="openstack/nova-cell1-cell-mapping-mnnd5" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.681361 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2f80483-5b42-4fa4-8013-f2a8fd535c9e-config-data\") pod \"nova-cell1-cell-mapping-mnnd5\" (UID: \"c2f80483-5b42-4fa4-8013-f2a8fd535c9e\") " pod="openstack/nova-cell1-cell-mapping-mnnd5" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.689274 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7444\" (UniqueName: \"kubernetes.io/projected/c2f80483-5b42-4fa4-8013-f2a8fd535c9e-kube-api-access-v7444\") pod \"nova-cell1-cell-mapping-mnnd5\" (UID: \"c2f80483-5b42-4fa4-8013-f2a8fd535c9e\") " pod="openstack/nova-cell1-cell-mapping-mnnd5" Oct 12 06:02:13 crc kubenswrapper[4930]: I1012 06:02:13.755807 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mnnd5" Oct 12 06:02:14 crc kubenswrapper[4930]: I1012 06:02:14.234856 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mnnd5"] Oct 12 06:02:14 crc kubenswrapper[4930]: I1012 06:02:14.275902 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mnnd5" event={"ID":"c2f80483-5b42-4fa4-8013-f2a8fd535c9e","Type":"ContainerStarted","Data":"0152f8b583988e5a1012fc48933953221aa6b87161dfce38001897fd96a8d782"} Oct 12 06:02:14 crc kubenswrapper[4930]: I1012 06:02:14.277730 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d036de3-1a99-408d-8677-978880f41705","Type":"ContainerStarted","Data":"e86e8377f7c7e129536406aaf5f2cf8f42faecb9aac99495f7d9483664ea63d8"} Oct 12 06:02:14 crc kubenswrapper[4930]: I1012 06:02:14.281886 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" Oct 12 06:02:14 crc kubenswrapper[4930]: I1012 06:02:14.282050 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844fc57f6f-7cfwk" event={"ID":"159b117d-1ad6-4bb3-a748-3946f54ca207","Type":"ContainerDied","Data":"ff4cf649b495608a58c66506264ad553c97d977b26353de5d5b1b431201bad9f"} Oct 12 06:02:14 crc kubenswrapper[4930]: I1012 06:02:14.282123 4930 scope.go:117] "RemoveContainer" containerID="6679c6b163fbeb8e483b159bf04e6c8bd343318b3008456e6fe5ccd838e2f13e" Oct 12 06:02:14 crc kubenswrapper[4930]: I1012 06:02:14.390941 4930 scope.go:117] "RemoveContainer" containerID="32e7cbedc200dd4f84cfe55abea58ad69a7e33232a477b7d745ae332d8deba01" Oct 12 06:02:14 crc kubenswrapper[4930]: I1012 06:02:14.419312 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-844fc57f6f-7cfwk"] Oct 12 06:02:14 crc kubenswrapper[4930]: I1012 06:02:14.437672 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-844fc57f6f-7cfwk"] Oct 12 06:02:15 crc kubenswrapper[4930]: I1012 06:02:15.298198 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d036de3-1a99-408d-8677-978880f41705","Type":"ContainerStarted","Data":"2b6a0c15ab286e34830e98ba543b3e187771d6d1dbcd41888991b136a5cac88c"} Oct 12 06:02:15 crc kubenswrapper[4930]: I1012 06:02:15.298420 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 12 06:02:15 crc kubenswrapper[4930]: I1012 06:02:15.302926 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mnnd5" event={"ID":"c2f80483-5b42-4fa4-8013-f2a8fd535c9e","Type":"ContainerStarted","Data":"c2e9c3a7da9bb6b0c961c0d94ff115cebf02d88855e7f1e38799f8dde8387931"} Oct 12 06:02:15 crc kubenswrapper[4930]: I1012 06:02:15.354534 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.238358019 podStartE2EDuration="5.354504827s" podCreationTimestamp="2025-10-12 06:02:10 +0000 UTC" firstStartedPulling="2025-10-12 06:02:11.354448097 +0000 UTC m=+1263.896549892" lastFinishedPulling="2025-10-12 06:02:14.470594935 +0000 UTC m=+1267.012696700" observedRunningTime="2025-10-12 06:02:15.327842724 +0000 UTC m=+1267.869944489" watchObservedRunningTime="2025-10-12 06:02:15.354504827 +0000 UTC m=+1267.896606632" Oct 12 06:02:15 crc kubenswrapper[4930]: I1012 06:02:15.367821 4930 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/nova-cell1-cell-mapping-mnnd5" podStartSLOduration=2.367797458 podStartE2EDuration="2.367797458s" podCreationTimestamp="2025-10-12 06:02:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:02:15.351702377 +0000 UTC m=+1267.893804142" watchObservedRunningTime="2025-10-12 06:02:15.367797458 +0000 UTC m=+1267.909899263" Oct 12 06:02:16 crc kubenswrapper[4930]: I1012 06:02:16.158705 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="159b117d-1ad6-4bb3-a748-3946f54ca207" path="/var/lib/kubelet/pods/159b117d-1ad6-4bb3-a748-3946f54ca207/volumes" Oct 12 06:02:17 crc kubenswrapper[4930]: I1012 06:02:17.866393 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 12 06:02:17 crc kubenswrapper[4930]: I1012 06:02:17.868127 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 12 06:02:18 crc kubenswrapper[4930]: I1012 06:02:18.877977 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="79d95824-c8fd-4179-8f31-3a922c4b356d" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.224:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 12 06:02:18 crc kubenswrapper[4930]: I1012 06:02:18.878569 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="79d95824-c8fd-4179-8f31-3a922c4b356d" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.224:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 12 06:02:19 crc kubenswrapper[4930]: I1012 06:02:19.351351 4930 generic.go:334] "Generic (PLEG): container finished" podID="c2f80483-5b42-4fa4-8013-f2a8fd535c9e" containerID="c2e9c3a7da9bb6b0c961c0d94ff115cebf02d88855e7f1e38799f8dde8387931" exitCode=0 Oct 12 06:02:19 crc kubenswrapper[4930]: I1012 06:02:19.351413 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mnnd5" event={"ID":"c2f80483-5b42-4fa4-8013-f2a8fd535c9e","Type":"ContainerDied","Data":"c2e9c3a7da9bb6b0c961c0d94ff115cebf02d88855e7f1e38799f8dde8387931"} Oct 12 06:02:20 crc kubenswrapper[4930]: I1012 06:02:20.819795 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mnnd5" Oct 12 06:02:20 crc kubenswrapper[4930]: I1012 06:02:20.966818 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7444\" (UniqueName: \"kubernetes.io/projected/c2f80483-5b42-4fa4-8013-f2a8fd535c9e-kube-api-access-v7444\") pod \"c2f80483-5b42-4fa4-8013-f2a8fd535c9e\" (UID: \"c2f80483-5b42-4fa4-8013-f2a8fd535c9e\") " Oct 12 06:02:20 crc kubenswrapper[4930]: I1012 06:02:20.967110 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2f80483-5b42-4fa4-8013-f2a8fd535c9e-config-data\") pod \"c2f80483-5b42-4fa4-8013-f2a8fd535c9e\" (UID: \"c2f80483-5b42-4fa4-8013-f2a8fd535c9e\") " Oct 12 06:02:20 crc kubenswrapper[4930]: I1012 06:02:20.967163 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f80483-5b42-4fa4-8013-f2a8fd535c9e-combined-ca-bundle\") pod \"c2f80483-5b42-4fa4-8013-f2a8fd535c9e\" (UID: \"c2f80483-5b42-4fa4-8013-f2a8fd535c9e\") " Oct 12 06:02:20 crc kubenswrapper[4930]: I1012 06:02:20.967401 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2f80483-5b42-4fa4-8013-f2a8fd535c9e-scripts\") pod \"c2f80483-5b42-4fa4-8013-f2a8fd535c9e\" (UID: \"c2f80483-5b42-4fa4-8013-f2a8fd535c9e\") " Oct 12 06:02:20 crc kubenswrapper[4930]: I1012 06:02:20.974123 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2f80483-5b42-4fa4-8013-f2a8fd535c9e-kube-api-access-v7444" (OuterVolumeSpecName: "kube-api-access-v7444") pod "c2f80483-5b42-4fa4-8013-f2a8fd535c9e" (UID: "c2f80483-5b42-4fa4-8013-f2a8fd535c9e"). InnerVolumeSpecName "kube-api-access-v7444". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:02:20 crc kubenswrapper[4930]: I1012 06:02:20.974971 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f80483-5b42-4fa4-8013-f2a8fd535c9e-scripts" (OuterVolumeSpecName: "scripts") pod "c2f80483-5b42-4fa4-8013-f2a8fd535c9e" (UID: "c2f80483-5b42-4fa4-8013-f2a8fd535c9e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:02:21 crc kubenswrapper[4930]: I1012 06:02:21.000689 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f80483-5b42-4fa4-8013-f2a8fd535c9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2f80483-5b42-4fa4-8013-f2a8fd535c9e" (UID: "c2f80483-5b42-4fa4-8013-f2a8fd535c9e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:02:21 crc kubenswrapper[4930]: I1012 06:02:21.008852 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f80483-5b42-4fa4-8013-f2a8fd535c9e-config-data" (OuterVolumeSpecName: "config-data") pod "c2f80483-5b42-4fa4-8013-f2a8fd535c9e" (UID: "c2f80483-5b42-4fa4-8013-f2a8fd535c9e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:02:21 crc kubenswrapper[4930]: I1012 06:02:21.069874 4930 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2f80483-5b42-4fa4-8013-f2a8fd535c9e-scripts\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:21 crc kubenswrapper[4930]: I1012 06:02:21.069924 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7444\" (UniqueName: \"kubernetes.io/projected/c2f80483-5b42-4fa4-8013-f2a8fd535c9e-kube-api-access-v7444\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:21 crc kubenswrapper[4930]: I1012 06:02:21.069946 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2f80483-5b42-4fa4-8013-f2a8fd535c9e-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:21 crc kubenswrapper[4930]: I1012 06:02:21.069966 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f80483-5b42-4fa4-8013-f2a8fd535c9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:21 crc kubenswrapper[4930]: I1012 06:02:21.375604 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mnnd5" event={"ID":"c2f80483-5b42-4fa4-8013-f2a8fd535c9e","Type":"ContainerDied","Data":"0152f8b583988e5a1012fc48933953221aa6b87161dfce38001897fd96a8d782"} Oct 12 06:02:21 crc kubenswrapper[4930]: I1012 06:02:21.375653 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0152f8b583988e5a1012fc48933953221aa6b87161dfce38001897fd96a8d782" Oct 12 06:02:21 crc kubenswrapper[4930]: I1012 06:02:21.375705 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mnnd5" Oct 12 06:02:21 crc kubenswrapper[4930]: I1012 06:02:21.598914 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 06:02:21 crc kubenswrapper[4930]: I1012 06:02:21.599276 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="7a446391-253a-4715-bdf8-709b778e76eb" containerName="nova-scheduler-scheduler" containerID="cri-o://d42d56e1d7e3a0415895fd9f006c566a38f260bb3157b826f2b9b73cadc99a04" gracePeriod=30 Oct 12 06:02:21 crc kubenswrapper[4930]: I1012 06:02:21.616066 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 12 06:02:21 crc kubenswrapper[4930]: I1012 06:02:21.616368 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="79d95824-c8fd-4179-8f31-3a922c4b356d" containerName="nova-api-log" containerID="cri-o://9e9961dac682168f9f9e593d89620460490258b98012e40a3bfcb1e5e1962936" gracePeriod=30 Oct 12 06:02:21 crc kubenswrapper[4930]: I1012 06:02:21.616795 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="79d95824-c8fd-4179-8f31-3a922c4b356d" containerName="nova-api-api" containerID="cri-o://f6dcadf865412a7b36211d6460df037460a89656868813ddc1b10e1d24c23320" gracePeriod=30 Oct 12 06:02:21 crc kubenswrapper[4930]: I1012 06:02:21.629465 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 06:02:21 crc kubenswrapper[4930]: I1012 06:02:21.629691 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0fb6258b-6691-4a25-b22c-ad2a5a03b167" 
containerName="nova-metadata-log" containerID="cri-o://3f46f37403747a9f51f09e17860efbd5c1d75c8431788d8e2976f9ca29614b40" gracePeriod=30 Oct 12 06:02:21 crc kubenswrapper[4930]: I1012 06:02:21.629848 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0fb6258b-6691-4a25-b22c-ad2a5a03b167" containerName="nova-metadata-metadata" containerID="cri-o://c829134520a8f961827096f58e320507bb1e104b0d830f6775dc1b17997cb132" gracePeriod=30 Oct 12 06:02:21 crc kubenswrapper[4930]: E1012 06:02:21.836021 4930 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fb6258b_6691_4a25_b22c_ad2a5a03b167.slice/crio-3f46f37403747a9f51f09e17860efbd5c1d75c8431788d8e2976f9ca29614b40.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79d95824_c8fd_4179_8f31_3a922c4b356d.slice/crio-conmon-9e9961dac682168f9f9e593d89620460490258b98012e40a3bfcb1e5e1962936.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79d95824_c8fd_4179_8f31_3a922c4b356d.slice/crio-9e9961dac682168f9f9e593d89620460490258b98012e40a3bfcb1e5e1962936.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fb6258b_6691_4a25_b22c_ad2a5a03b167.slice/crio-conmon-3f46f37403747a9f51f09e17860efbd5c1d75c8431788d8e2976f9ca29614b40.scope\": RecentStats: unable to find data in memory cache]" Oct 12 06:02:22 crc kubenswrapper[4930]: I1012 06:02:22.392727 4930 generic.go:334] "Generic (PLEG): container finished" podID="79d95824-c8fd-4179-8f31-3a922c4b356d" containerID="9e9961dac682168f9f9e593d89620460490258b98012e40a3bfcb1e5e1962936" exitCode=143 Oct 12 06:02:22 crc kubenswrapper[4930]: I1012 06:02:22.393465 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79d95824-c8fd-4179-8f31-3a922c4b356d","Type":"ContainerDied","Data":"9e9961dac682168f9f9e593d89620460490258b98012e40a3bfcb1e5e1962936"} Oct 12 06:02:22 crc kubenswrapper[4930]: I1012 06:02:22.396440 4930 generic.go:334] "Generic (PLEG): container finished" podID="0fb6258b-6691-4a25-b22c-ad2a5a03b167" containerID="3f46f37403747a9f51f09e17860efbd5c1d75c8431788d8e2976f9ca29614b40" exitCode=143 Oct 12 06:02:22 crc kubenswrapper[4930]: I1012 06:02:22.396471 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0fb6258b-6691-4a25-b22c-ad2a5a03b167","Type":"ContainerDied","Data":"3f46f37403747a9f51f09e17860efbd5c1d75c8431788d8e2976f9ca29614b40"} Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.052201 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.058713 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.210367 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb6258b-6691-4a25-b22c-ad2a5a03b167-config-data\") pod \"0fb6258b-6691-4a25-b22c-ad2a5a03b167\" (UID: \"0fb6258b-6691-4a25-b22c-ad2a5a03b167\") " Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.210439 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79d95824-c8fd-4179-8f31-3a922c4b356d-config-data\") pod \"79d95824-c8fd-4179-8f31-3a922c4b356d\" (UID: \"79d95824-c8fd-4179-8f31-3a922c4b356d\") " Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.210470 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fb6258b-6691-4a25-b22c-ad2a5a03b167-nova-metadata-tls-certs\") pod \"0fb6258b-6691-4a25-b22c-ad2a5a03b167\" (UID: \"0fb6258b-6691-4a25-b22c-ad2a5a03b167\") " Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.210496 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fb6258b-6691-4a25-b22c-ad2a5a03b167-logs\") pod \"0fb6258b-6691-4a25-b22c-ad2a5a03b167\" (UID: \"0fb6258b-6691-4a25-b22c-ad2a5a03b167\") " Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.210547 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtpvg\" (UniqueName: \"kubernetes.io/projected/79d95824-c8fd-4179-8f31-3a922c4b356d-kube-api-access-xtpvg\") pod \"79d95824-c8fd-4179-8f31-3a922c4b356d\" (UID: \"79d95824-c8fd-4179-8f31-3a922c4b356d\") " Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.210587 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79d95824-c8fd-4179-8f31-3a922c4b356d-internal-tls-certs\") pod \"79d95824-c8fd-4179-8f31-3a922c4b356d\" (UID: \"79d95824-c8fd-4179-8f31-3a922c4b356d\") " Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.210638 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79d95824-c8fd-4179-8f31-3a922c4b356d-public-tls-certs\") pod \"79d95824-c8fd-4179-8f31-3a922c4b356d\" (UID: \"79d95824-c8fd-4179-8f31-3a922c4b356d\") " Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.210723 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79d95824-c8fd-4179-8f31-3a922c4b356d-logs\") pod \"79d95824-c8fd-4179-8f31-3a922c4b356d\" (UID: \"79d95824-c8fd-4179-8f31-3a922c4b356d\") " Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.210772 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r4lh\" (UniqueName: \"kubernetes.io/projected/0fb6258b-6691-4a25-b22c-ad2a5a03b167-kube-api-access-4r4lh\") pod \"0fb6258b-6691-4a25-b22c-ad2a5a03b167\" (UID: \"0fb6258b-6691-4a25-b22c-ad2a5a03b167\") " Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.210835 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79d95824-c8fd-4179-8f31-3a922c4b356d-combined-ca-bundle\") pod \"79d95824-c8fd-4179-8f31-3a922c4b356d\" (UID: 
\"79d95824-c8fd-4179-8f31-3a922c4b356d\") " Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.211377 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb6258b-6691-4a25-b22c-ad2a5a03b167-combined-ca-bundle\") pod \"0fb6258b-6691-4a25-b22c-ad2a5a03b167\" (UID: \"0fb6258b-6691-4a25-b22c-ad2a5a03b167\") " Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.211489 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79d95824-c8fd-4179-8f31-3a922c4b356d-logs" (OuterVolumeSpecName: "logs") pod "79d95824-c8fd-4179-8f31-3a922c4b356d" (UID: "79d95824-c8fd-4179-8f31-3a922c4b356d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.212248 4930 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79d95824-c8fd-4179-8f31-3a922c4b356d-logs\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.212374 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fb6258b-6691-4a25-b22c-ad2a5a03b167-logs" (OuterVolumeSpecName: "logs") pod "0fb6258b-6691-4a25-b22c-ad2a5a03b167" (UID: "0fb6258b-6691-4a25-b22c-ad2a5a03b167"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.235985 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fb6258b-6691-4a25-b22c-ad2a5a03b167-kube-api-access-4r4lh" (OuterVolumeSpecName: "kube-api-access-4r4lh") pod "0fb6258b-6691-4a25-b22c-ad2a5a03b167" (UID: "0fb6258b-6691-4a25-b22c-ad2a5a03b167"). InnerVolumeSpecName "kube-api-access-4r4lh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.243135 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79d95824-c8fd-4179-8f31-3a922c4b356d-kube-api-access-xtpvg" (OuterVolumeSpecName: "kube-api-access-xtpvg") pod "79d95824-c8fd-4179-8f31-3a922c4b356d" (UID: "79d95824-c8fd-4179-8f31-3a922c4b356d"). InnerVolumeSpecName "kube-api-access-xtpvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.256877 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79d95824-c8fd-4179-8f31-3a922c4b356d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79d95824-c8fd-4179-8f31-3a922c4b356d" (UID: "79d95824-c8fd-4179-8f31-3a922c4b356d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.261807 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79d95824-c8fd-4179-8f31-3a922c4b356d-config-data" (OuterVolumeSpecName: "config-data") pod "79d95824-c8fd-4179-8f31-3a922c4b356d" (UID: "79d95824-c8fd-4179-8f31-3a922c4b356d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.281607 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb6258b-6691-4a25-b22c-ad2a5a03b167-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0fb6258b-6691-4a25-b22c-ad2a5a03b167" (UID: "0fb6258b-6691-4a25-b22c-ad2a5a03b167"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.294332 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb6258b-6691-4a25-b22c-ad2a5a03b167-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0fb6258b-6691-4a25-b22c-ad2a5a03b167" (UID: "0fb6258b-6691-4a25-b22c-ad2a5a03b167"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.295000 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb6258b-6691-4a25-b22c-ad2a5a03b167-config-data" (OuterVolumeSpecName: "config-data") pod "0fb6258b-6691-4a25-b22c-ad2a5a03b167" (UID: "0fb6258b-6691-4a25-b22c-ad2a5a03b167"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.314026 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79d95824-c8fd-4179-8f31-3a922c4b356d-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.314073 4930 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fb6258b-6691-4a25-b22c-ad2a5a03b167-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.314090 4930 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fb6258b-6691-4a25-b22c-ad2a5a03b167-logs\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.314102 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtpvg\" (UniqueName: \"kubernetes.io/projected/79d95824-c8fd-4179-8f31-3a922c4b356d-kube-api-access-xtpvg\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.314114 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r4lh\" (UniqueName: \"kubernetes.io/projected/0fb6258b-6691-4a25-b22c-ad2a5a03b167-kube-api-access-4r4lh\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.314127 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79d95824-c8fd-4179-8f31-3a922c4b356d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.314138 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb6258b-6691-4a25-b22c-ad2a5a03b167-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.314149 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb6258b-6691-4a25-b22c-ad2a5a03b167-config-data\") on node \"crc\" DevicePath \"\"" Oct 
12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.314930 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79d95824-c8fd-4179-8f31-3a922c4b356d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "79d95824-c8fd-4179-8f31-3a922c4b356d" (UID: "79d95824-c8fd-4179-8f31-3a922c4b356d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.327450 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79d95824-c8fd-4179-8f31-3a922c4b356d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "79d95824-c8fd-4179-8f31-3a922c4b356d" (UID: "79d95824-c8fd-4179-8f31-3a922c4b356d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.405291 4930 generic.go:334] "Generic (PLEG): container finished" podID="79d95824-c8fd-4179-8f31-3a922c4b356d" containerID="f6dcadf865412a7b36211d6460df037460a89656868813ddc1b10e1d24c23320" exitCode=0 Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.405358 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79d95824-c8fd-4179-8f31-3a922c4b356d","Type":"ContainerDied","Data":"f6dcadf865412a7b36211d6460df037460a89656868813ddc1b10e1d24c23320"} Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.405364 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.405386 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79d95824-c8fd-4179-8f31-3a922c4b356d","Type":"ContainerDied","Data":"5db1540c50d2ad49fd7aacc0b623e3ac217d1e99c2fc7e3a4319665cb8fb4148"} Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.405403 4930 scope.go:117] "RemoveContainer" containerID="f6dcadf865412a7b36211d6460df037460a89656868813ddc1b10e1d24c23320" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.416870 4930 generic.go:334] "Generic (PLEG): container finished" podID="0fb6258b-6691-4a25-b22c-ad2a5a03b167" containerID="c829134520a8f961827096f58e320507bb1e104b0d830f6775dc1b17997cb132" exitCode=0 Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.417193 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0fb6258b-6691-4a25-b22c-ad2a5a03b167","Type":"ContainerDied","Data":"c829134520a8f961827096f58e320507bb1e104b0d830f6775dc1b17997cb132"} Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.417221 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0fb6258b-6691-4a25-b22c-ad2a5a03b167","Type":"ContainerDied","Data":"cd56a96277fc6b37242b544ce8113391e28dfec41729f651da8e44b68c4e774f"} Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.417217 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.418446 4930 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79d95824-c8fd-4179-8f31-3a922c4b356d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.418489 4930 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79d95824-c8fd-4179-8f31-3a922c4b356d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.442957 4930 scope.go:117] "RemoveContainer" containerID="9e9961dac682168f9f9e593d89620460490258b98012e40a3bfcb1e5e1962936" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.488321 4930 scope.go:117] "RemoveContainer" containerID="f6dcadf865412a7b36211d6460df037460a89656868813ddc1b10e1d24c23320" Oct 12 06:02:23 crc kubenswrapper[4930]: E1012 06:02:23.488816 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6dcadf865412a7b36211d6460df037460a89656868813ddc1b10e1d24c23320\": container with ID starting with f6dcadf865412a7b36211d6460df037460a89656868813ddc1b10e1d24c23320 not found: ID does not exist" containerID="f6dcadf865412a7b36211d6460df037460a89656868813ddc1b10e1d24c23320" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.488875 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6dcadf865412a7b36211d6460df037460a89656868813ddc1b10e1d24c23320"} err="failed to get container status \"f6dcadf865412a7b36211d6460df037460a89656868813ddc1b10e1d24c23320\": rpc error: code = NotFound desc = could not find container \"f6dcadf865412a7b36211d6460df037460a89656868813ddc1b10e1d24c23320\": container with ID starting with f6dcadf865412a7b36211d6460df037460a89656868813ddc1b10e1d24c23320 not found: ID does not exist" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.488906 4930 scope.go:117] "RemoveContainer" containerID="9e9961dac682168f9f9e593d89620460490258b98012e40a3bfcb1e5e1962936" Oct 12 06:02:23 crc kubenswrapper[4930]: E1012 06:02:23.489180 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e9961dac682168f9f9e593d89620460490258b98012e40a3bfcb1e5e1962936\": container with ID starting with 9e9961dac682168f9f9e593d89620460490258b98012e40a3bfcb1e5e1962936 not found: ID does not exist" containerID="9e9961dac682168f9f9e593d89620460490258b98012e40a3bfcb1e5e1962936" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.489210 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e9961dac682168f9f9e593d89620460490258b98012e40a3bfcb1e5e1962936"} err="failed to get container status \"9e9961dac682168f9f9e593d89620460490258b98012e40a3bfcb1e5e1962936\": rpc error: code = NotFound desc = could not find container \"9e9961dac682168f9f9e593d89620460490258b98012e40a3bfcb1e5e1962936\": container with ID starting with 9e9961dac682168f9f9e593d89620460490258b98012e40a3bfcb1e5e1962936 not found: ID does not exist" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.489222 4930 scope.go:117] "RemoveContainer" containerID="c829134520a8f961827096f58e320507bb1e104b0d830f6775dc1b17997cb132" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.495955 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-api-0"] Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.527942 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.536069 4930 scope.go:117] "RemoveContainer" containerID="3f46f37403747a9f51f09e17860efbd5c1d75c8431788d8e2976f9ca29614b40" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.538035 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 12 06:02:23 crc kubenswrapper[4930]: E1012 06:02:23.538479 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fb6258b-6691-4a25-b22c-ad2a5a03b167" containerName="nova-metadata-metadata" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.538494 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fb6258b-6691-4a25-b22c-ad2a5a03b167" containerName="nova-metadata-metadata" Oct 12 06:02:23 crc kubenswrapper[4930]: E1012 06:02:23.538517 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79d95824-c8fd-4179-8f31-3a922c4b356d" containerName="nova-api-log" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.538523 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="79d95824-c8fd-4179-8f31-3a922c4b356d" containerName="nova-api-log" Oct 12 06:02:23 crc kubenswrapper[4930]: E1012 06:02:23.538540 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2f80483-5b42-4fa4-8013-f2a8fd535c9e" containerName="nova-manage" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.538546 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f80483-5b42-4fa4-8013-f2a8fd535c9e" containerName="nova-manage" Oct 12 06:02:23 crc kubenswrapper[4930]: E1012 06:02:23.538561 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fb6258b-6691-4a25-b22c-ad2a5a03b167" containerName="nova-metadata-log" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.538567 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fb6258b-6691-4a25-b22c-ad2a5a03b167" containerName="nova-metadata-log" Oct 12 06:02:23 crc kubenswrapper[4930]: E1012 06:02:23.538586 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79d95824-c8fd-4179-8f31-3a922c4b356d" containerName="nova-api-api" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.538592 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="79d95824-c8fd-4179-8f31-3a922c4b356d" containerName="nova-api-api" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.538802 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fb6258b-6691-4a25-b22c-ad2a5a03b167" containerName="nova-metadata-log" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.538821 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2f80483-5b42-4fa4-8013-f2a8fd535c9e" containerName="nova-manage" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.538834 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="79d95824-c8fd-4179-8f31-3a922c4b356d" containerName="nova-api-api" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.538843 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fb6258b-6691-4a25-b22c-ad2a5a03b167" containerName="nova-metadata-metadata" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.538862 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="79d95824-c8fd-4179-8f31-3a922c4b356d" containerName="nova-api-log" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 
06:02:23.540206 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.556200 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.556404 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.557039 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.557904 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.577190 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.584947 4930 scope.go:117] "RemoveContainer" containerID="c829134520a8f961827096f58e320507bb1e104b0d830f6775dc1b17997cb132" Oct 12 06:02:23 crc kubenswrapper[4930]: E1012 06:02:23.585285 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c829134520a8f961827096f58e320507bb1e104b0d830f6775dc1b17997cb132\": container with ID starting with c829134520a8f961827096f58e320507bb1e104b0d830f6775dc1b17997cb132 not found: ID does not exist" containerID="c829134520a8f961827096f58e320507bb1e104b0d830f6775dc1b17997cb132" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.585334 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c829134520a8f961827096f58e320507bb1e104b0d830f6775dc1b17997cb132"} err="failed to get container status \"c829134520a8f961827096f58e320507bb1e104b0d830f6775dc1b17997cb132\": rpc error: code = NotFound desc = could not find container \"c829134520a8f961827096f58e320507bb1e104b0d830f6775dc1b17997cb132\": container with ID starting with c829134520a8f961827096f58e320507bb1e104b0d830f6775dc1b17997cb132 not found: ID does not exist" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.585363 4930 scope.go:117] "RemoveContainer" containerID="3f46f37403747a9f51f09e17860efbd5c1d75c8431788d8e2976f9ca29614b40" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.585452 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 12 06:02:23 crc kubenswrapper[4930]: E1012 06:02:23.585931 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f46f37403747a9f51f09e17860efbd5c1d75c8431788d8e2976f9ca29614b40\": container with ID starting with 3f46f37403747a9f51f09e17860efbd5c1d75c8431788d8e2976f9ca29614b40 not found: ID does not exist" containerID="3f46f37403747a9f51f09e17860efbd5c1d75c8431788d8e2976f9ca29614b40" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.585978 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f46f37403747a9f51f09e17860efbd5c1d75c8431788d8e2976f9ca29614b40"} err="failed to get container status \"3f46f37403747a9f51f09e17860efbd5c1d75c8431788d8e2976f9ca29614b40\": rpc error: code = NotFound desc = could not find container \"3f46f37403747a9f51f09e17860efbd5c1d75c8431788d8e2976f9ca29614b40\": container with ID starting with 3f46f37403747a9f51f09e17860efbd5c1d75c8431788d8e2976f9ca29614b40 not found: ID does not exist" Oct 12 06:02:23 
crc kubenswrapper[4930]: I1012 06:02:23.593587 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.595354 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.597344 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.597416 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.601836 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.624641 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5709234d-7700-462e-9fa6-7e4f09bd0d91-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5709234d-7700-462e-9fa6-7e4f09bd0d91\") " pod="openstack/nova-api-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.624712 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5709234d-7700-462e-9fa6-7e4f09bd0d91-logs\") pod \"nova-api-0\" (UID: \"5709234d-7700-462e-9fa6-7e4f09bd0d91\") " pod="openstack/nova-api-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.624782 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139-config-data\") pod \"nova-metadata-0\" (UID: \"0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139\") " pod="openstack/nova-metadata-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.624809 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbf4r\" (UniqueName: \"kubernetes.io/projected/0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139-kube-api-access-dbf4r\") pod \"nova-metadata-0\" (UID: \"0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139\") " pod="openstack/nova-metadata-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.624842 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5709234d-7700-462e-9fa6-7e4f09bd0d91-public-tls-certs\") pod \"nova-api-0\" (UID: \"5709234d-7700-462e-9fa6-7e4f09bd0d91\") " pod="openstack/nova-api-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.624862 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5709234d-7700-462e-9fa6-7e4f09bd0d91-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5709234d-7700-462e-9fa6-7e4f09bd0d91\") " pod="openstack/nova-api-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.624924 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5709234d-7700-462e-9fa6-7e4f09bd0d91-config-data\") pod \"nova-api-0\" (UID: \"5709234d-7700-462e-9fa6-7e4f09bd0d91\") " pod="openstack/nova-api-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.624944 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139\") " pod="openstack/nova-metadata-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.625028 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139-logs\") pod \"nova-metadata-0\" (UID: \"0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139\") " pod="openstack/nova-metadata-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.625088 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139\") " pod="openstack/nova-metadata-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.625193 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq5xx\" (UniqueName: \"kubernetes.io/projected/5709234d-7700-462e-9fa6-7e4f09bd0d91-kube-api-access-hq5xx\") pod \"nova-api-0\" (UID: \"5709234d-7700-462e-9fa6-7e4f09bd0d91\") " pod="openstack/nova-api-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.726961 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139\") " pod="openstack/nova-metadata-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.727035 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq5xx\" (UniqueName: \"kubernetes.io/projected/5709234d-7700-462e-9fa6-7e4f09bd0d91-kube-api-access-hq5xx\") pod \"nova-api-0\" (UID: \"5709234d-7700-462e-9fa6-7e4f09bd0d91\") " pod="openstack/nova-api-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.727100 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5709234d-7700-462e-9fa6-7e4f09bd0d91-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5709234d-7700-462e-9fa6-7e4f09bd0d91\") " pod="openstack/nova-api-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.727129 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5709234d-7700-462e-9fa6-7e4f09bd0d91-logs\") pod \"nova-api-0\" (UID: \"5709234d-7700-462e-9fa6-7e4f09bd0d91\") " pod="openstack/nova-api-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.727160 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139-config-data\") pod \"nova-metadata-0\" (UID: \"0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139\") " pod="openstack/nova-metadata-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.727188 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbf4r\" (UniqueName: \"kubernetes.io/projected/0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139-kube-api-access-dbf4r\") pod \"nova-metadata-0\" (UID: \"0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139\") " pod="openstack/nova-metadata-0" Oct 12 
06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.727212 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5709234d-7700-462e-9fa6-7e4f09bd0d91-public-tls-certs\") pod \"nova-api-0\" (UID: \"5709234d-7700-462e-9fa6-7e4f09bd0d91\") " pod="openstack/nova-api-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.727240 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5709234d-7700-462e-9fa6-7e4f09bd0d91-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5709234d-7700-462e-9fa6-7e4f09bd0d91\") " pod="openstack/nova-api-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.727645 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5709234d-7700-462e-9fa6-7e4f09bd0d91-config-data\") pod \"nova-api-0\" (UID: \"5709234d-7700-462e-9fa6-7e4f09bd0d91\") " pod="openstack/nova-api-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.727710 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139\") " pod="openstack/nova-metadata-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.727803 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5709234d-7700-462e-9fa6-7e4f09bd0d91-logs\") pod \"nova-api-0\" (UID: \"5709234d-7700-462e-9fa6-7e4f09bd0d91\") " pod="openstack/nova-api-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.729217 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139-logs\") pod \"nova-metadata-0\" (UID: \"0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139\") " pod="openstack/nova-metadata-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.729234 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139-logs\") pod \"nova-metadata-0\" (UID: \"0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139\") " pod="openstack/nova-metadata-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.732732 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139\") " pod="openstack/nova-metadata-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.733099 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5709234d-7700-462e-9fa6-7e4f09bd0d91-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5709234d-7700-462e-9fa6-7e4f09bd0d91\") " pod="openstack/nova-api-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.733159 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139\") " pod="openstack/nova-metadata-0" Oct 12 06:02:23 crc kubenswrapper[4930]: 
I1012 06:02:23.733631 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139-config-data\") pod \"nova-metadata-0\" (UID: \"0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139\") " pod="openstack/nova-metadata-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.734145 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5709234d-7700-462e-9fa6-7e4f09bd0d91-config-data\") pod \"nova-api-0\" (UID: \"5709234d-7700-462e-9fa6-7e4f09bd0d91\") " pod="openstack/nova-api-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.734441 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5709234d-7700-462e-9fa6-7e4f09bd0d91-public-tls-certs\") pod \"nova-api-0\" (UID: \"5709234d-7700-462e-9fa6-7e4f09bd0d91\") " pod="openstack/nova-api-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.740334 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5709234d-7700-462e-9fa6-7e4f09bd0d91-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5709234d-7700-462e-9fa6-7e4f09bd0d91\") " pod="openstack/nova-api-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.744650 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbf4r\" (UniqueName: \"kubernetes.io/projected/0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139-kube-api-access-dbf4r\") pod \"nova-metadata-0\" (UID: \"0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139\") " pod="openstack/nova-metadata-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.749678 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq5xx\" (UniqueName: \"kubernetes.io/projected/5709234d-7700-462e-9fa6-7e4f09bd0d91-kube-api-access-hq5xx\") pod \"nova-api-0\" (UID: \"5709234d-7700-462e-9fa6-7e4f09bd0d91\") " pod="openstack/nova-api-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.878326 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 12 06:02:23 crc kubenswrapper[4930]: I1012 06:02:23.919261 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 12 06:02:24 crc kubenswrapper[4930]: I1012 06:02:24.161908 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fb6258b-6691-4a25-b22c-ad2a5a03b167" path="/var/lib/kubelet/pods/0fb6258b-6691-4a25-b22c-ad2a5a03b167/volumes" Oct 12 06:02:24 crc kubenswrapper[4930]: I1012 06:02:24.163601 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79d95824-c8fd-4179-8f31-3a922c4b356d" path="/var/lib/kubelet/pods/79d95824-c8fd-4179-8f31-3a922c4b356d/volumes" Oct 12 06:02:24 crc kubenswrapper[4930]: I1012 06:02:24.386987 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 12 06:02:24 crc kubenswrapper[4930]: I1012 06:02:24.432278 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5709234d-7700-462e-9fa6-7e4f09bd0d91","Type":"ContainerStarted","Data":"36b547a9ceedc76329a93f64fe2ed79af9def30eaffcb6b35318c71c2311df1b"} Oct 12 06:02:24 crc kubenswrapper[4930]: I1012 06:02:24.436029 4930 generic.go:334] "Generic (PLEG): container finished" podID="7a446391-253a-4715-bdf8-709b778e76eb" containerID="d42d56e1d7e3a0415895fd9f006c566a38f260bb3157b826f2b9b73cadc99a04" exitCode=0 Oct 12 06:02:24 crc kubenswrapper[4930]: I1012 06:02:24.436106 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7a446391-253a-4715-bdf8-709b778e76eb","Type":"ContainerDied","Data":"d42d56e1d7e3a0415895fd9f006c566a38f260bb3157b826f2b9b73cadc99a04"} Oct 12 06:02:24 crc kubenswrapper[4930]: I1012 06:02:24.488274 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 12 06:02:24 crc kubenswrapper[4930]: I1012 06:02:24.623044 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 06:02:24 crc kubenswrapper[4930]: I1012 06:02:24.757062 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a446391-253a-4715-bdf8-709b778e76eb-config-data\") pod \"7a446391-253a-4715-bdf8-709b778e76eb\" (UID: \"7a446391-253a-4715-bdf8-709b778e76eb\") " Oct 12 06:02:24 crc kubenswrapper[4930]: I1012 06:02:24.757580 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a446391-253a-4715-bdf8-709b778e76eb-combined-ca-bundle\") pod \"7a446391-253a-4715-bdf8-709b778e76eb\" (UID: \"7a446391-253a-4715-bdf8-709b778e76eb\") " Oct 12 06:02:24 crc kubenswrapper[4930]: I1012 06:02:24.757901 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsmfc\" (UniqueName: \"kubernetes.io/projected/7a446391-253a-4715-bdf8-709b778e76eb-kube-api-access-jsmfc\") pod \"7a446391-253a-4715-bdf8-709b778e76eb\" (UID: \"7a446391-253a-4715-bdf8-709b778e76eb\") " Oct 12 06:02:24 crc kubenswrapper[4930]: I1012 06:02:24.761595 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a446391-253a-4715-bdf8-709b778e76eb-kube-api-access-jsmfc" (OuterVolumeSpecName: "kube-api-access-jsmfc") pod "7a446391-253a-4715-bdf8-709b778e76eb" (UID: "7a446391-253a-4715-bdf8-709b778e76eb"). InnerVolumeSpecName "kube-api-access-jsmfc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:02:24 crc kubenswrapper[4930]: I1012 06:02:24.810386 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a446391-253a-4715-bdf8-709b778e76eb-config-data" (OuterVolumeSpecName: "config-data") pod "7a446391-253a-4715-bdf8-709b778e76eb" (UID: "7a446391-253a-4715-bdf8-709b778e76eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:02:24 crc kubenswrapper[4930]: I1012 06:02:24.812920 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a446391-253a-4715-bdf8-709b778e76eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a446391-253a-4715-bdf8-709b778e76eb" (UID: "7a446391-253a-4715-bdf8-709b778e76eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:02:24 crc kubenswrapper[4930]: I1012 06:02:24.859905 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a446391-253a-4715-bdf8-709b778e76eb-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:24 crc kubenswrapper[4930]: I1012 06:02:24.859946 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a446391-253a-4715-bdf8-709b778e76eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:24 crc kubenswrapper[4930]: I1012 06:02:24.859962 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsmfc\" (UniqueName: \"kubernetes.io/projected/7a446391-253a-4715-bdf8-709b778e76eb-kube-api-access-jsmfc\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.453528 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5709234d-7700-462e-9fa6-7e4f09bd0d91","Type":"ContainerStarted","Data":"7d6a59b3cac3310610d78e562e72a13f6715b8501778371f118422661aaf3abf"} Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.453835 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5709234d-7700-462e-9fa6-7e4f09bd0d91","Type":"ContainerStarted","Data":"7805f55f5111dfc766b3615f297d94c7d07b131737d5f52dea5391baac08d1e8"} Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.457414 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139","Type":"ContainerStarted","Data":"dba8b83b5d9fa02c97f24770b8978b2a30e55d735f7560f924934bc62d1f58f5"} Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.457459 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139","Type":"ContainerStarted","Data":"ceb4cf31964d98e88258f74b8dac3458fcea3bd0630176199fb75bc7ada98862"} Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.457473 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139","Type":"ContainerStarted","Data":"1fce3bb58dd523986711cbb13b676d681ccfe78507f63eb3a3e1567b890b8e98"} Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.459975 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7a446391-253a-4715-bdf8-709b778e76eb","Type":"ContainerDied","Data":"5e636ee956dcad70db6db0954ab08f45ba462bfce8aeecbe70164c482f57c25f"} Oct 12 
06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.460025 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.460104 4930 scope.go:117] "RemoveContainer" containerID="d42d56e1d7e3a0415895fd9f006c566a38f260bb3157b826f2b9b73cadc99a04" Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.484402 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.484383074 podStartE2EDuration="2.484383074s" podCreationTimestamp="2025-10-12 06:02:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:02:25.47454566 +0000 UTC m=+1278.016647435" watchObservedRunningTime="2025-10-12 06:02:25.484383074 +0000 UTC m=+1278.026484839" Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.530201 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.530170694 podStartE2EDuration="2.530170694s" podCreationTimestamp="2025-10-12 06:02:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:02:25.513183701 +0000 UTC m=+1278.055285476" watchObservedRunningTime="2025-10-12 06:02:25.530170694 +0000 UTC m=+1278.072272489" Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.556723 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.567624 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.578023 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 06:02:25 crc kubenswrapper[4930]: E1012 06:02:25.578591 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a446391-253a-4715-bdf8-709b778e76eb" containerName="nova-scheduler-scheduler" Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.578616 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a446391-253a-4715-bdf8-709b778e76eb" containerName="nova-scheduler-scheduler" Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.578990 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a446391-253a-4715-bdf8-709b778e76eb" containerName="nova-scheduler-scheduler" Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.579926 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.584913 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.585760 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.680451 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58a1e3ca-ad7a-49ac-8129-33ba2953d881-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"58a1e3ca-ad7a-49ac-8129-33ba2953d881\") " pod="openstack/nova-scheduler-0" Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.680695 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58a1e3ca-ad7a-49ac-8129-33ba2953d881-config-data\") pod \"nova-scheduler-0\" (UID: \"58a1e3ca-ad7a-49ac-8129-33ba2953d881\") " pod="openstack/nova-scheduler-0" Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.680783 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvkkw\" (UniqueName: \"kubernetes.io/projected/58a1e3ca-ad7a-49ac-8129-33ba2953d881-kube-api-access-fvkkw\") pod \"nova-scheduler-0\" (UID: \"58a1e3ca-ad7a-49ac-8129-33ba2953d881\") " pod="openstack/nova-scheduler-0" Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.783701 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58a1e3ca-ad7a-49ac-8129-33ba2953d881-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"58a1e3ca-ad7a-49ac-8129-33ba2953d881\") " pod="openstack/nova-scheduler-0" Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.783804 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58a1e3ca-ad7a-49ac-8129-33ba2953d881-config-data\") pod \"nova-scheduler-0\" (UID: \"58a1e3ca-ad7a-49ac-8129-33ba2953d881\") " pod="openstack/nova-scheduler-0" Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.783827 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvkkw\" (UniqueName: \"kubernetes.io/projected/58a1e3ca-ad7a-49ac-8129-33ba2953d881-kube-api-access-fvkkw\") pod \"nova-scheduler-0\" (UID: \"58a1e3ca-ad7a-49ac-8129-33ba2953d881\") " pod="openstack/nova-scheduler-0" Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.790437 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58a1e3ca-ad7a-49ac-8129-33ba2953d881-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"58a1e3ca-ad7a-49ac-8129-33ba2953d881\") " pod="openstack/nova-scheduler-0" Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.791631 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58a1e3ca-ad7a-49ac-8129-33ba2953d881-config-data\") pod \"nova-scheduler-0\" (UID: \"58a1e3ca-ad7a-49ac-8129-33ba2953d881\") " pod="openstack/nova-scheduler-0" Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.812947 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvkkw\" (UniqueName: 
\"kubernetes.io/projected/58a1e3ca-ad7a-49ac-8129-33ba2953d881-kube-api-access-fvkkw\") pod \"nova-scheduler-0\" (UID: \"58a1e3ca-ad7a-49ac-8129-33ba2953d881\") " pod="openstack/nova-scheduler-0" Oct 12 06:02:25 crc kubenswrapper[4930]: I1012 06:02:25.911253 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 12 06:02:26 crc kubenswrapper[4930]: I1012 06:02:26.155030 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a446391-253a-4715-bdf8-709b778e76eb" path="/var/lib/kubelet/pods/7a446391-253a-4715-bdf8-709b778e76eb/volumes" Oct 12 06:02:26 crc kubenswrapper[4930]: W1012 06:02:26.520892 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58a1e3ca_ad7a_49ac_8129_33ba2953d881.slice/crio-c6f2a2d961336724f58afd813f3858bbdc53853a4756db9ebb688e56f5fbcab1 WatchSource:0}: Error finding container c6f2a2d961336724f58afd813f3858bbdc53853a4756db9ebb688e56f5fbcab1: Status 404 returned error can't find the container with id c6f2a2d961336724f58afd813f3858bbdc53853a4756db9ebb688e56f5fbcab1 Oct 12 06:02:26 crc kubenswrapper[4930]: I1012 06:02:26.522936 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 12 06:02:27 crc kubenswrapper[4930]: I1012 06:02:27.492305 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"58a1e3ca-ad7a-49ac-8129-33ba2953d881","Type":"ContainerStarted","Data":"82dae08b7a7d10c1e19edfde255e42e78a70863da410b3bf9544e9d7249670f4"} Oct 12 06:02:27 crc kubenswrapper[4930]: I1012 06:02:27.492657 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"58a1e3ca-ad7a-49ac-8129-33ba2953d881","Type":"ContainerStarted","Data":"c6f2a2d961336724f58afd813f3858bbdc53853a4756db9ebb688e56f5fbcab1"} Oct 12 06:02:27 crc kubenswrapper[4930]: I1012 06:02:27.536410 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.536380237 podStartE2EDuration="2.536380237s" podCreationTimestamp="2025-10-12 06:02:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:02:27.513337484 +0000 UTC m=+1280.055439289" watchObservedRunningTime="2025-10-12 06:02:27.536380237 +0000 UTC m=+1280.078482042" Oct 12 06:02:28 crc kubenswrapper[4930]: I1012 06:02:28.920414 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 12 06:02:28 crc kubenswrapper[4930]: I1012 06:02:28.920702 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 12 06:02:30 crc kubenswrapper[4930]: I1012 06:02:30.911334 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 12 06:02:33 crc kubenswrapper[4930]: I1012 06:02:33.878893 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 12 06:02:33 crc kubenswrapper[4930]: I1012 06:02:33.879485 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 12 06:02:33 crc kubenswrapper[4930]: I1012 06:02:33.920939 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 12 06:02:33 crc kubenswrapper[4930]: I1012 06:02:33.921008 4930 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 12 06:02:34 crc kubenswrapper[4930]: I1012 06:02:34.893985 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5709234d-7700-462e-9fa6-7e4f09bd0d91" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.227:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 12 06:02:34 crc kubenswrapper[4930]: I1012 06:02:34.894007 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5709234d-7700-462e-9fa6-7e4f09bd0d91" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.227:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 12 06:02:34 crc kubenswrapper[4930]: I1012 06:02:34.937242 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.228:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 12 06:02:34 crc kubenswrapper[4930]: I1012 06:02:34.937226 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.228:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 12 06:02:35 crc kubenswrapper[4930]: I1012 06:02:35.912299 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 12 06:02:35 crc kubenswrapper[4930]: I1012 06:02:35.950413 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 12 06:02:36 crc kubenswrapper[4930]: I1012 06:02:36.659101 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 12 06:02:40 crc kubenswrapper[4930]: I1012 06:02:40.896644 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 12 06:02:43 crc kubenswrapper[4930]: I1012 06:02:43.915064 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 12 06:02:43 crc kubenswrapper[4930]: I1012 06:02:43.915873 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 12 06:02:43 crc kubenswrapper[4930]: I1012 06:02:43.927229 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 12 06:02:43 crc kubenswrapper[4930]: I1012 06:02:43.929944 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 12 06:02:43 crc kubenswrapper[4930]: I1012 06:02:43.930614 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 12 06:02:43 crc kubenswrapper[4930]: I1012 06:02:43.932848 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 12 06:02:43 crc kubenswrapper[4930]: I1012 06:02:43.969751 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 12 06:02:44 crc kubenswrapper[4930]: I1012 06:02:44.725649 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Oct 12 06:02:44 crc kubenswrapper[4930]: I1012 06:02:44.732604 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 12 06:02:44 crc kubenswrapper[4930]: I1012 06:02:44.739034 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 12 06:02:52 crc kubenswrapper[4930]: I1012 06:02:52.873429 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 06:02:54 crc kubenswrapper[4930]: I1012 06:02:54.493779 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 06:02:56 crc kubenswrapper[4930]: I1012 06:02:56.341512 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="bad3587e-d515-4add-9edd-da341fe519b7" containerName="rabbitmq" containerID="cri-o://62f020028b6b17a42630094b79d284a4e9e7103dc1dc7e65f9313baef3502d4e" gracePeriod=604797 Oct 12 06:02:57 crc kubenswrapper[4930]: I1012 06:02:57.822424 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="fe886795-2501-4474-bfbc-9febcc5113f3" containerName="rabbitmq" containerID="cri-o://16242491c9076429f59d7ec96b240e6798e421997f2bcf00ecca026f566f6d44" gracePeriod=604797 Oct 12 06:02:57 crc kubenswrapper[4930]: I1012 06:02:57.872081 4930 generic.go:334] "Generic (PLEG): container finished" podID="bad3587e-d515-4add-9edd-da341fe519b7" containerID="62f020028b6b17a42630094b79d284a4e9e7103dc1dc7e65f9313baef3502d4e" exitCode=0 Oct 12 06:02:57 crc kubenswrapper[4930]: I1012 06:02:57.872126 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bad3587e-d515-4add-9edd-da341fe519b7","Type":"ContainerDied","Data":"62f020028b6b17a42630094b79d284a4e9e7103dc1dc7e65f9313baef3502d4e"} Oct 12 06:02:57 crc kubenswrapper[4930]: I1012 06:02:57.872152 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bad3587e-d515-4add-9edd-da341fe519b7","Type":"ContainerDied","Data":"8fedf8e7d958f38b23fceca5dd59d3b7c1d5d56ff5df608c2df7deae4e7103ec"} Oct 12 06:02:57 crc kubenswrapper[4930]: I1012 06:02:57.872165 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fedf8e7d958f38b23fceca5dd59d3b7c1d5d56ff5df608c2df7deae4e7103ec" Oct 12 06:02:57 crc kubenswrapper[4930]: I1012 06:02:57.919926 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.096382 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bad3587e-d515-4add-9edd-da341fe519b7-rabbitmq-tls\") pod \"bad3587e-d515-4add-9edd-da341fe519b7\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.096426 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bad3587e-d515-4add-9edd-da341fe519b7-rabbitmq-confd\") pod \"bad3587e-d515-4add-9edd-da341fe519b7\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.096505 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bad3587e-d515-4add-9edd-da341fe519b7-server-conf\") pod \"bad3587e-d515-4add-9edd-da341fe519b7\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.096540 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bad3587e-d515-4add-9edd-da341fe519b7-rabbitmq-erlang-cookie\") pod \"bad3587e-d515-4add-9edd-da341fe519b7\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.096578 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"bad3587e-d515-4add-9edd-da341fe519b7\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.096614 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bad3587e-d515-4add-9edd-da341fe519b7-plugins-conf\") pod \"bad3587e-d515-4add-9edd-da341fe519b7\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.096673 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bad3587e-d515-4add-9edd-da341fe519b7-rabbitmq-plugins\") pod \"bad3587e-d515-4add-9edd-da341fe519b7\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.096713 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bad3587e-d515-4add-9edd-da341fe519b7-pod-info\") pod \"bad3587e-d515-4add-9edd-da341fe519b7\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.096836 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhrgr\" (UniqueName: \"kubernetes.io/projected/bad3587e-d515-4add-9edd-da341fe519b7-kube-api-access-jhrgr\") pod \"bad3587e-d515-4add-9edd-da341fe519b7\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.096853 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bad3587e-d515-4add-9edd-da341fe519b7-config-data\") pod \"bad3587e-d515-4add-9edd-da341fe519b7\" (UID: 
\"bad3587e-d515-4add-9edd-da341fe519b7\") " Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.096881 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bad3587e-d515-4add-9edd-da341fe519b7-erlang-cookie-secret\") pod \"bad3587e-d515-4add-9edd-da341fe519b7\" (UID: \"bad3587e-d515-4add-9edd-da341fe519b7\") " Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.104248 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bad3587e-d515-4add-9edd-da341fe519b7-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "bad3587e-d515-4add-9edd-da341fe519b7" (UID: "bad3587e-d515-4add-9edd-da341fe519b7"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.108358 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bad3587e-d515-4add-9edd-da341fe519b7-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "bad3587e-d515-4add-9edd-da341fe519b7" (UID: "bad3587e-d515-4add-9edd-da341fe519b7"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.108811 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bad3587e-d515-4add-9edd-da341fe519b7-kube-api-access-jhrgr" (OuterVolumeSpecName: "kube-api-access-jhrgr") pod "bad3587e-d515-4add-9edd-da341fe519b7" (UID: "bad3587e-d515-4add-9edd-da341fe519b7"). InnerVolumeSpecName "kube-api-access-jhrgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.108816 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/bad3587e-d515-4add-9edd-da341fe519b7-pod-info" (OuterVolumeSpecName: "pod-info") pod "bad3587e-d515-4add-9edd-da341fe519b7" (UID: "bad3587e-d515-4add-9edd-da341fe519b7"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.109954 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bad3587e-d515-4add-9edd-da341fe519b7-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "bad3587e-d515-4add-9edd-da341fe519b7" (UID: "bad3587e-d515-4add-9edd-da341fe519b7"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.110152 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bad3587e-d515-4add-9edd-da341fe519b7-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "bad3587e-d515-4add-9edd-da341fe519b7" (UID: "bad3587e-d515-4add-9edd-da341fe519b7"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.110353 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bad3587e-d515-4add-9edd-da341fe519b7-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "bad3587e-d515-4add-9edd-da341fe519b7" (UID: "bad3587e-d515-4add-9edd-da341fe519b7"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.113478 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "bad3587e-d515-4add-9edd-da341fe519b7" (UID: "bad3587e-d515-4add-9edd-da341fe519b7"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.128763 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bad3587e-d515-4add-9edd-da341fe519b7-config-data" (OuterVolumeSpecName: "config-data") pod "bad3587e-d515-4add-9edd-da341fe519b7" (UID: "bad3587e-d515-4add-9edd-da341fe519b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.192642 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bad3587e-d515-4add-9edd-da341fe519b7-server-conf" (OuterVolumeSpecName: "server-conf") pod "bad3587e-d515-4add-9edd-da341fe519b7" (UID: "bad3587e-d515-4add-9edd-da341fe519b7"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.199772 4930 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bad3587e-d515-4add-9edd-da341fe519b7-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.199805 4930 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bad3587e-d515-4add-9edd-da341fe519b7-pod-info\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.199815 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhrgr\" (UniqueName: \"kubernetes.io/projected/bad3587e-d515-4add-9edd-da341fe519b7-kube-api-access-jhrgr\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.199825 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bad3587e-d515-4add-9edd-da341fe519b7-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.199857 4930 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bad3587e-d515-4add-9edd-da341fe519b7-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.199866 4930 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bad3587e-d515-4add-9edd-da341fe519b7-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.199873 4930 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bad3587e-d515-4add-9edd-da341fe519b7-server-conf\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.199882 4930 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bad3587e-d515-4add-9edd-da341fe519b7-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 
06:02:58.199988 4930 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.200000 4930 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bad3587e-d515-4add-9edd-da341fe519b7-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.238975 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bad3587e-d515-4add-9edd-da341fe519b7-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "bad3587e-d515-4add-9edd-da341fe519b7" (UID: "bad3587e-d515-4add-9edd-da341fe519b7"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.254408 4930 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.303312 4930 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bad3587e-d515-4add-9edd-da341fe519b7-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.303337 4930 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.881809 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.921751 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.931188 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.956458 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 06:02:58 crc kubenswrapper[4930]: E1012 06:02:58.956938 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad3587e-d515-4add-9edd-da341fe519b7" containerName="setup-container" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.956956 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad3587e-d515-4add-9edd-da341fe519b7" containerName="setup-container" Oct 12 06:02:58 crc kubenswrapper[4930]: E1012 06:02:58.956980 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad3587e-d515-4add-9edd-da341fe519b7" containerName="rabbitmq" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.956987 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad3587e-d515-4add-9edd-da341fe519b7" containerName="rabbitmq" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.957157 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="bad3587e-d515-4add-9edd-da341fe519b7" containerName="rabbitmq" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.958252 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.962205 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-54fs6" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.963514 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.967808 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.967846 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.967808 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.967994 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.968200 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 12 06:02:58 crc kubenswrapper[4930]: I1012 06:02:58.995570 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.120945 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn5d6\" (UniqueName: \"kubernetes.io/projected/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-kube-api-access-wn5d6\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.120998 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.121033 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-config-data\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.121313 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.123144 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.123306 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.123341 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.123410 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.123441 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.123468 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.123597 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.234880 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.234961 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn5d6\" (UniqueName: \"kubernetes.io/projected/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-kube-api-access-wn5d6\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.235002 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.235065 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.235157 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.235199 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.235279 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.235304 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.235342 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.235368 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.235392 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.236051 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.242058 4930 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.243252 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.246486 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-config-data\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.246716 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.249951 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.250452 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.252665 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.266848 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.271066 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn5d6\" (UniqueName: \"kubernetes.io/projected/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-kube-api-access-wn5d6\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.280775 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.295702 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d\") " pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.327990 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.384354 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.550104 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fe886795-2501-4474-bfbc-9febcc5113f3-rabbitmq-confd\") pod \"fe886795-2501-4474-bfbc-9febcc5113f3\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.550474 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fe886795-2501-4474-bfbc-9febcc5113f3-server-conf\") pod \"fe886795-2501-4474-bfbc-9febcc5113f3\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.550505 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzflr\" (UniqueName: \"kubernetes.io/projected/fe886795-2501-4474-bfbc-9febcc5113f3-kube-api-access-zzflr\") pod \"fe886795-2501-4474-bfbc-9febcc5113f3\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.550565 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fe886795-2501-4474-bfbc-9febcc5113f3-rabbitmq-erlang-cookie\") pod \"fe886795-2501-4474-bfbc-9febcc5113f3\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.550614 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe886795-2501-4474-bfbc-9febcc5113f3-config-data\") pod \"fe886795-2501-4474-bfbc-9febcc5113f3\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.550695 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"fe886795-2501-4474-bfbc-9febcc5113f3\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.550810 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fe886795-2501-4474-bfbc-9febcc5113f3-plugins-conf\") pod \"fe886795-2501-4474-bfbc-9febcc5113f3\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.550874 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fe886795-2501-4474-bfbc-9febcc5113f3-erlang-cookie-secret\") pod \"fe886795-2501-4474-bfbc-9febcc5113f3\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.550955 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fe886795-2501-4474-bfbc-9febcc5113f3-rabbitmq-plugins\") pod \"fe886795-2501-4474-bfbc-9febcc5113f3\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.550982 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fe886795-2501-4474-bfbc-9febcc5113f3-pod-info\") pod \"fe886795-2501-4474-bfbc-9febcc5113f3\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.551022 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fe886795-2501-4474-bfbc-9febcc5113f3-rabbitmq-tls\") pod \"fe886795-2501-4474-bfbc-9febcc5113f3\" (UID: \"fe886795-2501-4474-bfbc-9febcc5113f3\") " Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.555815 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe886795-2501-4474-bfbc-9febcc5113f3-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "fe886795-2501-4474-bfbc-9febcc5113f3" (UID: "fe886795-2501-4474-bfbc-9febcc5113f3"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.556869 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "fe886795-2501-4474-bfbc-9febcc5113f3" (UID: "fe886795-2501-4474-bfbc-9febcc5113f3"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.556928 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe886795-2501-4474-bfbc-9febcc5113f3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "fe886795-2501-4474-bfbc-9febcc5113f3" (UID: "fe886795-2501-4474-bfbc-9febcc5113f3"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.556962 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe886795-2501-4474-bfbc-9febcc5113f3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "fe886795-2501-4474-bfbc-9febcc5113f3" (UID: "fe886795-2501-4474-bfbc-9febcc5113f3"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.558651 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe886795-2501-4474-bfbc-9febcc5113f3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "fe886795-2501-4474-bfbc-9febcc5113f3" (UID: "fe886795-2501-4474-bfbc-9febcc5113f3"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.561294 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe886795-2501-4474-bfbc-9febcc5113f3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "fe886795-2501-4474-bfbc-9febcc5113f3" (UID: "fe886795-2501-4474-bfbc-9febcc5113f3"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.578833 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/fe886795-2501-4474-bfbc-9febcc5113f3-pod-info" (OuterVolumeSpecName: "pod-info") pod "fe886795-2501-4474-bfbc-9febcc5113f3" (UID: "fe886795-2501-4474-bfbc-9febcc5113f3"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.589500 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe886795-2501-4474-bfbc-9febcc5113f3-kube-api-access-zzflr" (OuterVolumeSpecName: "kube-api-access-zzflr") pod "fe886795-2501-4474-bfbc-9febcc5113f3" (UID: "fe886795-2501-4474-bfbc-9febcc5113f3"). InnerVolumeSpecName "kube-api-access-zzflr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.598347 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe886795-2501-4474-bfbc-9febcc5113f3-config-data" (OuterVolumeSpecName: "config-data") pod "fe886795-2501-4474-bfbc-9febcc5113f3" (UID: "fe886795-2501-4474-bfbc-9febcc5113f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.634878 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe886795-2501-4474-bfbc-9febcc5113f3-server-conf" (OuterVolumeSpecName: "server-conf") pod "fe886795-2501-4474-bfbc-9febcc5113f3" (UID: "fe886795-2501-4474-bfbc-9febcc5113f3"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.653151 4930 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fe886795-2501-4474-bfbc-9febcc5113f3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.653179 4930 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fe886795-2501-4474-bfbc-9febcc5113f3-pod-info\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.653188 4930 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fe886795-2501-4474-bfbc-9febcc5113f3-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.653198 4930 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fe886795-2501-4474-bfbc-9febcc5113f3-server-conf\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.653209 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzflr\" (UniqueName: \"kubernetes.io/projected/fe886795-2501-4474-bfbc-9febcc5113f3-kube-api-access-zzflr\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.653219 4930 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fe886795-2501-4474-bfbc-9febcc5113f3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.653229 4930 reconciler_common.go:293] "Volume detached for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe886795-2501-4474-bfbc-9febcc5113f3-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.653263 4930 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.653272 4930 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fe886795-2501-4474-bfbc-9febcc5113f3-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.653281 4930 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fe886795-2501-4474-bfbc-9febcc5113f3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.678044 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe886795-2501-4474-bfbc-9febcc5113f3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "fe886795-2501-4474-bfbc-9febcc5113f3" (UID: "fe886795-2501-4474-bfbc-9febcc5113f3"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.681591 4930 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.754294 4930 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fe886795-2501-4474-bfbc-9febcc5113f3-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.754327 4930 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.862637 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.895686 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d","Type":"ContainerStarted","Data":"817258d22abe19e2cfe41192c32774667c5dbd29f8079aad9eaf0131a75c43f4"} Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.898125 4930 generic.go:334] "Generic (PLEG): container finished" podID="fe886795-2501-4474-bfbc-9febcc5113f3" containerID="16242491c9076429f59d7ec96b240e6798e421997f2bcf00ecca026f566f6d44" exitCode=0 Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.898212 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.898857 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fe886795-2501-4474-bfbc-9febcc5113f3","Type":"ContainerDied","Data":"16242491c9076429f59d7ec96b240e6798e421997f2bcf00ecca026f566f6d44"} Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.898885 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fe886795-2501-4474-bfbc-9febcc5113f3","Type":"ContainerDied","Data":"f9945bf75945e3910574098df1060de9d6a3b6bc351c9cb1ab39a8c4ef341187"} Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.898901 4930 scope.go:117] "RemoveContainer" containerID="16242491c9076429f59d7ec96b240e6798e421997f2bcf00ecca026f566f6d44" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.929775 4930 scope.go:117] "RemoveContainer" containerID="65c472105bc514ff6ba9f1922a44a2d105ac01e044a675fd9c167f35ca166f27" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.936965 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.949848 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.960074 4930 scope.go:117] "RemoveContainer" containerID="16242491c9076429f59d7ec96b240e6798e421997f2bcf00ecca026f566f6d44" Oct 12 06:02:59 crc kubenswrapper[4930]: E1012 06:02:59.963168 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16242491c9076429f59d7ec96b240e6798e421997f2bcf00ecca026f566f6d44\": container with ID starting with 16242491c9076429f59d7ec96b240e6798e421997f2bcf00ecca026f566f6d44 not found: ID does not exist" containerID="16242491c9076429f59d7ec96b240e6798e421997f2bcf00ecca026f566f6d44" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.963359 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16242491c9076429f59d7ec96b240e6798e421997f2bcf00ecca026f566f6d44"} err="failed to get container status \"16242491c9076429f59d7ec96b240e6798e421997f2bcf00ecca026f566f6d44\": rpc error: code = NotFound desc = could not find container \"16242491c9076429f59d7ec96b240e6798e421997f2bcf00ecca026f566f6d44\": container with ID starting with 16242491c9076429f59d7ec96b240e6798e421997f2bcf00ecca026f566f6d44 not found: ID does not exist" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.963476 4930 scope.go:117] "RemoveContainer" containerID="65c472105bc514ff6ba9f1922a44a2d105ac01e044a675fd9c167f35ca166f27" Oct 12 06:02:59 crc kubenswrapper[4930]: E1012 06:02:59.966880 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65c472105bc514ff6ba9f1922a44a2d105ac01e044a675fd9c167f35ca166f27\": container with ID starting with 65c472105bc514ff6ba9f1922a44a2d105ac01e044a675fd9c167f35ca166f27 not found: ID does not exist" containerID="65c472105bc514ff6ba9f1922a44a2d105ac01e044a675fd9c167f35ca166f27" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.966932 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65c472105bc514ff6ba9f1922a44a2d105ac01e044a675fd9c167f35ca166f27"} err="failed to get container status 
\"65c472105bc514ff6ba9f1922a44a2d105ac01e044a675fd9c167f35ca166f27\": rpc error: code = NotFound desc = could not find container \"65c472105bc514ff6ba9f1922a44a2d105ac01e044a675fd9c167f35ca166f27\": container with ID starting with 65c472105bc514ff6ba9f1922a44a2d105ac01e044a675fd9c167f35ca166f27 not found: ID does not exist" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.975514 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 06:02:59 crc kubenswrapper[4930]: E1012 06:02:59.978488 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe886795-2501-4474-bfbc-9febcc5113f3" containerName="setup-container" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.978511 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe886795-2501-4474-bfbc-9febcc5113f3" containerName="setup-container" Oct 12 06:02:59 crc kubenswrapper[4930]: E1012 06:02:59.978539 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe886795-2501-4474-bfbc-9febcc5113f3" containerName="rabbitmq" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.978546 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe886795-2501-4474-bfbc-9febcc5113f3" containerName="rabbitmq" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.978796 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe886795-2501-4474-bfbc-9febcc5113f3" containerName="rabbitmq" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.980760 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.984521 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.984726 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.984954 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.984812 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.985060 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-ltsgl" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.984964 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.985023 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 12 06:02:59 crc kubenswrapper[4930]: I1012 06:02:59.998502 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.150388 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bad3587e-d515-4add-9edd-da341fe519b7" path="/var/lib/kubelet/pods/bad3587e-d515-4add-9edd-da341fe519b7/volumes" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.151169 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe886795-2501-4474-bfbc-9febcc5113f3" path="/var/lib/kubelet/pods/fe886795-2501-4474-bfbc-9febcc5113f3/volumes" Oct 12 06:03:00 crc kubenswrapper[4930]: 
I1012 06:03:00.161141 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsd6j\" (UniqueName: \"kubernetes.io/projected/62c5d71d-6283-44b4-9b50-96fd50d7ad99-kube-api-access-tsd6j\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.161229 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/62c5d71d-6283-44b4-9b50-96fd50d7ad99-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.161251 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/62c5d71d-6283-44b4-9b50-96fd50d7ad99-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.161271 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.161295 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/62c5d71d-6283-44b4-9b50-96fd50d7ad99-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.161331 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/62c5d71d-6283-44b4-9b50-96fd50d7ad99-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.161348 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/62c5d71d-6283-44b4-9b50-96fd50d7ad99-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.161366 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/62c5d71d-6283-44b4-9b50-96fd50d7ad99-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.161396 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/62c5d71d-6283-44b4-9b50-96fd50d7ad99-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc 
kubenswrapper[4930]: I1012 06:03:00.161419 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/62c5d71d-6283-44b4-9b50-96fd50d7ad99-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.161446 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/62c5d71d-6283-44b4-9b50-96fd50d7ad99-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.263198 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/62c5d71d-6283-44b4-9b50-96fd50d7ad99-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.263950 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/62c5d71d-6283-44b4-9b50-96fd50d7ad99-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.264049 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/62c5d71d-6283-44b4-9b50-96fd50d7ad99-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.264120 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/62c5d71d-6283-44b4-9b50-96fd50d7ad99-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.264202 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/62c5d71d-6283-44b4-9b50-96fd50d7ad99-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.264264 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/62c5d71d-6283-44b4-9b50-96fd50d7ad99-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.264342 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/62c5d71d-6283-44b4-9b50-96fd50d7ad99-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.264417 4930 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tsd6j\" (UniqueName: \"kubernetes.io/projected/62c5d71d-6283-44b4-9b50-96fd50d7ad99-kube-api-access-tsd6j\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.264559 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/62c5d71d-6283-44b4-9b50-96fd50d7ad99-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.264632 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/62c5d71d-6283-44b4-9b50-96fd50d7ad99-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.265572 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.265316 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/62c5d71d-6283-44b4-9b50-96fd50d7ad99-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.263958 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/62c5d71d-6283-44b4-9b50-96fd50d7ad99-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.265644 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/62c5d71d-6283-44b4-9b50-96fd50d7ad99-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.265841 4930 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.265941 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/62c5d71d-6283-44b4-9b50-96fd50d7ad99-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.268272 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/62c5d71d-6283-44b4-9b50-96fd50d7ad99-rabbitmq-tls\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.269563 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/62c5d71d-6283-44b4-9b50-96fd50d7ad99-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.270467 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/62c5d71d-6283-44b4-9b50-96fd50d7ad99-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.271927 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/62c5d71d-6283-44b4-9b50-96fd50d7ad99-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.272139 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/62c5d71d-6283-44b4-9b50-96fd50d7ad99-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.286872 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsd6j\" (UniqueName: \"kubernetes.io/projected/62c5d71d-6283-44b4-9b50-96fd50d7ad99-kube-api-access-tsd6j\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.312342 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"62c5d71d-6283-44b4-9b50-96fd50d7ad99\") " pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.326936 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:00 crc kubenswrapper[4930]: W1012 06:03:00.810260 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62c5d71d_6283_44b4_9b50_96fd50d7ad99.slice/crio-9652322ff9f130e19e6380c6a0b2a2f2adf322dfc3d592d3a1184a408e497276 WatchSource:0}: Error finding container 9652322ff9f130e19e6380c6a0b2a2f2adf322dfc3d592d3a1184a408e497276: Status 404 returned error can't find the container with id 9652322ff9f130e19e6380c6a0b2a2f2adf322dfc3d592d3a1184a408e497276 Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.816755 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 12 06:03:00 crc kubenswrapper[4930]: I1012 06:03:00.940368 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"62c5d71d-6283-44b4-9b50-96fd50d7ad99","Type":"ContainerStarted","Data":"9652322ff9f130e19e6380c6a0b2a2f2adf322dfc3d592d3a1184a408e497276"} Oct 12 06:03:01 crc kubenswrapper[4930]: I1012 06:03:01.966788 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d","Type":"ContainerStarted","Data":"c7f3d8997ea83328ad1814110e6a60ae0772c7ea68dab6290a554e975c15ebd5"} Oct 12 06:03:03 crc kubenswrapper[4930]: I1012 06:03:03.993953 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"62c5d71d-6283-44b4-9b50-96fd50d7ad99","Type":"ContainerStarted","Data":"62134c443653cfd84216a5751eb0c1a27d8a75699a649b11f5e6329e422f863c"} Oct 12 06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.609727 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bf6c7df67-z88k5"] Oct 12 06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.611521 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.613687 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 12 06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.624869 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bf6c7df67-z88k5"] Oct 12 06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.730663 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-ovsdbserver-nb\") pod \"dnsmasq-dns-bf6c7df67-z88k5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.730958 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwwq7\" (UniqueName: \"kubernetes.io/projected/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-kube-api-access-fwwq7\") pod \"dnsmasq-dns-bf6c7df67-z88k5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.731225 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-ovsdbserver-sb\") pod \"dnsmasq-dns-bf6c7df67-z88k5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.731289 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-config\") pod \"dnsmasq-dns-bf6c7df67-z88k5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.731532 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-openstack-edpm-ipam\") pod \"dnsmasq-dns-bf6c7df67-z88k5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.731618 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-dns-swift-storage-0\") pod \"dnsmasq-dns-bf6c7df67-z88k5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.731843 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-dns-svc\") pod \"dnsmasq-dns-bf6c7df67-z88k5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.834000 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-openstack-edpm-ipam\") pod \"dnsmasq-dns-bf6c7df67-z88k5\" (UID: 
\"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.834052 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-dns-swift-storage-0\") pod \"dnsmasq-dns-bf6c7df67-z88k5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.834114 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-dns-svc\") pod \"dnsmasq-dns-bf6c7df67-z88k5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.834146 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-ovsdbserver-nb\") pod \"dnsmasq-dns-bf6c7df67-z88k5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.834173 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwwq7\" (UniqueName: \"kubernetes.io/projected/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-kube-api-access-fwwq7\") pod \"dnsmasq-dns-bf6c7df67-z88k5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.834239 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-ovsdbserver-sb\") pod \"dnsmasq-dns-bf6c7df67-z88k5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.834467 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-config\") pod \"dnsmasq-dns-bf6c7df67-z88k5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.835565 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-config\") pod \"dnsmasq-dns-bf6c7df67-z88k5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.835800 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-openstack-edpm-ipam\") pod \"dnsmasq-dns-bf6c7df67-z88k5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.835887 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-dns-swift-storage-0\") pod \"dnsmasq-dns-bf6c7df67-z88k5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 
06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.836146 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-ovsdbserver-nb\") pod \"dnsmasq-dns-bf6c7df67-z88k5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.836360 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-ovsdbserver-sb\") pod \"dnsmasq-dns-bf6c7df67-z88k5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.837143 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-dns-svc\") pod \"dnsmasq-dns-bf6c7df67-z88k5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.857535 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwwq7\" (UniqueName: \"kubernetes.io/projected/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-kube-api-access-fwwq7\") pod \"dnsmasq-dns-bf6c7df67-z88k5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:07 crc kubenswrapper[4930]: I1012 06:03:07.953433 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:08 crc kubenswrapper[4930]: I1012 06:03:08.435419 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bf6c7df67-z88k5"] Oct 12 06:03:08 crc kubenswrapper[4930]: W1012 06:03:08.450009 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce0650a2_2a68_4e75_9298_ac9e2fbdf4a5.slice/crio-4d4ced1a22c656c28e81a1539601a08077f2b5aa94f25e7f54c74824ae33d13c WatchSource:0}: Error finding container 4d4ced1a22c656c28e81a1539601a08077f2b5aa94f25e7f54c74824ae33d13c: Status 404 returned error can't find the container with id 4d4ced1a22c656c28e81a1539601a08077f2b5aa94f25e7f54c74824ae33d13c Oct 12 06:03:09 crc kubenswrapper[4930]: I1012 06:03:09.070108 4930 generic.go:334] "Generic (PLEG): container finished" podID="ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5" containerID="860df8425c0f4c466b85e8d18522dded51505cedc24a28ad9bb1b094a55ee7db" exitCode=0 Oct 12 06:03:09 crc kubenswrapper[4930]: I1012 06:03:09.070188 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" event={"ID":"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5","Type":"ContainerDied","Data":"860df8425c0f4c466b85e8d18522dded51505cedc24a28ad9bb1b094a55ee7db"} Oct 12 06:03:09 crc kubenswrapper[4930]: I1012 06:03:09.070372 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" event={"ID":"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5","Type":"ContainerStarted","Data":"4d4ced1a22c656c28e81a1539601a08077f2b5aa94f25e7f54c74824ae33d13c"} Oct 12 06:03:10 crc kubenswrapper[4930]: I1012 06:03:10.098138 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" 
event={"ID":"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5","Type":"ContainerStarted","Data":"735dba7e3515e750cb1db29e94ef0a5d5608f30dc1d5980a9634b9caee53d3a5"} Oct 12 06:03:10 crc kubenswrapper[4930]: I1012 06:03:10.098344 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:10 crc kubenswrapper[4930]: I1012 06:03:10.129834 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" podStartSLOduration=3.129811096 podStartE2EDuration="3.129811096s" podCreationTimestamp="2025-10-12 06:03:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:03:10.117697974 +0000 UTC m=+1322.659799749" watchObservedRunningTime="2025-10-12 06:03:10.129811096 +0000 UTC m=+1322.671912871" Oct 12 06:03:17 crc kubenswrapper[4930]: I1012 06:03:17.955000 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.069304 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54599d8f7-xv8dr"] Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.069921 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" podUID="57ae0cc6-55ee-4117-8d32-e82672b2ea46" containerName="dnsmasq-dns" containerID="cri-o://394b48bb4d59d15ea85d2840ac217d8379fd9c15434e3fb55aafe3a72687a4f8" gracePeriod=10 Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.247521 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77b58f4b85-jll5z"] Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.249197 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.265454 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77b58f4b85-jll5z"] Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.286045 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7jmw\" (UniqueName: \"kubernetes.io/projected/829039a6-ad10-4532-b406-f497e661fd8d-kube-api-access-v7jmw\") pod \"dnsmasq-dns-77b58f4b85-jll5z\" (UID: \"829039a6-ad10-4532-b406-f497e661fd8d\") " pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.286103 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/829039a6-ad10-4532-b406-f497e661fd8d-openstack-edpm-ipam\") pod \"dnsmasq-dns-77b58f4b85-jll5z\" (UID: \"829039a6-ad10-4532-b406-f497e661fd8d\") " pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.286165 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/829039a6-ad10-4532-b406-f497e661fd8d-dns-swift-storage-0\") pod \"dnsmasq-dns-77b58f4b85-jll5z\" (UID: \"829039a6-ad10-4532-b406-f497e661fd8d\") " pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.286186 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/829039a6-ad10-4532-b406-f497e661fd8d-ovsdbserver-sb\") pod \"dnsmasq-dns-77b58f4b85-jll5z\" (UID: \"829039a6-ad10-4532-b406-f497e661fd8d\") " pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.286203 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/829039a6-ad10-4532-b406-f497e661fd8d-ovsdbserver-nb\") pod \"dnsmasq-dns-77b58f4b85-jll5z\" (UID: \"829039a6-ad10-4532-b406-f497e661fd8d\") " pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.286271 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/829039a6-ad10-4532-b406-f497e661fd8d-dns-svc\") pod \"dnsmasq-dns-77b58f4b85-jll5z\" (UID: \"829039a6-ad10-4532-b406-f497e661fd8d\") " pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.286290 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/829039a6-ad10-4532-b406-f497e661fd8d-config\") pod \"dnsmasq-dns-77b58f4b85-jll5z\" (UID: \"829039a6-ad10-4532-b406-f497e661fd8d\") " pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.387976 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/829039a6-ad10-4532-b406-f497e661fd8d-dns-swift-storage-0\") pod \"dnsmasq-dns-77b58f4b85-jll5z\" (UID: \"829039a6-ad10-4532-b406-f497e661fd8d\") " pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 
06:03:18.388228 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/829039a6-ad10-4532-b406-f497e661fd8d-ovsdbserver-sb\") pod \"dnsmasq-dns-77b58f4b85-jll5z\" (UID: \"829039a6-ad10-4532-b406-f497e661fd8d\") " pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.388250 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/829039a6-ad10-4532-b406-f497e661fd8d-ovsdbserver-nb\") pod \"dnsmasq-dns-77b58f4b85-jll5z\" (UID: \"829039a6-ad10-4532-b406-f497e661fd8d\") " pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.388331 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/829039a6-ad10-4532-b406-f497e661fd8d-dns-svc\") pod \"dnsmasq-dns-77b58f4b85-jll5z\" (UID: \"829039a6-ad10-4532-b406-f497e661fd8d\") " pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.388349 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/829039a6-ad10-4532-b406-f497e661fd8d-config\") pod \"dnsmasq-dns-77b58f4b85-jll5z\" (UID: \"829039a6-ad10-4532-b406-f497e661fd8d\") " pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.388419 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7jmw\" (UniqueName: \"kubernetes.io/projected/829039a6-ad10-4532-b406-f497e661fd8d-kube-api-access-v7jmw\") pod \"dnsmasq-dns-77b58f4b85-jll5z\" (UID: \"829039a6-ad10-4532-b406-f497e661fd8d\") " pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.388457 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/829039a6-ad10-4532-b406-f497e661fd8d-openstack-edpm-ipam\") pod \"dnsmasq-dns-77b58f4b85-jll5z\" (UID: \"829039a6-ad10-4532-b406-f497e661fd8d\") " pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.389071 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/829039a6-ad10-4532-b406-f497e661fd8d-ovsdbserver-sb\") pod \"dnsmasq-dns-77b58f4b85-jll5z\" (UID: \"829039a6-ad10-4532-b406-f497e661fd8d\") " pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.389153 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/829039a6-ad10-4532-b406-f497e661fd8d-dns-swift-storage-0\") pod \"dnsmasq-dns-77b58f4b85-jll5z\" (UID: \"829039a6-ad10-4532-b406-f497e661fd8d\") " pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.389159 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/829039a6-ad10-4532-b406-f497e661fd8d-openstack-edpm-ipam\") pod \"dnsmasq-dns-77b58f4b85-jll5z\" (UID: \"829039a6-ad10-4532-b406-f497e661fd8d\") " pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.389795 4930 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/829039a6-ad10-4532-b406-f497e661fd8d-dns-svc\") pod \"dnsmasq-dns-77b58f4b85-jll5z\" (UID: \"829039a6-ad10-4532-b406-f497e661fd8d\") " pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.390160 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/829039a6-ad10-4532-b406-f497e661fd8d-ovsdbserver-nb\") pod \"dnsmasq-dns-77b58f4b85-jll5z\" (UID: \"829039a6-ad10-4532-b406-f497e661fd8d\") " pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.392530 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/829039a6-ad10-4532-b406-f497e661fd8d-config\") pod \"dnsmasq-dns-77b58f4b85-jll5z\" (UID: \"829039a6-ad10-4532-b406-f497e661fd8d\") " pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.406696 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7jmw\" (UniqueName: \"kubernetes.io/projected/829039a6-ad10-4532-b406-f497e661fd8d-kube-api-access-v7jmw\") pod \"dnsmasq-dns-77b58f4b85-jll5z\" (UID: \"829039a6-ad10-4532-b406-f497e661fd8d\") " pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.578529 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.708326 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.796603 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-config\") pod \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\" (UID: \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\") " Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.796664 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-ovsdbserver-sb\") pod \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\" (UID: \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\") " Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.796789 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-dns-svc\") pod \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\" (UID: \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\") " Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.796911 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-dns-swift-storage-0\") pod \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\" (UID: \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\") " Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.796976 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gvlh\" (UniqueName: \"kubernetes.io/projected/57ae0cc6-55ee-4117-8d32-e82672b2ea46-kube-api-access-8gvlh\") pod \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\" (UID: \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\") " Oct 
12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.797020 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-ovsdbserver-nb\") pod \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\" (UID: \"57ae0cc6-55ee-4117-8d32-e82672b2ea46\") " Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.807436 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57ae0cc6-55ee-4117-8d32-e82672b2ea46-kube-api-access-8gvlh" (OuterVolumeSpecName: "kube-api-access-8gvlh") pod "57ae0cc6-55ee-4117-8d32-e82672b2ea46" (UID: "57ae0cc6-55ee-4117-8d32-e82672b2ea46"). InnerVolumeSpecName "kube-api-access-8gvlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.857646 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "57ae0cc6-55ee-4117-8d32-e82672b2ea46" (UID: "57ae0cc6-55ee-4117-8d32-e82672b2ea46"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.858680 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "57ae0cc6-55ee-4117-8d32-e82672b2ea46" (UID: "57ae0cc6-55ee-4117-8d32-e82672b2ea46"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.859970 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "57ae0cc6-55ee-4117-8d32-e82672b2ea46" (UID: "57ae0cc6-55ee-4117-8d32-e82672b2ea46"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.863266 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-config" (OuterVolumeSpecName: "config") pod "57ae0cc6-55ee-4117-8d32-e82672b2ea46" (UID: "57ae0cc6-55ee-4117-8d32-e82672b2ea46"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.879185 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "57ae0cc6-55ee-4117-8d32-e82672b2ea46" (UID: "57ae0cc6-55ee-4117-8d32-e82672b2ea46"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.899930 4930 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.899958 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-config\") on node \"crc\" DevicePath \"\"" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.899966 4930 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.899975 4930 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.899983 4930 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/57ae0cc6-55ee-4117-8d32-e82672b2ea46-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 12 06:03:18 crc kubenswrapper[4930]: I1012 06:03:18.899992 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gvlh\" (UniqueName: \"kubernetes.io/projected/57ae0cc6-55ee-4117-8d32-e82672b2ea46-kube-api-access-8gvlh\") on node \"crc\" DevicePath \"\"" Oct 12 06:03:19 crc kubenswrapper[4930]: I1012 06:03:19.086846 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77b58f4b85-jll5z"] Oct 12 06:03:19 crc kubenswrapper[4930]: I1012 06:03:19.232273 4930 generic.go:334] "Generic (PLEG): container finished" podID="57ae0cc6-55ee-4117-8d32-e82672b2ea46" containerID="394b48bb4d59d15ea85d2840ac217d8379fd9c15434e3fb55aafe3a72687a4f8" exitCode=0 Oct 12 06:03:19 crc kubenswrapper[4930]: I1012 06:03:19.232629 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" event={"ID":"57ae0cc6-55ee-4117-8d32-e82672b2ea46","Type":"ContainerDied","Data":"394b48bb4d59d15ea85d2840ac217d8379fd9c15434e3fb55aafe3a72687a4f8"} Oct 12 06:03:19 crc kubenswrapper[4930]: I1012 06:03:19.232667 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" event={"ID":"57ae0cc6-55ee-4117-8d32-e82672b2ea46","Type":"ContainerDied","Data":"487a561894a95fc1abee995a6090193bbae294233ea07dd5afece6d49d553e6b"} Oct 12 06:03:19 crc kubenswrapper[4930]: I1012 06:03:19.232686 4930 scope.go:117] "RemoveContainer" containerID="394b48bb4d59d15ea85d2840ac217d8379fd9c15434e3fb55aafe3a72687a4f8" Oct 12 06:03:19 crc kubenswrapper[4930]: I1012 06:03:19.232846 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54599d8f7-xv8dr" Oct 12 06:03:19 crc kubenswrapper[4930]: I1012 06:03:19.247193 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" event={"ID":"829039a6-ad10-4532-b406-f497e661fd8d","Type":"ContainerStarted","Data":"62bec41bc4c01ca7826575465a554932ff52c2f9d44bc3a83aaad535dad5319a"} Oct 12 06:03:19 crc kubenswrapper[4930]: I1012 06:03:19.324595 4930 scope.go:117] "RemoveContainer" containerID="014783dc84fc370fcb7fb3501ebc3ae96744624044bc2b495d7cd6520b66b768" Oct 12 06:03:19 crc kubenswrapper[4930]: I1012 06:03:19.355293 4930 scope.go:117] "RemoveContainer" containerID="394b48bb4d59d15ea85d2840ac217d8379fd9c15434e3fb55aafe3a72687a4f8" Oct 12 06:03:19 crc kubenswrapper[4930]: I1012 06:03:19.357977 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54599d8f7-xv8dr"] Oct 12 06:03:19 crc kubenswrapper[4930]: E1012 06:03:19.363251 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"394b48bb4d59d15ea85d2840ac217d8379fd9c15434e3fb55aafe3a72687a4f8\": container with ID starting with 394b48bb4d59d15ea85d2840ac217d8379fd9c15434e3fb55aafe3a72687a4f8 not found: ID does not exist" containerID="394b48bb4d59d15ea85d2840ac217d8379fd9c15434e3fb55aafe3a72687a4f8" Oct 12 06:03:19 crc kubenswrapper[4930]: I1012 06:03:19.363291 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"394b48bb4d59d15ea85d2840ac217d8379fd9c15434e3fb55aafe3a72687a4f8"} err="failed to get container status \"394b48bb4d59d15ea85d2840ac217d8379fd9c15434e3fb55aafe3a72687a4f8\": rpc error: code = NotFound desc = could not find container \"394b48bb4d59d15ea85d2840ac217d8379fd9c15434e3fb55aafe3a72687a4f8\": container with ID starting with 394b48bb4d59d15ea85d2840ac217d8379fd9c15434e3fb55aafe3a72687a4f8 not found: ID does not exist" Oct 12 06:03:19 crc kubenswrapper[4930]: I1012 06:03:19.363318 4930 scope.go:117] "RemoveContainer" containerID="014783dc84fc370fcb7fb3501ebc3ae96744624044bc2b495d7cd6520b66b768" Oct 12 06:03:19 crc kubenswrapper[4930]: E1012 06:03:19.363627 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"014783dc84fc370fcb7fb3501ebc3ae96744624044bc2b495d7cd6520b66b768\": container with ID starting with 014783dc84fc370fcb7fb3501ebc3ae96744624044bc2b495d7cd6520b66b768 not found: ID does not exist" containerID="014783dc84fc370fcb7fb3501ebc3ae96744624044bc2b495d7cd6520b66b768" Oct 12 06:03:19 crc kubenswrapper[4930]: I1012 06:03:19.363663 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"014783dc84fc370fcb7fb3501ebc3ae96744624044bc2b495d7cd6520b66b768"} err="failed to get container status \"014783dc84fc370fcb7fb3501ebc3ae96744624044bc2b495d7cd6520b66b768\": rpc error: code = NotFound desc = could not find container \"014783dc84fc370fcb7fb3501ebc3ae96744624044bc2b495d7cd6520b66b768\": container with ID starting with 014783dc84fc370fcb7fb3501ebc3ae96744624044bc2b495d7cd6520b66b768 not found: ID does not exist" Oct 12 06:03:19 crc kubenswrapper[4930]: I1012 06:03:19.367865 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54599d8f7-xv8dr"] Oct 12 06:03:20 crc kubenswrapper[4930]: I1012 06:03:20.173673 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57ae0cc6-55ee-4117-8d32-e82672b2ea46" 
path="/var/lib/kubelet/pods/57ae0cc6-55ee-4117-8d32-e82672b2ea46/volumes" Oct 12 06:03:20 crc kubenswrapper[4930]: I1012 06:03:20.260879 4930 generic.go:334] "Generic (PLEG): container finished" podID="829039a6-ad10-4532-b406-f497e661fd8d" containerID="208654bbf3ab99ca34348050c214a50a4bb163ffcbbef3d22b9d35817a54fac1" exitCode=0 Oct 12 06:03:20 crc kubenswrapper[4930]: I1012 06:03:20.260981 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" event={"ID":"829039a6-ad10-4532-b406-f497e661fd8d","Type":"ContainerDied","Data":"208654bbf3ab99ca34348050c214a50a4bb163ffcbbef3d22b9d35817a54fac1"} Oct 12 06:03:21 crc kubenswrapper[4930]: I1012 06:03:21.326669 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" event={"ID":"829039a6-ad10-4532-b406-f497e661fd8d","Type":"ContainerStarted","Data":"ef09e72f670f93572496c40140d142aadc8ea035e6befffde99a99aeff2b7f07"} Oct 12 06:03:21 crc kubenswrapper[4930]: I1012 06:03:21.327018 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" Oct 12 06:03:21 crc kubenswrapper[4930]: I1012 06:03:21.357276 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" podStartSLOduration=3.357257969 podStartE2EDuration="3.357257969s" podCreationTimestamp="2025-10-12 06:03:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:03:21.353761592 +0000 UTC m=+1333.895863357" watchObservedRunningTime="2025-10-12 06:03:21.357257969 +0000 UTC m=+1333.899359754" Oct 12 06:03:23 crc kubenswrapper[4930]: I1012 06:03:23.710458 4930 scope.go:117] "RemoveContainer" containerID="600c87b339b6336ad0bdb7f69f659fd2b7a9dc3ae9650c061969bc4d027d0a09" Oct 12 06:03:28 crc kubenswrapper[4930]: I1012 06:03:28.580078 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77b58f4b85-jll5z" Oct 12 06:03:28 crc kubenswrapper[4930]: I1012 06:03:28.693703 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bf6c7df67-z88k5"] Oct 12 06:03:28 crc kubenswrapper[4930]: I1012 06:03:28.694581 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" podUID="ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5" containerName="dnsmasq-dns" containerID="cri-o://735dba7e3515e750cb1db29e94ef0a5d5608f30dc1d5980a9634b9caee53d3a5" gracePeriod=10 Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.234095 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.338920 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-openstack-edpm-ipam\") pod \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.338972 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-config\") pod \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.339035 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-dns-swift-storage-0\") pod \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.339155 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-ovsdbserver-sb\") pod \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.339206 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-ovsdbserver-nb\") pod \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.339248 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-dns-svc\") pod \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.339302 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwwq7\" (UniqueName: \"kubernetes.io/projected/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-kube-api-access-fwwq7\") pod \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\" (UID: \"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5\") " Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.349375 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-kube-api-access-fwwq7" (OuterVolumeSpecName: "kube-api-access-fwwq7") pod "ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5" (UID: "ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5"). InnerVolumeSpecName "kube-api-access-fwwq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.431687 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-config" (OuterVolumeSpecName: "config") pod "ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5" (UID: "ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.445020 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-config\") on node \"crc\" DevicePath \"\"" Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.445233 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwwq7\" (UniqueName: \"kubernetes.io/projected/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-kube-api-access-fwwq7\") on node \"crc\" DevicePath \"\"" Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.454557 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5" (UID: "ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.462666 4930 generic.go:334] "Generic (PLEG): container finished" podID="ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5" containerID="735dba7e3515e750cb1db29e94ef0a5d5608f30dc1d5980a9634b9caee53d3a5" exitCode=0 Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.462705 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" event={"ID":"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5","Type":"ContainerDied","Data":"735dba7e3515e750cb1db29e94ef0a5d5608f30dc1d5980a9634b9caee53d3a5"} Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.462731 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" event={"ID":"ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5","Type":"ContainerDied","Data":"4d4ced1a22c656c28e81a1539601a08077f2b5aa94f25e7f54c74824ae33d13c"} Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.462760 4930 scope.go:117] "RemoveContainer" containerID="735dba7e3515e750cb1db29e94ef0a5d5608f30dc1d5980a9634b9caee53d3a5" Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.462871 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bf6c7df67-z88k5" Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.478290 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5" (UID: "ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.511950 4930 scope.go:117] "RemoveContainer" containerID="860df8425c0f4c466b85e8d18522dded51505cedc24a28ad9bb1b094a55ee7db" Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.512331 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5" (UID: "ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.512362 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5" (UID: "ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.516120 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5" (UID: "ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.547194 4930 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.547238 4930 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.547250 4930 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.547277 4930 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.547314 4930 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.584234 4930 scope.go:117] "RemoveContainer" containerID="735dba7e3515e750cb1db29e94ef0a5d5608f30dc1d5980a9634b9caee53d3a5" Oct 12 06:03:29 crc kubenswrapper[4930]: E1012 06:03:29.585995 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"735dba7e3515e750cb1db29e94ef0a5d5608f30dc1d5980a9634b9caee53d3a5\": container with ID starting with 735dba7e3515e750cb1db29e94ef0a5d5608f30dc1d5980a9634b9caee53d3a5 not found: ID does not exist" containerID="735dba7e3515e750cb1db29e94ef0a5d5608f30dc1d5980a9634b9caee53d3a5" Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.586038 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"735dba7e3515e750cb1db29e94ef0a5d5608f30dc1d5980a9634b9caee53d3a5"} err="failed to get container status \"735dba7e3515e750cb1db29e94ef0a5d5608f30dc1d5980a9634b9caee53d3a5\": rpc error: code = NotFound desc = could not find container \"735dba7e3515e750cb1db29e94ef0a5d5608f30dc1d5980a9634b9caee53d3a5\": container with ID starting with 735dba7e3515e750cb1db29e94ef0a5d5608f30dc1d5980a9634b9caee53d3a5 not found: ID does not exist" Oct 12 06:03:29 
crc kubenswrapper[4930]: I1012 06:03:29.586064 4930 scope.go:117] "RemoveContainer" containerID="860df8425c0f4c466b85e8d18522dded51505cedc24a28ad9bb1b094a55ee7db" Oct 12 06:03:29 crc kubenswrapper[4930]: E1012 06:03:29.589060 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"860df8425c0f4c466b85e8d18522dded51505cedc24a28ad9bb1b094a55ee7db\": container with ID starting with 860df8425c0f4c466b85e8d18522dded51505cedc24a28ad9bb1b094a55ee7db not found: ID does not exist" containerID="860df8425c0f4c466b85e8d18522dded51505cedc24a28ad9bb1b094a55ee7db" Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.589119 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"860df8425c0f4c466b85e8d18522dded51505cedc24a28ad9bb1b094a55ee7db"} err="failed to get container status \"860df8425c0f4c466b85e8d18522dded51505cedc24a28ad9bb1b094a55ee7db\": rpc error: code = NotFound desc = could not find container \"860df8425c0f4c466b85e8d18522dded51505cedc24a28ad9bb1b094a55ee7db\": container with ID starting with 860df8425c0f4c466b85e8d18522dded51505cedc24a28ad9bb1b094a55ee7db not found: ID does not exist" Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.805568 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bf6c7df67-z88k5"] Oct 12 06:03:29 crc kubenswrapper[4930]: I1012 06:03:29.817631 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bf6c7df67-z88k5"] Oct 12 06:03:30 crc kubenswrapper[4930]: I1012 06:03:30.154233 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5" path="/var/lib/kubelet/pods/ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5/volumes" Oct 12 06:03:33 crc kubenswrapper[4930]: I1012 06:03:33.669396 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:03:33 crc kubenswrapper[4930]: I1012 06:03:33.669858 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:03:35 crc kubenswrapper[4930]: I1012 06:03:35.535722 4930 generic.go:334] "Generic (PLEG): container finished" podID="cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d" containerID="c7f3d8997ea83328ad1814110e6a60ae0772c7ea68dab6290a554e975c15ebd5" exitCode=0 Oct 12 06:03:35 crc kubenswrapper[4930]: I1012 06:03:35.535854 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d","Type":"ContainerDied","Data":"c7f3d8997ea83328ad1814110e6a60ae0772c7ea68dab6290a554e975c15ebd5"} Oct 12 06:03:36 crc kubenswrapper[4930]: I1012 06:03:36.547806 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d","Type":"ContainerStarted","Data":"b81b342221caa7ad9ee880ce9a3903304974fd466f21546aa3570e01c410ae54"} Oct 12 06:03:36 crc kubenswrapper[4930]: I1012 06:03:36.548883 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/rabbitmq-server-0" Oct 12 06:03:36 crc kubenswrapper[4930]: I1012 06:03:36.579358 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.579341367 podStartE2EDuration="38.579341367s" podCreationTimestamp="2025-10-12 06:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:03:36.574968459 +0000 UTC m=+1349.117070224" watchObservedRunningTime="2025-10-12 06:03:36.579341367 +0000 UTC m=+1349.121443132" Oct 12 06:03:37 crc kubenswrapper[4930]: I1012 06:03:37.560475 4930 generic.go:334] "Generic (PLEG): container finished" podID="62c5d71d-6283-44b4-9b50-96fd50d7ad99" containerID="62134c443653cfd84216a5751eb0c1a27d8a75699a649b11f5e6329e422f863c" exitCode=0 Oct 12 06:03:37 crc kubenswrapper[4930]: I1012 06:03:37.560594 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"62c5d71d-6283-44b4-9b50-96fd50d7ad99","Type":"ContainerDied","Data":"62134c443653cfd84216a5751eb0c1a27d8a75699a649b11f5e6329e422f863c"} Oct 12 06:03:38 crc kubenswrapper[4930]: I1012 06:03:38.574171 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"62c5d71d-6283-44b4-9b50-96fd50d7ad99","Type":"ContainerStarted","Data":"08d1ec286b10b1a8d38ab5e010d8a477063be1c56ba54ed120cb21fba40ab62c"} Oct 12 06:03:38 crc kubenswrapper[4930]: I1012 06:03:38.574650 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:03:38 crc kubenswrapper[4930]: I1012 06:03:38.609258 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.609240901 podStartE2EDuration="39.609240901s" podCreationTimestamp="2025-10-12 06:02:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:03:38.600925464 +0000 UTC m=+1351.143027229" watchObservedRunningTime="2025-10-12 06:03:38.609240901 +0000 UTC m=+1351.151342666" Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.302177 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg"] Oct 12 06:03:47 crc kubenswrapper[4930]: E1012 06:03:47.303274 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5" containerName="dnsmasq-dns" Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.303293 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5" containerName="dnsmasq-dns" Oct 12 06:03:47 crc kubenswrapper[4930]: E1012 06:03:47.303331 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ae0cc6-55ee-4117-8d32-e82672b2ea46" containerName="init" Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.303339 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ae0cc6-55ee-4117-8d32-e82672b2ea46" containerName="init" Oct 12 06:03:47 crc kubenswrapper[4930]: E1012 06:03:47.303367 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ae0cc6-55ee-4117-8d32-e82672b2ea46" containerName="dnsmasq-dns" Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.303376 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ae0cc6-55ee-4117-8d32-e82672b2ea46" containerName="dnsmasq-dns" Oct 12 
06:03:47 crc kubenswrapper[4930]: E1012 06:03:47.303392 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5" containerName="init" Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.303400 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5" containerName="init" Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.303678 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ae0cc6-55ee-4117-8d32-e82672b2ea46" containerName="dnsmasq-dns" Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.303701 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce0650a2-2a68-4e75-9298-ac9e2fbdf4a5" containerName="dnsmasq-dns" Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.304553 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg" Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.306661 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.306816 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vhlxg" Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.307140 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.307179 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.310952 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg"] Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.455381 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b8c39cb-bec0-49b4-a4bb-5949e695db04-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg\" (UID: \"6b8c39cb-bec0-49b4-a4bb-5949e695db04\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg" Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.455433 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvwtl\" (UniqueName: \"kubernetes.io/projected/6b8c39cb-bec0-49b4-a4bb-5949e695db04-kube-api-access-cvwtl\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg\" (UID: \"6b8c39cb-bec0-49b4-a4bb-5949e695db04\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg" Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.455546 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b8c39cb-bec0-49b4-a4bb-5949e695db04-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg\" (UID: \"6b8c39cb-bec0-49b4-a4bb-5949e695db04\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg" Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.455577 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b8c39cb-bec0-49b4-a4bb-5949e695db04-ssh-key\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg\" (UID: \"6b8c39cb-bec0-49b4-a4bb-5949e695db04\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg" Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.557781 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b8c39cb-bec0-49b4-a4bb-5949e695db04-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg\" (UID: \"6b8c39cb-bec0-49b4-a4bb-5949e695db04\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg" Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.557856 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvwtl\" (UniqueName: \"kubernetes.io/projected/6b8c39cb-bec0-49b4-a4bb-5949e695db04-kube-api-access-cvwtl\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg\" (UID: \"6b8c39cb-bec0-49b4-a4bb-5949e695db04\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg" Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.558025 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b8c39cb-bec0-49b4-a4bb-5949e695db04-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg\" (UID: \"6b8c39cb-bec0-49b4-a4bb-5949e695db04\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg" Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.558065 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b8c39cb-bec0-49b4-a4bb-5949e695db04-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg\" (UID: \"6b8c39cb-bec0-49b4-a4bb-5949e695db04\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg" Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.563823 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b8c39cb-bec0-49b4-a4bb-5949e695db04-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg\" (UID: \"6b8c39cb-bec0-49b4-a4bb-5949e695db04\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg" Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.563916 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b8c39cb-bec0-49b4-a4bb-5949e695db04-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg\" (UID: \"6b8c39cb-bec0-49b4-a4bb-5949e695db04\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg" Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.573334 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b8c39cb-bec0-49b4-a4bb-5949e695db04-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg\" (UID: \"6b8c39cb-bec0-49b4-a4bb-5949e695db04\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg" Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.583854 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvwtl\" (UniqueName: \"kubernetes.io/projected/6b8c39cb-bec0-49b4-a4bb-5949e695db04-kube-api-access-cvwtl\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg\" (UID: 
\"6b8c39cb-bec0-49b4-a4bb-5949e695db04\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg" Oct 12 06:03:47 crc kubenswrapper[4930]: I1012 06:03:47.631196 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg" Oct 12 06:03:48 crc kubenswrapper[4930]: I1012 06:03:48.268029 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg"] Oct 12 06:03:48 crc kubenswrapper[4930]: I1012 06:03:48.670617 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg" event={"ID":"6b8c39cb-bec0-49b4-a4bb-5949e695db04","Type":"ContainerStarted","Data":"efc0aad1afb7c34e6d238ebbf6051969f1fe0fa7641ece4af833ae2a0cd35d35"} Oct 12 06:03:49 crc kubenswrapper[4930]: I1012 06:03:49.331346 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.230:5671: connect: connection refused" Oct 12 06:03:50 crc kubenswrapper[4930]: I1012 06:03:50.329972 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="62c5d71d-6283-44b4-9b50-96fd50d7ad99" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.231:5671: connect: connection refused" Oct 12 06:03:57 crc kubenswrapper[4930]: I1012 06:03:57.772031 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg" event={"ID":"6b8c39cb-bec0-49b4-a4bb-5949e695db04","Type":"ContainerStarted","Data":"eff18a4b0a4224f7ad3941cb26d78c2823af6cba8ada348c30e09e3c13584aa5"} Oct 12 06:03:57 crc kubenswrapper[4930]: I1012 06:03:57.808661 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg" podStartSLOduration=1.738915804 podStartE2EDuration="10.808630129s" podCreationTimestamp="2025-10-12 06:03:47 +0000 UTC" firstStartedPulling="2025-10-12 06:03:48.264972602 +0000 UTC m=+1360.807074377" lastFinishedPulling="2025-10-12 06:03:57.334686897 +0000 UTC m=+1369.876788702" observedRunningTime="2025-10-12 06:03:57.794247492 +0000 UTC m=+1370.336349267" watchObservedRunningTime="2025-10-12 06:03:57.808630129 +0000 UTC m=+1370.350731934" Oct 12 06:03:59 crc kubenswrapper[4930]: I1012 06:03:59.331129 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 12 06:04:00 crc kubenswrapper[4930]: I1012 06:04:00.330063 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 12 06:04:03 crc kubenswrapper[4930]: I1012 06:04:03.669930 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:04:03 crc kubenswrapper[4930]: I1012 06:04:03.670393 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 12 06:04:08 crc kubenswrapper[4930]: I1012 06:04:08.930448 4930 generic.go:334] "Generic (PLEG): container finished" podID="6b8c39cb-bec0-49b4-a4bb-5949e695db04" containerID="eff18a4b0a4224f7ad3941cb26d78c2823af6cba8ada348c30e09e3c13584aa5" exitCode=0 Oct 12 06:04:08 crc kubenswrapper[4930]: I1012 06:04:08.930667 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg" event={"ID":"6b8c39cb-bec0-49b4-a4bb-5949e695db04","Type":"ContainerDied","Data":"eff18a4b0a4224f7ad3941cb26d78c2823af6cba8ada348c30e09e3c13584aa5"} Oct 12 06:04:10 crc kubenswrapper[4930]: I1012 06:04:10.494659 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg" Oct 12 06:04:10 crc kubenswrapper[4930]: I1012 06:04:10.513932 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvwtl\" (UniqueName: \"kubernetes.io/projected/6b8c39cb-bec0-49b4-a4bb-5949e695db04-kube-api-access-cvwtl\") pod \"6b8c39cb-bec0-49b4-a4bb-5949e695db04\" (UID: \"6b8c39cb-bec0-49b4-a4bb-5949e695db04\") " Oct 12 06:04:10 crc kubenswrapper[4930]: I1012 06:04:10.516071 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b8c39cb-bec0-49b4-a4bb-5949e695db04-repo-setup-combined-ca-bundle\") pod \"6b8c39cb-bec0-49b4-a4bb-5949e695db04\" (UID: \"6b8c39cb-bec0-49b4-a4bb-5949e695db04\") " Oct 12 06:04:10 crc kubenswrapper[4930]: I1012 06:04:10.516169 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b8c39cb-bec0-49b4-a4bb-5949e695db04-ssh-key\") pod \"6b8c39cb-bec0-49b4-a4bb-5949e695db04\" (UID: \"6b8c39cb-bec0-49b4-a4bb-5949e695db04\") " Oct 12 06:04:10 crc kubenswrapper[4930]: I1012 06:04:10.516236 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b8c39cb-bec0-49b4-a4bb-5949e695db04-inventory\") pod \"6b8c39cb-bec0-49b4-a4bb-5949e695db04\" (UID: \"6b8c39cb-bec0-49b4-a4bb-5949e695db04\") " Oct 12 06:04:10 crc kubenswrapper[4930]: I1012 06:04:10.521177 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b8c39cb-bec0-49b4-a4bb-5949e695db04-kube-api-access-cvwtl" (OuterVolumeSpecName: "kube-api-access-cvwtl") pod "6b8c39cb-bec0-49b4-a4bb-5949e695db04" (UID: "6b8c39cb-bec0-49b4-a4bb-5949e695db04"). InnerVolumeSpecName "kube-api-access-cvwtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:04:10 crc kubenswrapper[4930]: I1012 06:04:10.521424 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b8c39cb-bec0-49b4-a4bb-5949e695db04-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "6b8c39cb-bec0-49b4-a4bb-5949e695db04" (UID: "6b8c39cb-bec0-49b4-a4bb-5949e695db04"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:04:10 crc kubenswrapper[4930]: I1012 06:04:10.548231 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b8c39cb-bec0-49b4-a4bb-5949e695db04-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6b8c39cb-bec0-49b4-a4bb-5949e695db04" (UID: "6b8c39cb-bec0-49b4-a4bb-5949e695db04"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:04:10 crc kubenswrapper[4930]: I1012 06:04:10.549471 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b8c39cb-bec0-49b4-a4bb-5949e695db04-inventory" (OuterVolumeSpecName: "inventory") pod "6b8c39cb-bec0-49b4-a4bb-5949e695db04" (UID: "6b8c39cb-bec0-49b4-a4bb-5949e695db04"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:04:10 crc kubenswrapper[4930]: I1012 06:04:10.619190 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvwtl\" (UniqueName: \"kubernetes.io/projected/6b8c39cb-bec0-49b4-a4bb-5949e695db04-kube-api-access-cvwtl\") on node \"crc\" DevicePath \"\"" Oct 12 06:04:10 crc kubenswrapper[4930]: I1012 06:04:10.619222 4930 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b8c39cb-bec0-49b4-a4bb-5949e695db04-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:04:10 crc kubenswrapper[4930]: I1012 06:04:10.619235 4930 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b8c39cb-bec0-49b4-a4bb-5949e695db04-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 06:04:10 crc kubenswrapper[4930]: I1012 06:04:10.619248 4930 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b8c39cb-bec0-49b4-a4bb-5949e695db04-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 06:04:10 crc kubenswrapper[4930]: I1012 06:04:10.964362 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg" event={"ID":"6b8c39cb-bec0-49b4-a4bb-5949e695db04","Type":"ContainerDied","Data":"efc0aad1afb7c34e6d238ebbf6051969f1fe0fa7641ece4af833ae2a0cd35d35"} Oct 12 06:04:10 crc kubenswrapper[4930]: I1012 06:04:10.964419 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efc0aad1afb7c34e6d238ebbf6051969f1fe0fa7641ece4af833ae2a0cd35d35" Oct 12 06:04:10 crc kubenswrapper[4930]: I1012 06:04:10.964464 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg" Oct 12 06:04:11 crc kubenswrapper[4930]: I1012 06:04:11.100009 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-rj48r"] Oct 12 06:04:11 crc kubenswrapper[4930]: E1012 06:04:11.100877 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b8c39cb-bec0-49b4-a4bb-5949e695db04" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 12 06:04:11 crc kubenswrapper[4930]: I1012 06:04:11.100907 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b8c39cb-bec0-49b4-a4bb-5949e695db04" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 12 06:04:11 crc kubenswrapper[4930]: I1012 06:04:11.101156 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b8c39cb-bec0-49b4-a4bb-5949e695db04" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 12 06:04:11 crc kubenswrapper[4930]: I1012 06:04:11.102072 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rj48r" Oct 12 06:04:11 crc kubenswrapper[4930]: I1012 06:04:11.105922 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 06:04:11 crc kubenswrapper[4930]: I1012 06:04:11.106444 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 06:04:11 crc kubenswrapper[4930]: I1012 06:04:11.106770 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vhlxg" Oct 12 06:04:11 crc kubenswrapper[4930]: I1012 06:04:11.108143 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 06:04:11 crc kubenswrapper[4930]: I1012 06:04:11.129316 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f819c97-4853-42ee-ac71-a252fefe38c5-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rj48r\" (UID: \"4f819c97-4853-42ee-ac71-a252fefe38c5\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rj48r" Oct 12 06:04:11 crc kubenswrapper[4930]: I1012 06:04:11.129458 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f819c97-4853-42ee-ac71-a252fefe38c5-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rj48r\" (UID: \"4f819c97-4853-42ee-ac71-a252fefe38c5\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rj48r" Oct 12 06:04:11 crc kubenswrapper[4930]: I1012 06:04:11.129505 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84pk5\" (UniqueName: \"kubernetes.io/projected/4f819c97-4853-42ee-ac71-a252fefe38c5-kube-api-access-84pk5\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rj48r\" (UID: \"4f819c97-4853-42ee-ac71-a252fefe38c5\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rj48r" Oct 12 06:04:11 crc kubenswrapper[4930]: I1012 06:04:11.138862 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-rj48r"] Oct 12 06:04:11 crc kubenswrapper[4930]: I1012 06:04:11.233444 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f819c97-4853-42ee-ac71-a252fefe38c5-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rj48r\" (UID: \"4f819c97-4853-42ee-ac71-a252fefe38c5\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rj48r" Oct 12 06:04:11 crc kubenswrapper[4930]: I1012 06:04:11.233970 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f819c97-4853-42ee-ac71-a252fefe38c5-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rj48r\" (UID: \"4f819c97-4853-42ee-ac71-a252fefe38c5\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rj48r" Oct 12 06:04:11 crc kubenswrapper[4930]: I1012 06:04:11.234143 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84pk5\" (UniqueName: \"kubernetes.io/projected/4f819c97-4853-42ee-ac71-a252fefe38c5-kube-api-access-84pk5\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rj48r\" (UID: \"4f819c97-4853-42ee-ac71-a252fefe38c5\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rj48r" Oct 12 06:04:11 crc kubenswrapper[4930]: I1012 06:04:11.238264 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f819c97-4853-42ee-ac71-a252fefe38c5-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rj48r\" (UID: \"4f819c97-4853-42ee-ac71-a252fefe38c5\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rj48r" Oct 12 06:04:11 crc kubenswrapper[4930]: I1012 06:04:11.240480 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f819c97-4853-42ee-ac71-a252fefe38c5-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rj48r\" (UID: \"4f819c97-4853-42ee-ac71-a252fefe38c5\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rj48r" Oct 12 06:04:11 crc kubenswrapper[4930]: I1012 06:04:11.261414 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84pk5\" (UniqueName: \"kubernetes.io/projected/4f819c97-4853-42ee-ac71-a252fefe38c5-kube-api-access-84pk5\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rj48r\" (UID: \"4f819c97-4853-42ee-ac71-a252fefe38c5\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rj48r" Oct 12 06:04:11 crc kubenswrapper[4930]: I1012 06:04:11.460615 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rj48r" Oct 12 06:04:12 crc kubenswrapper[4930]: I1012 06:04:12.029560 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-rj48r"] Oct 12 06:04:12 crc kubenswrapper[4930]: W1012 06:04:12.039109 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f819c97_4853_42ee_ac71_a252fefe38c5.slice/crio-d5166805212d6985e76407bb2f5deac87255d588c2edd9d5dd14a0338071e278 WatchSource:0}: Error finding container d5166805212d6985e76407bb2f5deac87255d588c2edd9d5dd14a0338071e278: Status 404 returned error can't find the container with id d5166805212d6985e76407bb2f5deac87255d588c2edd9d5dd14a0338071e278 Oct 12 06:04:12 crc kubenswrapper[4930]: I1012 06:04:12.995148 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rj48r" event={"ID":"4f819c97-4853-42ee-ac71-a252fefe38c5","Type":"ContainerStarted","Data":"654018e99d82286603672f5c483d43d3104768938bc4b7deb430c1060c4725a3"} Oct 12 06:04:12 crc kubenswrapper[4930]: I1012 06:04:12.995695 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rj48r" event={"ID":"4f819c97-4853-42ee-ac71-a252fefe38c5","Type":"ContainerStarted","Data":"d5166805212d6985e76407bb2f5deac87255d588c2edd9d5dd14a0338071e278"} Oct 12 06:04:13 crc kubenswrapper[4930]: I1012 06:04:13.035225 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rj48r" podStartSLOduration=1.558901512 podStartE2EDuration="2.035198401s" podCreationTimestamp="2025-10-12 06:04:11 +0000 UTC" firstStartedPulling="2025-10-12 06:04:12.046527307 +0000 UTC m=+1384.588629082" lastFinishedPulling="2025-10-12 06:04:12.522824196 +0000 UTC m=+1385.064925971" observedRunningTime="2025-10-12 06:04:13.016636151 +0000 UTC m=+1385.558737926" watchObservedRunningTime="2025-10-12 06:04:13.035198401 +0000 UTC 
m=+1385.577300206" Oct 12 06:04:16 crc kubenswrapper[4930]: I1012 06:04:16.043391 4930 generic.go:334] "Generic (PLEG): container finished" podID="4f819c97-4853-42ee-ac71-a252fefe38c5" containerID="654018e99d82286603672f5c483d43d3104768938bc4b7deb430c1060c4725a3" exitCode=0 Oct 12 06:04:16 crc kubenswrapper[4930]: I1012 06:04:16.043434 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rj48r" event={"ID":"4f819c97-4853-42ee-ac71-a252fefe38c5","Type":"ContainerDied","Data":"654018e99d82286603672f5c483d43d3104768938bc4b7deb430c1060c4725a3"} Oct 12 06:04:17 crc kubenswrapper[4930]: I1012 06:04:17.621466 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rj48r" Oct 12 06:04:17 crc kubenswrapper[4930]: I1012 06:04:17.802858 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f819c97-4853-42ee-ac71-a252fefe38c5-ssh-key\") pod \"4f819c97-4853-42ee-ac71-a252fefe38c5\" (UID: \"4f819c97-4853-42ee-ac71-a252fefe38c5\") " Oct 12 06:04:17 crc kubenswrapper[4930]: I1012 06:04:17.803145 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84pk5\" (UniqueName: \"kubernetes.io/projected/4f819c97-4853-42ee-ac71-a252fefe38c5-kube-api-access-84pk5\") pod \"4f819c97-4853-42ee-ac71-a252fefe38c5\" (UID: \"4f819c97-4853-42ee-ac71-a252fefe38c5\") " Oct 12 06:04:17 crc kubenswrapper[4930]: I1012 06:04:17.803257 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f819c97-4853-42ee-ac71-a252fefe38c5-inventory\") pod \"4f819c97-4853-42ee-ac71-a252fefe38c5\" (UID: \"4f819c97-4853-42ee-ac71-a252fefe38c5\") " Oct 12 06:04:17 crc kubenswrapper[4930]: I1012 06:04:17.812435 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f819c97-4853-42ee-ac71-a252fefe38c5-kube-api-access-84pk5" (OuterVolumeSpecName: "kube-api-access-84pk5") pod "4f819c97-4853-42ee-ac71-a252fefe38c5" (UID: "4f819c97-4853-42ee-ac71-a252fefe38c5"). InnerVolumeSpecName "kube-api-access-84pk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:04:17 crc kubenswrapper[4930]: I1012 06:04:17.838472 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f819c97-4853-42ee-ac71-a252fefe38c5-inventory" (OuterVolumeSpecName: "inventory") pod "4f819c97-4853-42ee-ac71-a252fefe38c5" (UID: "4f819c97-4853-42ee-ac71-a252fefe38c5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:04:17 crc kubenswrapper[4930]: I1012 06:04:17.857330 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f819c97-4853-42ee-ac71-a252fefe38c5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4f819c97-4853-42ee-ac71-a252fefe38c5" (UID: "4f819c97-4853-42ee-ac71-a252fefe38c5"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:04:17 crc kubenswrapper[4930]: I1012 06:04:17.908850 4930 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f819c97-4853-42ee-ac71-a252fefe38c5-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 06:04:17 crc kubenswrapper[4930]: I1012 06:04:17.908899 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84pk5\" (UniqueName: \"kubernetes.io/projected/4f819c97-4853-42ee-ac71-a252fefe38c5-kube-api-access-84pk5\") on node \"crc\" DevicePath \"\"" Oct 12 06:04:17 crc kubenswrapper[4930]: I1012 06:04:17.908920 4930 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f819c97-4853-42ee-ac71-a252fefe38c5-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 06:04:18 crc kubenswrapper[4930]: I1012 06:04:18.084894 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rj48r" event={"ID":"4f819c97-4853-42ee-ac71-a252fefe38c5","Type":"ContainerDied","Data":"d5166805212d6985e76407bb2f5deac87255d588c2edd9d5dd14a0338071e278"} Oct 12 06:04:18 crc kubenswrapper[4930]: I1012 06:04:18.084967 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5166805212d6985e76407bb2f5deac87255d588c2edd9d5dd14a0338071e278" Oct 12 06:04:18 crc kubenswrapper[4930]: I1012 06:04:18.085123 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rj48r" Oct 12 06:04:18 crc kubenswrapper[4930]: I1012 06:04:18.191822 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn"] Oct 12 06:04:18 crc kubenswrapper[4930]: E1012 06:04:18.192555 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f819c97-4853-42ee-ac71-a252fefe38c5" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 12 06:04:18 crc kubenswrapper[4930]: I1012 06:04:18.192589 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f819c97-4853-42ee-ac71-a252fefe38c5" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 12 06:04:18 crc kubenswrapper[4930]: I1012 06:04:18.193053 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f819c97-4853-42ee-ac71-a252fefe38c5" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 12 06:04:18 crc kubenswrapper[4930]: I1012 06:04:18.194326 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn" Oct 12 06:04:18 crc kubenswrapper[4930]: I1012 06:04:18.197561 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vhlxg" Oct 12 06:04:18 crc kubenswrapper[4930]: I1012 06:04:18.197704 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 06:04:18 crc kubenswrapper[4930]: I1012 06:04:18.198492 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 06:04:18 crc kubenswrapper[4930]: I1012 06:04:18.199587 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 06:04:18 crc kubenswrapper[4930]: I1012 06:04:18.210238 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn"] Oct 12 06:04:18 crc kubenswrapper[4930]: I1012 06:04:18.228302 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn\" (UID: \"fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn" Oct 12 06:04:18 crc kubenswrapper[4930]: I1012 06:04:18.228396 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn\" (UID: \"fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn" Oct 12 06:04:18 crc kubenswrapper[4930]: I1012 06:04:18.228438 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkr9p\" (UniqueName: \"kubernetes.io/projected/fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc-kube-api-access-bkr9p\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn\" (UID: \"fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn" Oct 12 06:04:18 crc kubenswrapper[4930]: I1012 06:04:18.228502 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn\" (UID: \"fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn" Oct 12 06:04:18 crc kubenswrapper[4930]: I1012 06:04:18.330645 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn\" (UID: \"fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn" Oct 12 06:04:18 crc kubenswrapper[4930]: I1012 06:04:18.330848 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn\" (UID: 
\"fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn" Oct 12 06:04:18 crc kubenswrapper[4930]: I1012 06:04:18.330937 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkr9p\" (UniqueName: \"kubernetes.io/projected/fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc-kube-api-access-bkr9p\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn\" (UID: \"fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn" Oct 12 06:04:18 crc kubenswrapper[4930]: I1012 06:04:18.331086 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn\" (UID: \"fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn" Oct 12 06:04:18 crc kubenswrapper[4930]: I1012 06:04:18.337408 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn\" (UID: \"fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn" Oct 12 06:04:18 crc kubenswrapper[4930]: I1012 06:04:18.337657 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn\" (UID: \"fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn" Oct 12 06:04:18 crc kubenswrapper[4930]: I1012 06:04:18.351029 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn\" (UID: \"fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn" Oct 12 06:04:18 crc kubenswrapper[4930]: I1012 06:04:18.362631 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkr9p\" (UniqueName: \"kubernetes.io/projected/fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc-kube-api-access-bkr9p\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn\" (UID: \"fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn" Oct 12 06:04:18 crc kubenswrapper[4930]: I1012 06:04:18.533007 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn" Oct 12 06:04:19 crc kubenswrapper[4930]: I1012 06:04:19.015100 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn"] Oct 12 06:04:19 crc kubenswrapper[4930]: I1012 06:04:19.093964 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn" event={"ID":"fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc","Type":"ContainerStarted","Data":"61f8db69d611427d25c4024902b3fb899dca3445169d52c032b7928ee4f22b85"} Oct 12 06:04:20 crc kubenswrapper[4930]: I1012 06:04:20.106846 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn" event={"ID":"fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc","Type":"ContainerStarted","Data":"f0514e363e9732d3a7c12f102c16c8168b5d9e12a5b86270492a119ae1778f85"} Oct 12 06:04:20 crc kubenswrapper[4930]: I1012 06:04:20.131692 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn" podStartSLOduration=1.70761527 podStartE2EDuration="2.131667453s" podCreationTimestamp="2025-10-12 06:04:18 +0000 UTC" firstStartedPulling="2025-10-12 06:04:19.024132349 +0000 UTC m=+1391.566234114" lastFinishedPulling="2025-10-12 06:04:19.448184502 +0000 UTC m=+1391.990286297" observedRunningTime="2025-10-12 06:04:20.119566323 +0000 UTC m=+1392.661668088" watchObservedRunningTime="2025-10-12 06:04:20.131667453 +0000 UTC m=+1392.673769258" Oct 12 06:04:23 crc kubenswrapper[4930]: I1012 06:04:23.888091 4930 scope.go:117] "RemoveContainer" containerID="20d540d2eb93bee693da7417e88176ccc63ce67b45c9423965b7eccef1b42fd9" Oct 12 06:04:23 crc kubenswrapper[4930]: I1012 06:04:23.938113 4930 scope.go:117] "RemoveContainer" containerID="62f020028b6b17a42630094b79d284a4e9e7103dc1dc7e65f9313baef3502d4e" Oct 12 06:04:32 crc kubenswrapper[4930]: I1012 06:04:32.427838 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s24nr"] Oct 12 06:04:32 crc kubenswrapper[4930]: I1012 06:04:32.430601 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s24nr" Oct 12 06:04:32 crc kubenswrapper[4930]: I1012 06:04:32.436029 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s24nr"] Oct 12 06:04:32 crc kubenswrapper[4930]: I1012 06:04:32.554544 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/954904d1-95d2-4ffc-a37b-bb607ca2fdb5-catalog-content\") pod \"redhat-operators-s24nr\" (UID: \"954904d1-95d2-4ffc-a37b-bb607ca2fdb5\") " pod="openshift-marketplace/redhat-operators-s24nr" Oct 12 06:04:32 crc kubenswrapper[4930]: I1012 06:04:32.554632 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rsw7\" (UniqueName: \"kubernetes.io/projected/954904d1-95d2-4ffc-a37b-bb607ca2fdb5-kube-api-access-7rsw7\") pod \"redhat-operators-s24nr\" (UID: \"954904d1-95d2-4ffc-a37b-bb607ca2fdb5\") " pod="openshift-marketplace/redhat-operators-s24nr" Oct 12 06:04:32 crc kubenswrapper[4930]: I1012 06:04:32.554760 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/954904d1-95d2-4ffc-a37b-bb607ca2fdb5-utilities\") pod \"redhat-operators-s24nr\" (UID: \"954904d1-95d2-4ffc-a37b-bb607ca2fdb5\") " pod="openshift-marketplace/redhat-operators-s24nr" Oct 12 06:04:32 crc kubenswrapper[4930]: I1012 06:04:32.656538 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/954904d1-95d2-4ffc-a37b-bb607ca2fdb5-catalog-content\") pod \"redhat-operators-s24nr\" (UID: \"954904d1-95d2-4ffc-a37b-bb607ca2fdb5\") " pod="openshift-marketplace/redhat-operators-s24nr" Oct 12 06:04:32 crc kubenswrapper[4930]: I1012 06:04:32.656651 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rsw7\" (UniqueName: \"kubernetes.io/projected/954904d1-95d2-4ffc-a37b-bb607ca2fdb5-kube-api-access-7rsw7\") pod \"redhat-operators-s24nr\" (UID: \"954904d1-95d2-4ffc-a37b-bb607ca2fdb5\") " pod="openshift-marketplace/redhat-operators-s24nr" Oct 12 06:04:32 crc kubenswrapper[4930]: I1012 06:04:32.656779 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/954904d1-95d2-4ffc-a37b-bb607ca2fdb5-utilities\") pod \"redhat-operators-s24nr\" (UID: \"954904d1-95d2-4ffc-a37b-bb607ca2fdb5\") " pod="openshift-marketplace/redhat-operators-s24nr" Oct 12 06:04:32 crc kubenswrapper[4930]: I1012 06:04:32.657066 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/954904d1-95d2-4ffc-a37b-bb607ca2fdb5-catalog-content\") pod \"redhat-operators-s24nr\" (UID: \"954904d1-95d2-4ffc-a37b-bb607ca2fdb5\") " pod="openshift-marketplace/redhat-operators-s24nr" Oct 12 06:04:32 crc kubenswrapper[4930]: I1012 06:04:32.657343 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/954904d1-95d2-4ffc-a37b-bb607ca2fdb5-utilities\") pod \"redhat-operators-s24nr\" (UID: \"954904d1-95d2-4ffc-a37b-bb607ca2fdb5\") " pod="openshift-marketplace/redhat-operators-s24nr" Oct 12 06:04:32 crc kubenswrapper[4930]: I1012 06:04:32.688407 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7rsw7\" (UniqueName: \"kubernetes.io/projected/954904d1-95d2-4ffc-a37b-bb607ca2fdb5-kube-api-access-7rsw7\") pod \"redhat-operators-s24nr\" (UID: \"954904d1-95d2-4ffc-a37b-bb607ca2fdb5\") " pod="openshift-marketplace/redhat-operators-s24nr" Oct 12 06:04:32 crc kubenswrapper[4930]: I1012 06:04:32.763719 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s24nr" Oct 12 06:04:33 crc kubenswrapper[4930]: I1012 06:04:33.252161 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s24nr"] Oct 12 06:04:33 crc kubenswrapper[4930]: I1012 06:04:33.669803 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:04:33 crc kubenswrapper[4930]: I1012 06:04:33.669865 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:04:33 crc kubenswrapper[4930]: I1012 06:04:33.669908 4930 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 06:04:33 crc kubenswrapper[4930]: I1012 06:04:33.670862 4930 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a3725e0633e74d0677c6bbc6f3b93966f5f5bd1dac0a945c6898b98315d866e3"} pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 06:04:33 crc kubenswrapper[4930]: I1012 06:04:33.670915 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" containerID="cri-o://a3725e0633e74d0677c6bbc6f3b93966f5f5bd1dac0a945c6898b98315d866e3" gracePeriod=600 Oct 12 06:04:34 crc kubenswrapper[4930]: I1012 06:04:34.276405 4930 generic.go:334] "Generic (PLEG): container finished" podID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerID="a3725e0633e74d0677c6bbc6f3b93966f5f5bd1dac0a945c6898b98315d866e3" exitCode=0 Oct 12 06:04:34 crc kubenswrapper[4930]: I1012 06:04:34.276498 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerDied","Data":"a3725e0633e74d0677c6bbc6f3b93966f5f5bd1dac0a945c6898b98315d866e3"} Oct 12 06:04:34 crc kubenswrapper[4930]: I1012 06:04:34.276911 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerStarted","Data":"32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a"} Oct 12 06:04:34 crc kubenswrapper[4930]: I1012 06:04:34.276937 4930 scope.go:117] "RemoveContainer" containerID="c01e0f78e06c76804a67ffb0c83af238f0fea06c1b96b28458581d18668d1cf0" Oct 12 06:04:34 crc kubenswrapper[4930]: I1012 
06:04:34.279961 4930 generic.go:334] "Generic (PLEG): container finished" podID="954904d1-95d2-4ffc-a37b-bb607ca2fdb5" containerID="10c231b9617a54b4781f973f31ec0c0df3e15ab84de685625ad4651e3cc7b939" exitCode=0 Oct 12 06:04:34 crc kubenswrapper[4930]: I1012 06:04:34.280011 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s24nr" event={"ID":"954904d1-95d2-4ffc-a37b-bb607ca2fdb5","Type":"ContainerDied","Data":"10c231b9617a54b4781f973f31ec0c0df3e15ab84de685625ad4651e3cc7b939"} Oct 12 06:04:34 crc kubenswrapper[4930]: I1012 06:04:34.280033 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s24nr" event={"ID":"954904d1-95d2-4ffc-a37b-bb607ca2fdb5","Type":"ContainerStarted","Data":"9e5a5eff47dba192efab219a2ae045f895196b1cccf65c1fccee3b88e5f58c72"} Oct 12 06:04:35 crc kubenswrapper[4930]: I1012 06:04:35.295199 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s24nr" event={"ID":"954904d1-95d2-4ffc-a37b-bb607ca2fdb5","Type":"ContainerStarted","Data":"5f5104dbaafc0bc8953f8d53132442dca7d4be67a152c8fc2ae151b44b4a8f4b"} Oct 12 06:04:40 crc kubenswrapper[4930]: I1012 06:04:40.349687 4930 generic.go:334] "Generic (PLEG): container finished" podID="954904d1-95d2-4ffc-a37b-bb607ca2fdb5" containerID="5f5104dbaafc0bc8953f8d53132442dca7d4be67a152c8fc2ae151b44b4a8f4b" exitCode=0 Oct 12 06:04:40 crc kubenswrapper[4930]: I1012 06:04:40.349880 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s24nr" event={"ID":"954904d1-95d2-4ffc-a37b-bb607ca2fdb5","Type":"ContainerDied","Data":"5f5104dbaafc0bc8953f8d53132442dca7d4be67a152c8fc2ae151b44b4a8f4b"} Oct 12 06:04:41 crc kubenswrapper[4930]: I1012 06:04:41.365850 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s24nr" event={"ID":"954904d1-95d2-4ffc-a37b-bb607ca2fdb5","Type":"ContainerStarted","Data":"28be7507b5f343862082ec6a623667617e622526990f8d6cc815135c04486f2f"} Oct 12 06:04:41 crc kubenswrapper[4930]: I1012 06:04:41.394115 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s24nr" podStartSLOduration=2.815857957 podStartE2EDuration="9.394095798s" podCreationTimestamp="2025-10-12 06:04:32 +0000 UTC" firstStartedPulling="2025-10-12 06:04:34.286179913 +0000 UTC m=+1406.828281688" lastFinishedPulling="2025-10-12 06:04:40.864417764 +0000 UTC m=+1413.406519529" observedRunningTime="2025-10-12 06:04:41.39336505 +0000 UTC m=+1413.935466845" watchObservedRunningTime="2025-10-12 06:04:41.394095798 +0000 UTC m=+1413.936197573" Oct 12 06:04:42 crc kubenswrapper[4930]: I1012 06:04:42.764540 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s24nr" Oct 12 06:04:42 crc kubenswrapper[4930]: I1012 06:04:42.764795 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s24nr" Oct 12 06:04:43 crc kubenswrapper[4930]: I1012 06:04:43.818384 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s24nr" podUID="954904d1-95d2-4ffc-a37b-bb607ca2fdb5" containerName="registry-server" probeResult="failure" output=< Oct 12 06:04:43 crc kubenswrapper[4930]: timeout: failed to connect service ":50051" within 1s Oct 12 06:04:43 crc kubenswrapper[4930]: > Oct 12 06:04:52 crc kubenswrapper[4930]: I1012 06:04:52.847159 4930 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s24nr" Oct 12 06:04:52 crc kubenswrapper[4930]: I1012 06:04:52.935759 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s24nr" Oct 12 06:04:53 crc kubenswrapper[4930]: I1012 06:04:53.102857 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s24nr"] Oct 12 06:04:54 crc kubenswrapper[4930]: I1012 06:04:54.539015 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s24nr" podUID="954904d1-95d2-4ffc-a37b-bb607ca2fdb5" containerName="registry-server" containerID="cri-o://28be7507b5f343862082ec6a623667617e622526990f8d6cc815135c04486f2f" gracePeriod=2 Oct 12 06:04:55 crc kubenswrapper[4930]: I1012 06:04:55.103684 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s24nr" Oct 12 06:04:55 crc kubenswrapper[4930]: I1012 06:04:55.161082 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/954904d1-95d2-4ffc-a37b-bb607ca2fdb5-utilities\") pod \"954904d1-95d2-4ffc-a37b-bb607ca2fdb5\" (UID: \"954904d1-95d2-4ffc-a37b-bb607ca2fdb5\") " Oct 12 06:04:55 crc kubenswrapper[4930]: I1012 06:04:55.161222 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/954904d1-95d2-4ffc-a37b-bb607ca2fdb5-catalog-content\") pod \"954904d1-95d2-4ffc-a37b-bb607ca2fdb5\" (UID: \"954904d1-95d2-4ffc-a37b-bb607ca2fdb5\") " Oct 12 06:04:55 crc kubenswrapper[4930]: I1012 06:04:55.161383 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rsw7\" (UniqueName: \"kubernetes.io/projected/954904d1-95d2-4ffc-a37b-bb607ca2fdb5-kube-api-access-7rsw7\") pod \"954904d1-95d2-4ffc-a37b-bb607ca2fdb5\" (UID: \"954904d1-95d2-4ffc-a37b-bb607ca2fdb5\") " Oct 12 06:04:55 crc kubenswrapper[4930]: I1012 06:04:55.162854 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/954904d1-95d2-4ffc-a37b-bb607ca2fdb5-utilities" (OuterVolumeSpecName: "utilities") pod "954904d1-95d2-4ffc-a37b-bb607ca2fdb5" (UID: "954904d1-95d2-4ffc-a37b-bb607ca2fdb5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:04:55 crc kubenswrapper[4930]: I1012 06:04:55.170079 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/954904d1-95d2-4ffc-a37b-bb607ca2fdb5-kube-api-access-7rsw7" (OuterVolumeSpecName: "kube-api-access-7rsw7") pod "954904d1-95d2-4ffc-a37b-bb607ca2fdb5" (UID: "954904d1-95d2-4ffc-a37b-bb607ca2fdb5"). InnerVolumeSpecName "kube-api-access-7rsw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:04:55 crc kubenswrapper[4930]: I1012 06:04:55.248418 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/954904d1-95d2-4ffc-a37b-bb607ca2fdb5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "954904d1-95d2-4ffc-a37b-bb607ca2fdb5" (UID: "954904d1-95d2-4ffc-a37b-bb607ca2fdb5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:04:55 crc kubenswrapper[4930]: I1012 06:04:55.264309 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/954904d1-95d2-4ffc-a37b-bb607ca2fdb5-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 06:04:55 crc kubenswrapper[4930]: I1012 06:04:55.264345 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/954904d1-95d2-4ffc-a37b-bb607ca2fdb5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 06:04:55 crc kubenswrapper[4930]: I1012 06:04:55.264359 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rsw7\" (UniqueName: \"kubernetes.io/projected/954904d1-95d2-4ffc-a37b-bb607ca2fdb5-kube-api-access-7rsw7\") on node \"crc\" DevicePath \"\"" Oct 12 06:04:55 crc kubenswrapper[4930]: I1012 06:04:55.553359 4930 generic.go:334] "Generic (PLEG): container finished" podID="954904d1-95d2-4ffc-a37b-bb607ca2fdb5" containerID="28be7507b5f343862082ec6a623667617e622526990f8d6cc815135c04486f2f" exitCode=0 Oct 12 06:04:55 crc kubenswrapper[4930]: I1012 06:04:55.553413 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s24nr" event={"ID":"954904d1-95d2-4ffc-a37b-bb607ca2fdb5","Type":"ContainerDied","Data":"28be7507b5f343862082ec6a623667617e622526990f8d6cc815135c04486f2f"} Oct 12 06:04:55 crc kubenswrapper[4930]: I1012 06:04:55.553424 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s24nr" Oct 12 06:04:55 crc kubenswrapper[4930]: I1012 06:04:55.553443 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s24nr" event={"ID":"954904d1-95d2-4ffc-a37b-bb607ca2fdb5","Type":"ContainerDied","Data":"9e5a5eff47dba192efab219a2ae045f895196b1cccf65c1fccee3b88e5f58c72"} Oct 12 06:04:55 crc kubenswrapper[4930]: I1012 06:04:55.553464 4930 scope.go:117] "RemoveContainer" containerID="28be7507b5f343862082ec6a623667617e622526990f8d6cc815135c04486f2f" Oct 12 06:04:55 crc kubenswrapper[4930]: I1012 06:04:55.585675 4930 scope.go:117] "RemoveContainer" containerID="5f5104dbaafc0bc8953f8d53132442dca7d4be67a152c8fc2ae151b44b4a8f4b" Oct 12 06:04:55 crc kubenswrapper[4930]: I1012 06:04:55.596022 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s24nr"] Oct 12 06:04:55 crc kubenswrapper[4930]: I1012 06:04:55.603629 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s24nr"] Oct 12 06:04:55 crc kubenswrapper[4930]: I1012 06:04:55.642094 4930 scope.go:117] "RemoveContainer" containerID="10c231b9617a54b4781f973f31ec0c0df3e15ab84de685625ad4651e3cc7b939" Oct 12 06:04:55 crc kubenswrapper[4930]: I1012 06:04:55.684674 4930 scope.go:117] "RemoveContainer" containerID="28be7507b5f343862082ec6a623667617e622526990f8d6cc815135c04486f2f" Oct 12 06:04:55 crc kubenswrapper[4930]: E1012 06:04:55.685173 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28be7507b5f343862082ec6a623667617e622526990f8d6cc815135c04486f2f\": container with ID starting with 28be7507b5f343862082ec6a623667617e622526990f8d6cc815135c04486f2f not found: ID does not exist" containerID="28be7507b5f343862082ec6a623667617e622526990f8d6cc815135c04486f2f" Oct 12 06:04:55 crc kubenswrapper[4930]: I1012 06:04:55.685213 4930 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28be7507b5f343862082ec6a623667617e622526990f8d6cc815135c04486f2f"} err="failed to get container status \"28be7507b5f343862082ec6a623667617e622526990f8d6cc815135c04486f2f\": rpc error: code = NotFound desc = could not find container \"28be7507b5f343862082ec6a623667617e622526990f8d6cc815135c04486f2f\": container with ID starting with 28be7507b5f343862082ec6a623667617e622526990f8d6cc815135c04486f2f not found: ID does not exist" Oct 12 06:04:55 crc kubenswrapper[4930]: I1012 06:04:55.685239 4930 scope.go:117] "RemoveContainer" containerID="5f5104dbaafc0bc8953f8d53132442dca7d4be67a152c8fc2ae151b44b4a8f4b" Oct 12 06:04:55 crc kubenswrapper[4930]: E1012 06:04:55.685636 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f5104dbaafc0bc8953f8d53132442dca7d4be67a152c8fc2ae151b44b4a8f4b\": container with ID starting with 5f5104dbaafc0bc8953f8d53132442dca7d4be67a152c8fc2ae151b44b4a8f4b not found: ID does not exist" containerID="5f5104dbaafc0bc8953f8d53132442dca7d4be67a152c8fc2ae151b44b4a8f4b" Oct 12 06:04:55 crc kubenswrapper[4930]: I1012 06:04:55.685664 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f5104dbaafc0bc8953f8d53132442dca7d4be67a152c8fc2ae151b44b4a8f4b"} err="failed to get container status \"5f5104dbaafc0bc8953f8d53132442dca7d4be67a152c8fc2ae151b44b4a8f4b\": rpc error: code = NotFound desc = could not find container \"5f5104dbaafc0bc8953f8d53132442dca7d4be67a152c8fc2ae151b44b4a8f4b\": container with ID starting with 5f5104dbaafc0bc8953f8d53132442dca7d4be67a152c8fc2ae151b44b4a8f4b not found: ID does not exist" Oct 12 06:04:55 crc kubenswrapper[4930]: I1012 06:04:55.685681 4930 scope.go:117] "RemoveContainer" containerID="10c231b9617a54b4781f973f31ec0c0df3e15ab84de685625ad4651e3cc7b939" Oct 12 06:04:55 crc kubenswrapper[4930]: E1012 06:04:55.686013 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10c231b9617a54b4781f973f31ec0c0df3e15ab84de685625ad4651e3cc7b939\": container with ID starting with 10c231b9617a54b4781f973f31ec0c0df3e15ab84de685625ad4651e3cc7b939 not found: ID does not exist" containerID="10c231b9617a54b4781f973f31ec0c0df3e15ab84de685625ad4651e3cc7b939" Oct 12 06:04:55 crc kubenswrapper[4930]: I1012 06:04:55.686038 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10c231b9617a54b4781f973f31ec0c0df3e15ab84de685625ad4651e3cc7b939"} err="failed to get container status \"10c231b9617a54b4781f973f31ec0c0df3e15ab84de685625ad4651e3cc7b939\": rpc error: code = NotFound desc = could not find container \"10c231b9617a54b4781f973f31ec0c0df3e15ab84de685625ad4651e3cc7b939\": container with ID starting with 10c231b9617a54b4781f973f31ec0c0df3e15ab84de685625ad4651e3cc7b939 not found: ID does not exist" Oct 12 06:04:56 crc kubenswrapper[4930]: I1012 06:04:56.148623 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="954904d1-95d2-4ffc-a37b-bb607ca2fdb5" path="/var/lib/kubelet/pods/954904d1-95d2-4ffc-a37b-bb607ca2fdb5/volumes" Oct 12 06:05:22 crc kubenswrapper[4930]: I1012 06:05:22.373175 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gq8f8"] Oct 12 06:05:22 crc kubenswrapper[4930]: E1012 06:05:22.374354 4930 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="954904d1-95d2-4ffc-a37b-bb607ca2fdb5" containerName="extract-utilities" Oct 12 06:05:22 crc kubenswrapper[4930]: I1012 06:05:22.374377 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="954904d1-95d2-4ffc-a37b-bb607ca2fdb5" containerName="extract-utilities" Oct 12 06:05:22 crc kubenswrapper[4930]: E1012 06:05:22.374421 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="954904d1-95d2-4ffc-a37b-bb607ca2fdb5" containerName="extract-content" Oct 12 06:05:22 crc kubenswrapper[4930]: I1012 06:05:22.374434 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="954904d1-95d2-4ffc-a37b-bb607ca2fdb5" containerName="extract-content" Oct 12 06:05:22 crc kubenswrapper[4930]: E1012 06:05:22.374487 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="954904d1-95d2-4ffc-a37b-bb607ca2fdb5" containerName="registry-server" Oct 12 06:05:22 crc kubenswrapper[4930]: I1012 06:05:22.374499 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="954904d1-95d2-4ffc-a37b-bb607ca2fdb5" containerName="registry-server" Oct 12 06:05:22 crc kubenswrapper[4930]: I1012 06:05:22.374929 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="954904d1-95d2-4ffc-a37b-bb607ca2fdb5" containerName="registry-server" Oct 12 06:05:22 crc kubenswrapper[4930]: I1012 06:05:22.377504 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gq8f8" Oct 12 06:05:22 crc kubenswrapper[4930]: I1012 06:05:22.388099 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gq8f8"] Oct 12 06:05:22 crc kubenswrapper[4930]: I1012 06:05:22.425137 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e8e597c-1a76-48f0-a536-d1a10bb9bd06-catalog-content\") pod \"certified-operators-gq8f8\" (UID: \"7e8e597c-1a76-48f0-a536-d1a10bb9bd06\") " pod="openshift-marketplace/certified-operators-gq8f8" Oct 12 06:05:22 crc kubenswrapper[4930]: I1012 06:05:22.425375 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csdqz\" (UniqueName: \"kubernetes.io/projected/7e8e597c-1a76-48f0-a536-d1a10bb9bd06-kube-api-access-csdqz\") pod \"certified-operators-gq8f8\" (UID: \"7e8e597c-1a76-48f0-a536-d1a10bb9bd06\") " pod="openshift-marketplace/certified-operators-gq8f8" Oct 12 06:05:22 crc kubenswrapper[4930]: I1012 06:05:22.425467 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e8e597c-1a76-48f0-a536-d1a10bb9bd06-utilities\") pod \"certified-operators-gq8f8\" (UID: \"7e8e597c-1a76-48f0-a536-d1a10bb9bd06\") " pod="openshift-marketplace/certified-operators-gq8f8" Oct 12 06:05:22 crc kubenswrapper[4930]: I1012 06:05:22.527331 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e8e597c-1a76-48f0-a536-d1a10bb9bd06-catalog-content\") pod \"certified-operators-gq8f8\" (UID: \"7e8e597c-1a76-48f0-a536-d1a10bb9bd06\") " pod="openshift-marketplace/certified-operators-gq8f8" Oct 12 06:05:22 crc kubenswrapper[4930]: I1012 06:05:22.527409 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csdqz\" (UniqueName: \"kubernetes.io/projected/7e8e597c-1a76-48f0-a536-d1a10bb9bd06-kube-api-access-csdqz\") 
pod \"certified-operators-gq8f8\" (UID: \"7e8e597c-1a76-48f0-a536-d1a10bb9bd06\") " pod="openshift-marketplace/certified-operators-gq8f8" Oct 12 06:05:22 crc kubenswrapper[4930]: I1012 06:05:22.527431 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e8e597c-1a76-48f0-a536-d1a10bb9bd06-utilities\") pod \"certified-operators-gq8f8\" (UID: \"7e8e597c-1a76-48f0-a536-d1a10bb9bd06\") " pod="openshift-marketplace/certified-operators-gq8f8" Oct 12 06:05:22 crc kubenswrapper[4930]: I1012 06:05:22.527827 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e8e597c-1a76-48f0-a536-d1a10bb9bd06-catalog-content\") pod \"certified-operators-gq8f8\" (UID: \"7e8e597c-1a76-48f0-a536-d1a10bb9bd06\") " pod="openshift-marketplace/certified-operators-gq8f8" Oct 12 06:05:22 crc kubenswrapper[4930]: I1012 06:05:22.528151 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e8e597c-1a76-48f0-a536-d1a10bb9bd06-utilities\") pod \"certified-operators-gq8f8\" (UID: \"7e8e597c-1a76-48f0-a536-d1a10bb9bd06\") " pod="openshift-marketplace/certified-operators-gq8f8" Oct 12 06:05:22 crc kubenswrapper[4930]: I1012 06:05:22.547350 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csdqz\" (UniqueName: \"kubernetes.io/projected/7e8e597c-1a76-48f0-a536-d1a10bb9bd06-kube-api-access-csdqz\") pod \"certified-operators-gq8f8\" (UID: \"7e8e597c-1a76-48f0-a536-d1a10bb9bd06\") " pod="openshift-marketplace/certified-operators-gq8f8" Oct 12 06:05:22 crc kubenswrapper[4930]: I1012 06:05:22.705579 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gq8f8" Oct 12 06:05:23 crc kubenswrapper[4930]: I1012 06:05:23.285512 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gq8f8"] Oct 12 06:05:23 crc kubenswrapper[4930]: I1012 06:05:23.939623 4930 generic.go:334] "Generic (PLEG): container finished" podID="7e8e597c-1a76-48f0-a536-d1a10bb9bd06" containerID="c9b97067c30d6e8657064f81f8685811887782a9749ae89e2e70cd7126d91902" exitCode=0 Oct 12 06:05:23 crc kubenswrapper[4930]: I1012 06:05:23.939669 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gq8f8" event={"ID":"7e8e597c-1a76-48f0-a536-d1a10bb9bd06","Type":"ContainerDied","Data":"c9b97067c30d6e8657064f81f8685811887782a9749ae89e2e70cd7126d91902"} Oct 12 06:05:23 crc kubenswrapper[4930]: I1012 06:05:23.940090 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gq8f8" event={"ID":"7e8e597c-1a76-48f0-a536-d1a10bb9bd06","Type":"ContainerStarted","Data":"b112835a038dff2c8aeb046b78ea67d8c518431e4e90a878e6319bab3ac2a4ec"} Oct 12 06:05:23 crc kubenswrapper[4930]: I1012 06:05:23.943333 4930 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 06:05:24 crc kubenswrapper[4930]: I1012 06:05:24.038359 4930 scope.go:117] "RemoveContainer" containerID="452b6aabed2e3099d896db9eed8281d48b7281c884e3b2385020e0b8bf256332" Oct 12 06:05:24 crc kubenswrapper[4930]: I1012 06:05:24.956529 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gq8f8" event={"ID":"7e8e597c-1a76-48f0-a536-d1a10bb9bd06","Type":"ContainerStarted","Data":"8deec5d092d04f0d4a5e67dfff4b0dbb64caf282e8fd231a7a988ec3720ab47d"} Oct 12 06:05:25 crc kubenswrapper[4930]: I1012 06:05:25.977869 4930 generic.go:334] "Generic (PLEG): container finished" podID="7e8e597c-1a76-48f0-a536-d1a10bb9bd06" containerID="8deec5d092d04f0d4a5e67dfff4b0dbb64caf282e8fd231a7a988ec3720ab47d" exitCode=0 Oct 12 06:05:25 crc kubenswrapper[4930]: I1012 06:05:25.977963 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gq8f8" event={"ID":"7e8e597c-1a76-48f0-a536-d1a10bb9bd06","Type":"ContainerDied","Data":"8deec5d092d04f0d4a5e67dfff4b0dbb64caf282e8fd231a7a988ec3720ab47d"} Oct 12 06:05:26 crc kubenswrapper[4930]: I1012 06:05:26.992651 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gq8f8" event={"ID":"7e8e597c-1a76-48f0-a536-d1a10bb9bd06","Type":"ContainerStarted","Data":"74d2f0045ab7f178ead64c7b96c49c00f78620aeae0f6143e13bed85f7403abd"} Oct 12 06:05:27 crc kubenswrapper[4930]: I1012 06:05:27.021300 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gq8f8" podStartSLOduration=2.550861531 podStartE2EDuration="5.021282815s" podCreationTimestamp="2025-10-12 06:05:22 +0000 UTC" firstStartedPulling="2025-10-12 06:05:23.943072259 +0000 UTC m=+1456.485174014" lastFinishedPulling="2025-10-12 06:05:26.413493493 +0000 UTC m=+1458.955595298" observedRunningTime="2025-10-12 06:05:27.013692767 +0000 UTC m=+1459.555794542" watchObservedRunningTime="2025-10-12 06:05:27.021282815 +0000 UTC m=+1459.563384580" Oct 12 06:05:32 crc kubenswrapper[4930]: I1012 06:05:32.706463 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gq8f8" Oct 12 
06:05:32 crc kubenswrapper[4930]: I1012 06:05:32.707125 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gq8f8" Oct 12 06:05:32 crc kubenswrapper[4930]: I1012 06:05:32.797617 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gq8f8" Oct 12 06:05:33 crc kubenswrapper[4930]: I1012 06:05:33.153000 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gq8f8" Oct 12 06:05:33 crc kubenswrapper[4930]: I1012 06:05:33.229117 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gq8f8"] Oct 12 06:05:35 crc kubenswrapper[4930]: I1012 06:05:35.096724 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gq8f8" podUID="7e8e597c-1a76-48f0-a536-d1a10bb9bd06" containerName="registry-server" containerID="cri-o://74d2f0045ab7f178ead64c7b96c49c00f78620aeae0f6143e13bed85f7403abd" gracePeriod=2 Oct 12 06:05:35 crc kubenswrapper[4930]: I1012 06:05:35.644880 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gq8f8" Oct 12 06:05:35 crc kubenswrapper[4930]: I1012 06:05:35.755136 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e8e597c-1a76-48f0-a536-d1a10bb9bd06-catalog-content\") pod \"7e8e597c-1a76-48f0-a536-d1a10bb9bd06\" (UID: \"7e8e597c-1a76-48f0-a536-d1a10bb9bd06\") " Oct 12 06:05:35 crc kubenswrapper[4930]: I1012 06:05:35.755254 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csdqz\" (UniqueName: \"kubernetes.io/projected/7e8e597c-1a76-48f0-a536-d1a10bb9bd06-kube-api-access-csdqz\") pod \"7e8e597c-1a76-48f0-a536-d1a10bb9bd06\" (UID: \"7e8e597c-1a76-48f0-a536-d1a10bb9bd06\") " Oct 12 06:05:35 crc kubenswrapper[4930]: I1012 06:05:35.755370 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e8e597c-1a76-48f0-a536-d1a10bb9bd06-utilities\") pod \"7e8e597c-1a76-48f0-a536-d1a10bb9bd06\" (UID: \"7e8e597c-1a76-48f0-a536-d1a10bb9bd06\") " Oct 12 06:05:35 crc kubenswrapper[4930]: I1012 06:05:35.756156 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e8e597c-1a76-48f0-a536-d1a10bb9bd06-utilities" (OuterVolumeSpecName: "utilities") pod "7e8e597c-1a76-48f0-a536-d1a10bb9bd06" (UID: "7e8e597c-1a76-48f0-a536-d1a10bb9bd06"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:05:35 crc kubenswrapper[4930]: I1012 06:05:35.761789 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e8e597c-1a76-48f0-a536-d1a10bb9bd06-kube-api-access-csdqz" (OuterVolumeSpecName: "kube-api-access-csdqz") pod "7e8e597c-1a76-48f0-a536-d1a10bb9bd06" (UID: "7e8e597c-1a76-48f0-a536-d1a10bb9bd06"). InnerVolumeSpecName "kube-api-access-csdqz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:05:35 crc kubenswrapper[4930]: I1012 06:05:35.857862 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csdqz\" (UniqueName: \"kubernetes.io/projected/7e8e597c-1a76-48f0-a536-d1a10bb9bd06-kube-api-access-csdqz\") on node \"crc\" DevicePath \"\"" Oct 12 06:05:35 crc kubenswrapper[4930]: I1012 06:05:35.857899 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e8e597c-1a76-48f0-a536-d1a10bb9bd06-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 06:05:36 crc kubenswrapper[4930]: I1012 06:05:36.060807 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e8e597c-1a76-48f0-a536-d1a10bb9bd06-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e8e597c-1a76-48f0-a536-d1a10bb9bd06" (UID: "7e8e597c-1a76-48f0-a536-d1a10bb9bd06"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:05:36 crc kubenswrapper[4930]: I1012 06:05:36.062441 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e8e597c-1a76-48f0-a536-d1a10bb9bd06-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 06:05:36 crc kubenswrapper[4930]: I1012 06:05:36.111088 4930 generic.go:334] "Generic (PLEG): container finished" podID="7e8e597c-1a76-48f0-a536-d1a10bb9bd06" containerID="74d2f0045ab7f178ead64c7b96c49c00f78620aeae0f6143e13bed85f7403abd" exitCode=0 Oct 12 06:05:36 crc kubenswrapper[4930]: I1012 06:05:36.111134 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gq8f8" Oct 12 06:05:36 crc kubenswrapper[4930]: I1012 06:05:36.111152 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gq8f8" event={"ID":"7e8e597c-1a76-48f0-a536-d1a10bb9bd06","Type":"ContainerDied","Data":"74d2f0045ab7f178ead64c7b96c49c00f78620aeae0f6143e13bed85f7403abd"} Oct 12 06:05:36 crc kubenswrapper[4930]: I1012 06:05:36.111203 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gq8f8" event={"ID":"7e8e597c-1a76-48f0-a536-d1a10bb9bd06","Type":"ContainerDied","Data":"b112835a038dff2c8aeb046b78ea67d8c518431e4e90a878e6319bab3ac2a4ec"} Oct 12 06:05:36 crc kubenswrapper[4930]: I1012 06:05:36.111234 4930 scope.go:117] "RemoveContainer" containerID="74d2f0045ab7f178ead64c7b96c49c00f78620aeae0f6143e13bed85f7403abd" Oct 12 06:05:36 crc kubenswrapper[4930]: I1012 06:05:36.151296 4930 scope.go:117] "RemoveContainer" containerID="8deec5d092d04f0d4a5e67dfff4b0dbb64caf282e8fd231a7a988ec3720ab47d" Oct 12 06:05:36 crc kubenswrapper[4930]: I1012 06:05:36.165211 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gq8f8"] Oct 12 06:05:36 crc kubenswrapper[4930]: I1012 06:05:36.176618 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gq8f8"] Oct 12 06:05:36 crc kubenswrapper[4930]: I1012 06:05:36.209171 4930 scope.go:117] "RemoveContainer" containerID="c9b97067c30d6e8657064f81f8685811887782a9749ae89e2e70cd7126d91902" Oct 12 06:05:36 crc kubenswrapper[4930]: I1012 06:05:36.278181 4930 scope.go:117] "RemoveContainer" containerID="74d2f0045ab7f178ead64c7b96c49c00f78620aeae0f6143e13bed85f7403abd" Oct 12 06:05:36 crc kubenswrapper[4930]: E1012 06:05:36.278802 4930 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74d2f0045ab7f178ead64c7b96c49c00f78620aeae0f6143e13bed85f7403abd\": container with ID starting with 74d2f0045ab7f178ead64c7b96c49c00f78620aeae0f6143e13bed85f7403abd not found: ID does not exist" containerID="74d2f0045ab7f178ead64c7b96c49c00f78620aeae0f6143e13bed85f7403abd" Oct 12 06:05:36 crc kubenswrapper[4930]: I1012 06:05:36.278855 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74d2f0045ab7f178ead64c7b96c49c00f78620aeae0f6143e13bed85f7403abd"} err="failed to get container status \"74d2f0045ab7f178ead64c7b96c49c00f78620aeae0f6143e13bed85f7403abd\": rpc error: code = NotFound desc = could not find container \"74d2f0045ab7f178ead64c7b96c49c00f78620aeae0f6143e13bed85f7403abd\": container with ID starting with 74d2f0045ab7f178ead64c7b96c49c00f78620aeae0f6143e13bed85f7403abd not found: ID does not exist" Oct 12 06:05:36 crc kubenswrapper[4930]: I1012 06:05:36.278891 4930 scope.go:117] "RemoveContainer" containerID="8deec5d092d04f0d4a5e67dfff4b0dbb64caf282e8fd231a7a988ec3720ab47d" Oct 12 06:05:36 crc kubenswrapper[4930]: E1012 06:05:36.279510 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8deec5d092d04f0d4a5e67dfff4b0dbb64caf282e8fd231a7a988ec3720ab47d\": container with ID starting with 8deec5d092d04f0d4a5e67dfff4b0dbb64caf282e8fd231a7a988ec3720ab47d not found: ID does not exist" containerID="8deec5d092d04f0d4a5e67dfff4b0dbb64caf282e8fd231a7a988ec3720ab47d" Oct 12 06:05:36 crc kubenswrapper[4930]: I1012 06:05:36.279539 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8deec5d092d04f0d4a5e67dfff4b0dbb64caf282e8fd231a7a988ec3720ab47d"} err="failed to get container status \"8deec5d092d04f0d4a5e67dfff4b0dbb64caf282e8fd231a7a988ec3720ab47d\": rpc error: code = NotFound desc = could not find container \"8deec5d092d04f0d4a5e67dfff4b0dbb64caf282e8fd231a7a988ec3720ab47d\": container with ID starting with 8deec5d092d04f0d4a5e67dfff4b0dbb64caf282e8fd231a7a988ec3720ab47d not found: ID does not exist" Oct 12 06:05:36 crc kubenswrapper[4930]: I1012 06:05:36.279562 4930 scope.go:117] "RemoveContainer" containerID="c9b97067c30d6e8657064f81f8685811887782a9749ae89e2e70cd7126d91902" Oct 12 06:05:36 crc kubenswrapper[4930]: E1012 06:05:36.279956 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9b97067c30d6e8657064f81f8685811887782a9749ae89e2e70cd7126d91902\": container with ID starting with c9b97067c30d6e8657064f81f8685811887782a9749ae89e2e70cd7126d91902 not found: ID does not exist" containerID="c9b97067c30d6e8657064f81f8685811887782a9749ae89e2e70cd7126d91902" Oct 12 06:05:36 crc kubenswrapper[4930]: I1012 06:05:36.279978 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9b97067c30d6e8657064f81f8685811887782a9749ae89e2e70cd7126d91902"} err="failed to get container status \"c9b97067c30d6e8657064f81f8685811887782a9749ae89e2e70cd7126d91902\": rpc error: code = NotFound desc = could not find container \"c9b97067c30d6e8657064f81f8685811887782a9749ae89e2e70cd7126d91902\": container with ID starting with c9b97067c30d6e8657064f81f8685811887782a9749ae89e2e70cd7126d91902 not found: ID does not exist" Oct 12 06:05:38 crc kubenswrapper[4930]: I1012 06:05:38.152677 4930 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="7e8e597c-1a76-48f0-a536-d1a10bb9bd06" path="/var/lib/kubelet/pods/7e8e597c-1a76-48f0-a536-d1a10bb9bd06/volumes" Oct 12 06:06:24 crc kubenswrapper[4930]: I1012 06:06:24.193543 4930 scope.go:117] "RemoveContainer" containerID="2330a8f81f829cfc92c56cdbb3d5729dd0bf48d8f9f7220b0f3440e7ce7eee40" Oct 12 06:06:24 crc kubenswrapper[4930]: I1012 06:06:24.223663 4930 scope.go:117] "RemoveContainer" containerID="d63fc718d21b66de4049176761e68d69ef1fbc8b48a25789af68d6eb85a65f4d" Oct 12 06:06:24 crc kubenswrapper[4930]: I1012 06:06:24.249068 4930 scope.go:117] "RemoveContainer" containerID="6460600dc2240bf70a42d8ce5d6ad8f89e6a182400f65215964547911db1e3c1" Oct 12 06:06:24 crc kubenswrapper[4930]: I1012 06:06:24.282119 4930 scope.go:117] "RemoveContainer" containerID="6077c7bd89312c12d12cf690bc136c749a0bc0dba67844fa5056e36f36f0da44" Oct 12 06:06:24 crc kubenswrapper[4930]: I1012 06:06:24.311145 4930 scope.go:117] "RemoveContainer" containerID="4597264a5a1f2045f2dfd2c65470cdc4a45efec1958c0f9c123bc342b19d8b98" Oct 12 06:06:24 crc kubenswrapper[4930]: I1012 06:06:24.349099 4930 scope.go:117] "RemoveContainer" containerID="a607f1cebf14a923275c8ceab7752588186b342d8877d40c3c9276d90f221c5e" Oct 12 06:06:33 crc kubenswrapper[4930]: I1012 06:06:33.669714 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:06:33 crc kubenswrapper[4930]: I1012 06:06:33.670314 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:07:03 crc kubenswrapper[4930]: I1012 06:07:03.669725 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:07:03 crc kubenswrapper[4930]: I1012 06:07:03.670547 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:07:25 crc kubenswrapper[4930]: I1012 06:07:25.562397 4930 generic.go:334] "Generic (PLEG): container finished" podID="fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc" containerID="f0514e363e9732d3a7c12f102c16c8168b5d9e12a5b86270492a119ae1778f85" exitCode=0 Oct 12 06:07:25 crc kubenswrapper[4930]: I1012 06:07:25.562495 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn" event={"ID":"fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc","Type":"ContainerDied","Data":"f0514e363e9732d3a7c12f102c16c8168b5d9e12a5b86270492a119ae1778f85"} Oct 12 06:07:26 crc kubenswrapper[4930]: I1012 06:07:26.052077 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-s7sp9"] Oct 12 06:07:26 crc kubenswrapper[4930]: I1012 06:07:26.061426 4930 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-s7sp9"] Oct 12 06:07:26 crc kubenswrapper[4930]: I1012 06:07:26.154446 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8cd3366-214d-44fe-bef6-99a5522924c0" path="/var/lib/kubelet/pods/a8cd3366-214d-44fe-bef6-99a5522924c0/volumes" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.032780 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-n2tb8"] Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.041893 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-n2tb8"] Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.057988 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.163708 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc-bootstrap-combined-ca-bundle\") pod \"fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc\" (UID: \"fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc\") " Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.163831 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkr9p\" (UniqueName: \"kubernetes.io/projected/fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc-kube-api-access-bkr9p\") pod \"fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc\" (UID: \"fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc\") " Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.163929 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc-inventory\") pod \"fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc\" (UID: \"fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc\") " Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.164016 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc-ssh-key\") pod \"fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc\" (UID: \"fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc\") " Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.169526 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc-kube-api-access-bkr9p" (OuterVolumeSpecName: "kube-api-access-bkr9p") pod "fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc" (UID: "fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc"). InnerVolumeSpecName "kube-api-access-bkr9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.170262 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc" (UID: "fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.202186 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc-inventory" (OuterVolumeSpecName: "inventory") pod "fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc" (UID: "fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.209604 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc" (UID: "fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.266835 4930 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.266878 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkr9p\" (UniqueName: \"kubernetes.io/projected/fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc-kube-api-access-bkr9p\") on node \"crc\" DevicePath \"\"" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.266892 4930 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.266905 4930 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.594357 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn" event={"ID":"fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc","Type":"ContainerDied","Data":"61f8db69d611427d25c4024902b3fb899dca3445169d52c032b7928ee4f22b85"} Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.594416 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61f8db69d611427d25c4024902b3fb899dca3445169d52c032b7928ee4f22b85" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.594827 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.725133 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t"] Oct 12 06:07:27 crc kubenswrapper[4930]: E1012 06:07:27.725615 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e8e597c-1a76-48f0-a536-d1a10bb9bd06" containerName="extract-content" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.725632 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e8e597c-1a76-48f0-a536-d1a10bb9bd06" containerName="extract-content" Oct 12 06:07:27 crc kubenswrapper[4930]: E1012 06:07:27.725650 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.725659 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 12 06:07:27 crc kubenswrapper[4930]: E1012 06:07:27.725676 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e8e597c-1a76-48f0-a536-d1a10bb9bd06" containerName="extract-utilities" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.725683 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e8e597c-1a76-48f0-a536-d1a10bb9bd06" containerName="extract-utilities" Oct 12 06:07:27 crc kubenswrapper[4930]: E1012 06:07:27.725723 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e8e597c-1a76-48f0-a536-d1a10bb9bd06" containerName="registry-server" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.725729 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e8e597c-1a76-48f0-a536-d1a10bb9bd06" containerName="registry-server" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.725923 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e8e597c-1a76-48f0-a536-d1a10bb9bd06" containerName="registry-server" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.725962 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.726839 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.734286 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.734281 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.734551 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vhlxg" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.734687 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.743514 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t"] Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.775084 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9727cf23-8270-491f-ba18-218bd73cd0c8-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t\" (UID: \"9727cf23-8270-491f-ba18-218bd73cd0c8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.775405 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdbmm\" (UniqueName: \"kubernetes.io/projected/9727cf23-8270-491f-ba18-218bd73cd0c8-kube-api-access-xdbmm\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t\" (UID: \"9727cf23-8270-491f-ba18-218bd73cd0c8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.775680 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9727cf23-8270-491f-ba18-218bd73cd0c8-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t\" (UID: \"9727cf23-8270-491f-ba18-218bd73cd0c8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.877365 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9727cf23-8270-491f-ba18-218bd73cd0c8-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t\" (UID: \"9727cf23-8270-491f-ba18-218bd73cd0c8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.877590 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9727cf23-8270-491f-ba18-218bd73cd0c8-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t\" (UID: \"9727cf23-8270-491f-ba18-218bd73cd0c8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.877664 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdbmm\" (UniqueName: \"kubernetes.io/projected/9727cf23-8270-491f-ba18-218bd73cd0c8-kube-api-access-xdbmm\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t\" (UID: \"9727cf23-8270-491f-ba18-218bd73cd0c8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.882978 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9727cf23-8270-491f-ba18-218bd73cd0c8-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t\" (UID: \"9727cf23-8270-491f-ba18-218bd73cd0c8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.883717 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9727cf23-8270-491f-ba18-218bd73cd0c8-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t\" (UID: \"9727cf23-8270-491f-ba18-218bd73cd0c8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t" Oct 12 06:07:27 crc kubenswrapper[4930]: I1012 06:07:27.905194 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdbmm\" (UniqueName: \"kubernetes.io/projected/9727cf23-8270-491f-ba18-218bd73cd0c8-kube-api-access-xdbmm\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t\" (UID: \"9727cf23-8270-491f-ba18-218bd73cd0c8\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t" Oct 12 06:07:28 crc kubenswrapper[4930]: I1012 06:07:28.065673 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t" Oct 12 06:07:28 crc kubenswrapper[4930]: I1012 06:07:28.154909 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="065099a3-1832-4ac1-9654-080eee86c974" path="/var/lib/kubelet/pods/065099a3-1832-4ac1-9654-080eee86c974/volumes" Oct 12 06:07:28 crc kubenswrapper[4930]: I1012 06:07:28.562361 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t"] Oct 12 06:07:28 crc kubenswrapper[4930]: I1012 06:07:28.609236 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t" event={"ID":"9727cf23-8270-491f-ba18-218bd73cd0c8","Type":"ContainerStarted","Data":"287a167bf6ed030a3cd2bff614258e5d9e9c005ab9da9255c29ca9acf0be0a4b"} Oct 12 06:07:29 crc kubenswrapper[4930]: I1012 06:07:29.047969 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-gzvqc"] Oct 12 06:07:29 crc kubenswrapper[4930]: I1012 06:07:29.062331 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-gzvqc"] Oct 12 06:07:29 crc kubenswrapper[4930]: I1012 06:07:29.625318 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t" event={"ID":"9727cf23-8270-491f-ba18-218bd73cd0c8","Type":"ContainerStarted","Data":"0732c8792432be68aaf9c6edf9da182f6fdc71d2a0de529ff35f580e08a65bf8"} Oct 12 06:07:29 crc kubenswrapper[4930]: I1012 06:07:29.644787 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t" podStartSLOduration=1.881587867 podStartE2EDuration="2.644770655s" podCreationTimestamp="2025-10-12 06:07:27 +0000 UTC" firstStartedPulling="2025-10-12 06:07:28.579780428 +0000 UTC m=+1581.121882213" 
lastFinishedPulling="2025-10-12 06:07:29.342963236 +0000 UTC m=+1581.885065001" observedRunningTime="2025-10-12 06:07:29.640657973 +0000 UTC m=+1582.182759738" watchObservedRunningTime="2025-10-12 06:07:29.644770655 +0000 UTC m=+1582.186872420" Oct 12 06:07:30 crc kubenswrapper[4930]: I1012 06:07:30.156692 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51978201-ae64-4c05-9e62-17e1eb5111d1" path="/var/lib/kubelet/pods/51978201-ae64-4c05-9e62-17e1eb5111d1/volumes" Oct 12 06:07:33 crc kubenswrapper[4930]: I1012 06:07:33.669086 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:07:33 crc kubenswrapper[4930]: I1012 06:07:33.669588 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:07:33 crc kubenswrapper[4930]: I1012 06:07:33.669629 4930 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 06:07:33 crc kubenswrapper[4930]: I1012 06:07:33.670335 4930 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a"} pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 06:07:33 crc kubenswrapper[4930]: I1012 06:07:33.670385 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" containerID="cri-o://32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a" gracePeriod=600 Oct 12 06:07:33 crc kubenswrapper[4930]: E1012 06:07:33.792273 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:07:34 crc kubenswrapper[4930]: I1012 06:07:34.685052 4930 generic.go:334] "Generic (PLEG): container finished" podID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a" exitCode=0 Oct 12 06:07:34 crc kubenswrapper[4930]: I1012 06:07:34.685163 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerDied","Data":"32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a"} Oct 12 06:07:34 crc kubenswrapper[4930]: I1012 06:07:34.685536 4930 scope.go:117] "RemoveContainer" containerID="a3725e0633e74d0677c6bbc6f3b93966f5f5bd1dac0a945c6898b98315d866e3" Oct 12 06:07:34 
crc kubenswrapper[4930]: I1012 06:07:34.686491 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a" Oct 12 06:07:34 crc kubenswrapper[4930]: E1012 06:07:34.687064 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:07:41 crc kubenswrapper[4930]: I1012 06:07:41.050554 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2133-account-create-h59cp"] Oct 12 06:07:41 crc kubenswrapper[4930]: I1012 06:07:41.071099 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-2133-account-create-h59cp"] Oct 12 06:07:41 crc kubenswrapper[4930]: I1012 06:07:41.083717 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-8e21-account-create-645jw"] Oct 12 06:07:41 crc kubenswrapper[4930]: I1012 06:07:41.100190 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-b403-account-create-lzlnf"] Oct 12 06:07:41 crc kubenswrapper[4930]: I1012 06:07:41.109103 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-b403-account-create-lzlnf"] Oct 12 06:07:41 crc kubenswrapper[4930]: I1012 06:07:41.116463 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-8e21-account-create-645jw"] Oct 12 06:07:42 crc kubenswrapper[4930]: I1012 06:07:42.145603 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31682f1f-3cd9-4ad2-a3ed-46a6c686c0f3" path="/var/lib/kubelet/pods/31682f1f-3cd9-4ad2-a3ed-46a6c686c0f3/volumes" Oct 12 06:07:42 crc kubenswrapper[4930]: I1012 06:07:42.146410 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd2f9a6f-cb9e-4db3-8165-e2cc18f1daff" path="/var/lib/kubelet/pods/bd2f9a6f-cb9e-4db3-8165-e2cc18f1daff/volumes" Oct 12 06:07:42 crc kubenswrapper[4930]: I1012 06:07:42.146911 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f603981a-3581-49ba-8726-f21e981d4988" path="/var/lib/kubelet/pods/f603981a-3581-49ba-8726-f21e981d4988/volumes" Oct 12 06:07:46 crc kubenswrapper[4930]: I1012 06:07:46.136664 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a" Oct 12 06:07:46 crc kubenswrapper[4930]: E1012 06:07:46.137903 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:07:57 crc kubenswrapper[4930]: I1012 06:07:57.135856 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a" Oct 12 06:07:57 crc kubenswrapper[4930]: E1012 06:07:57.136995 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:08:03 crc kubenswrapper[4930]: I1012 06:08:03.074943 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-sxb5z"] Oct 12 06:08:03 crc kubenswrapper[4930]: I1012 06:08:03.094553 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-sxb5z"] Oct 12 06:08:04 crc kubenswrapper[4930]: I1012 06:08:04.159186 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53492810-e3e8-42c4-b6ec-df0913d9c969" path="/var/lib/kubelet/pods/53492810-e3e8-42c4-b6ec-df0913d9c969/volumes" Oct 12 06:08:05 crc kubenswrapper[4930]: I1012 06:08:05.033834 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-x67ml"] Oct 12 06:08:05 crc kubenswrapper[4930]: I1012 06:08:05.044552 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-j8k6q"] Oct 12 06:08:05 crc kubenswrapper[4930]: I1012 06:08:05.052100 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-xqv8n"] Oct 12 06:08:05 crc kubenswrapper[4930]: I1012 06:08:05.066919 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-x67ml"] Oct 12 06:08:05 crc kubenswrapper[4930]: I1012 06:08:05.089195 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-j8k6q"] Oct 12 06:08:05 crc kubenswrapper[4930]: I1012 06:08:05.100617 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-xqv8n"] Oct 12 06:08:06 crc kubenswrapper[4930]: I1012 06:08:06.156161 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53923311-a7cd-46ee-b287-26dd3cc96916" path="/var/lib/kubelet/pods/53923311-a7cd-46ee-b287-26dd3cc96916/volumes" Oct 12 06:08:06 crc kubenswrapper[4930]: I1012 06:08:06.158362 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a81240c3-2ae0-494f-af8e-8e9d60aca47b" path="/var/lib/kubelet/pods/a81240c3-2ae0-494f-af8e-8e9d60aca47b/volumes" Oct 12 06:08:06 crc kubenswrapper[4930]: I1012 06:08:06.159972 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acfc6300-e2c5-40ba-885c-f3f62a3f6e29" path="/var/lib/kubelet/pods/acfc6300-e2c5-40ba-885c-f3f62a3f6e29/volumes" Oct 12 06:08:08 crc kubenswrapper[4930]: I1012 06:08:08.150580 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a" Oct 12 06:08:08 crc kubenswrapper[4930]: E1012 06:08:08.152938 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:08:17 crc kubenswrapper[4930]: I1012 06:08:17.046575 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-5ea2-account-create-5srb4"] Oct 12 06:08:17 crc kubenswrapper[4930]: I1012 06:08:17.060023 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-a944-account-create-nrsz7"] Oct 12 06:08:17 crc kubenswrapper[4930]: I1012 
06:08:17.072443 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-5ea2-account-create-5srb4"]
Oct 12 06:08:17 crc kubenswrapper[4930]: I1012 06:08:17.082786 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8362-account-create-r4fxc"]
Oct 12 06:08:17 crc kubenswrapper[4930]: I1012 06:08:17.090913 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-a944-account-create-nrsz7"]
Oct 12 06:08:17 crc kubenswrapper[4930]: I1012 06:08:17.098761 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8362-account-create-r4fxc"]
Oct 12 06:08:18 crc kubenswrapper[4930]: I1012 06:08:18.047995 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b9d1-account-create-np4q6"]
Oct 12 06:08:18 crc kubenswrapper[4930]: I1012 06:08:18.064920 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b9d1-account-create-np4q6"]
Oct 12 06:08:18 crc kubenswrapper[4930]: I1012 06:08:18.158512 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38bf3863-9d2b-4158-afb6-e1feb4963f15" path="/var/lib/kubelet/pods/38bf3863-9d2b-4158-afb6-e1feb4963f15/volumes"
Oct 12 06:08:18 crc kubenswrapper[4930]: I1012 06:08:18.160074 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="678827f4-28a3-4cb5-9886-96e11e89c172" path="/var/lib/kubelet/pods/678827f4-28a3-4cb5-9886-96e11e89c172/volumes"
Oct 12 06:08:18 crc kubenswrapper[4930]: I1012 06:08:18.161127 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5a8fa49-dfc5-4c99-b1b1-fc2795013a1c" path="/var/lib/kubelet/pods/f5a8fa49-dfc5-4c99-b1b1-fc2795013a1c/volumes"
Oct 12 06:08:18 crc kubenswrapper[4930]: I1012 06:08:18.162301 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9377945-1809-42da-b3c2-f38d3d91a1a6" path="/var/lib/kubelet/pods/f9377945-1809-42da-b3c2-f38d3d91a1a6/volumes"
Oct 12 06:08:19 crc kubenswrapper[4930]: I1012 06:08:19.135041 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a"
Oct 12 06:08:19 crc kubenswrapper[4930]: E1012 06:08:19.135427 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:08:20 crc kubenswrapper[4930]: I1012 06:08:20.043600 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-wqxq9"]
Oct 12 06:08:20 crc kubenswrapper[4930]: I1012 06:08:20.055643 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-wqxq9"]
Oct 12 06:08:20 crc kubenswrapper[4930]: I1012 06:08:20.155406 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96917cd8-6c91-463b-b8de-9d854e0ee581" path="/var/lib/kubelet/pods/96917cd8-6c91-463b-b8de-9d854e0ee581/volumes"
Oct 12 06:08:21 crc kubenswrapper[4930]: I1012 06:08:21.039154 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-wndmm"]
Oct 12 06:08:21 crc kubenswrapper[4930]: I1012 06:08:21.050355 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-wndmm"]
Oct 12 06:08:22 crc kubenswrapper[4930]: I1012 06:08:22.154619 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9bb388c-0196-4f0c-9567-2ef5b30889dd" path="/var/lib/kubelet/pods/b9bb388c-0196-4f0c-9567-2ef5b30889dd/volumes"
Oct 12 06:08:24 crc kubenswrapper[4930]: I1012 06:08:24.466135 4930 scope.go:117] "RemoveContainer" containerID="d6460b3debccdb12a4bb1f2ef621d96eb151d8b96d3a917641eedb0887a754da"
Oct 12 06:08:24 crc kubenswrapper[4930]: I1012 06:08:24.514669 4930 scope.go:117] "RemoveContainer" containerID="08ce7ca7283a64b89f5cd5b7d08d7e47145a7c445f078df713a61e6404c9cb09"
Oct 12 06:08:24 crc kubenswrapper[4930]: I1012 06:08:24.584047 4930 scope.go:117] "RemoveContainer" containerID="45d0f645d2ed4c0fe05ec4a344139c07459e51cc4421e6696bd2bfb516a7ab36"
Oct 12 06:08:24 crc kubenswrapper[4930]: I1012 06:08:24.620028 4930 scope.go:117] "RemoveContainer" containerID="f6a746baccc0baea788f924a3f35120bccfd6e07f7331bc73e35b86127f35086"
Oct 12 06:08:24 crc kubenswrapper[4930]: I1012 06:08:24.677998 4930 scope.go:117] "RemoveContainer" containerID="4bdf5fa1f489f1ddcb7a4da154479d9a13a53608a029a1324e4ba6df881293bd"
Oct 12 06:08:24 crc kubenswrapper[4930]: I1012 06:08:24.713468 4930 scope.go:117] "RemoveContainer" containerID="c9309d12f84eac3d78b59c6275628f9c0d95432ed1ede6bafcae3d969bfe6dd9"
Oct 12 06:08:24 crc kubenswrapper[4930]: I1012 06:08:24.762087 4930 scope.go:117] "RemoveContainer" containerID="413e0fc328ed4b1593c4e95b37dc172bf363e4aed76afddc02cb3ba4e8f2a92e"
Oct 12 06:08:24 crc kubenswrapper[4930]: I1012 06:08:24.781634 4930 scope.go:117] "RemoveContainer" containerID="4f2c0af4131f43daff6c7e90cceb9c7eae521130630264d618da1cb0f8d63b1d"
Oct 12 06:08:24 crc kubenswrapper[4930]: I1012 06:08:24.814816 4930 scope.go:117] "RemoveContainer" containerID="8e9b3463ba8a90c44ce9a067f41e90b9110e616c4f1008d907e7b4af331a38b1"
Oct 12 06:08:24 crc kubenswrapper[4930]: I1012 06:08:24.844057 4930 scope.go:117] "RemoveContainer" containerID="650dcdcd858d5b7beb90c3731640c5ccf71b4ef197dde82c17b6cc833219a1c8"
Oct 12 06:08:24 crc kubenswrapper[4930]: I1012 06:08:24.878206 4930 scope.go:117] "RemoveContainer" containerID="8dcda72439b2b02f5cabb35d03b2d4beff4f959d2cb7a59112130545921e1872"
Oct 12 06:08:24 crc kubenswrapper[4930]: I1012 06:08:24.907292 4930 scope.go:117] "RemoveContainer" containerID="8fa8efc30466226e8621280addac73a3c2576c77641c3e9c6f2db92911dd05a6"
Oct 12 06:08:24 crc kubenswrapper[4930]: I1012 06:08:24.944390 4930 scope.go:117] "RemoveContainer" containerID="2db4e0d07355e79407e76c1c971d84326154f48ef1080a39a3c069a8606f40b5"
Oct 12 06:08:24 crc kubenswrapper[4930]: I1012 06:08:24.979039 4930 scope.go:117] "RemoveContainer" containerID="600c90f643ed8c7dc8a88e357c9f9c68e185e9a62a849ede53cd466b7b86d776"
Oct 12 06:08:25 crc kubenswrapper[4930]: I1012 06:08:25.016291 4930 scope.go:117] "RemoveContainer" containerID="762b85a5b5828a13bbc556e082c56c99ad851f798985561a4615c24df133c99a"
Oct 12 06:08:25 crc kubenswrapper[4930]: I1012 06:08:25.047937 4930 scope.go:117] "RemoveContainer" containerID="3052de8a3e627f0cfd9983c59ff403b54f3820761a61c595ef8caee6a1314cef"
Oct 12 06:08:25 crc kubenswrapper[4930]: I1012 06:08:25.087514 4930 scope.go:117] "RemoveContainer" containerID="2ff4ac790498d9d4bc5cbc349cebba80d54288bb0a0ca16deb94f3e6a28c0a8b"
Oct 12 06:08:33 crc kubenswrapper[4930]: I1012 06:08:33.136480 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a"
Oct 12 06:08:33 crc kubenswrapper[4930]: E1012 06:08:33.137585 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:08:46 crc kubenswrapper[4930]: I1012 06:08:46.135560 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a"
Oct 12 06:08:46 crc kubenswrapper[4930]: E1012 06:08:46.136874 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:09:00 crc kubenswrapper[4930]: I1012 06:09:00.136263 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a"
Oct 12 06:09:00 crc kubenswrapper[4930]: E1012 06:09:00.137602 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:09:02 crc kubenswrapper[4930]: I1012 06:09:02.071062 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-96nsw"]
Oct 12 06:09:02 crc kubenswrapper[4930]: I1012 06:09:02.087279 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-96nsw"]
Oct 12 06:09:02 crc kubenswrapper[4930]: I1012 06:09:02.158525 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="995795c9-befc-4ce9-8a38-8791ba628061" path="/var/lib/kubelet/pods/995795c9-befc-4ce9-8a38-8791ba628061/volumes"
Oct 12 06:09:13 crc kubenswrapper[4930]: I1012 06:09:13.135820 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a"
Oct 12 06:09:13 crc kubenswrapper[4930]: E1012 06:09:13.136854 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:09:18 crc kubenswrapper[4930]: I1012 06:09:18.051818 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-sdmls"]
Oct 12 06:09:18 crc kubenswrapper[4930]: I1012 06:09:18.074827 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-sdmls"]
Oct 12 06:09:18 crc kubenswrapper[4930]: I1012 06:09:18.158129 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea4a96dd-8928-4248-b294-1b0b6413abef" path="/var/lib/kubelet/pods/ea4a96dd-8928-4248-b294-1b0b6413abef/volumes"
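The machine-config-daemon entries above settle into a fixed rhythm: each sync attempt logs "RemoveContainer" for the dead container, then an E1012 "Error syncing pod, skipping" because the pod sits in CrashLoopBackOff with its back-off capped at 5m0s. The lines repeating every ten-odd seconds are re-queued sync attempts being refused, not restarts. A minimal sketch of the back-off schedule, assuming the upstream kubelet defaults (10s initial delay, doubling per crash, capped at the 5m0s named in the message); this illustrates the documented policy and is not code or configuration taken from this node:

import itertools

# CrashLoopBackOff schedule sketch: 10s base doubling up to a 5m cap
# (assumed upstream defaults; the cap matches "back-off 5m0s" above).
def crashloop_backoff(base=10.0, factor=2.0, cap=300.0):
    delay = base
    while True:
        yield min(delay, cap)
        delay *= factor

print(list(itertools.islice(crashloop_backoff(), 8)))
# [10.0, 20.0, 40.0, 80.0, 160.0, 300.0, 300.0, 300.0]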
Oct 12 06:09:22 crc kubenswrapper[4930]: I1012 06:09:22.041425 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-h2tnf"]
Oct 12 06:09:22 crc kubenswrapper[4930]: I1012 06:09:22.056601 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-h2tnf"]
Oct 12 06:09:22 crc kubenswrapper[4930]: I1012 06:09:22.162832 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b228afd2-c418-49a3-97d4-35a298e324a6" path="/var/lib/kubelet/pods/b228afd2-c418-49a3-97d4-35a298e324a6/volumes"
Oct 12 06:09:25 crc kubenswrapper[4930]: I1012 06:09:25.035283 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-dmtxt"]
Oct 12 06:09:25 crc kubenswrapper[4930]: I1012 06:09:25.048440 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-dmtxt"]
Oct 12 06:09:25 crc kubenswrapper[4930]: I1012 06:09:25.551639 4930 scope.go:117] "RemoveContainer" containerID="3f438248d7c6cec3c44e79f882a5dd6fa35ec572adbbd2750232f6baa074b758"
Oct 12 06:09:25 crc kubenswrapper[4930]: I1012 06:09:25.612466 4930 scope.go:117] "RemoveContainer" containerID="30ebdfc31a962560651811b3199c242f5f32817e99142ec937334cfc7c99609f"
Oct 12 06:09:25 crc kubenswrapper[4930]: I1012 06:09:25.695070 4930 scope.go:117] "RemoveContainer" containerID="1479b3928a1c33b74735ec97bccec250012d2096e39d5f6687fc2fb1e8bcd0d1"
Oct 12 06:09:26 crc kubenswrapper[4930]: I1012 06:09:26.136072 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a"
Oct 12 06:09:26 crc kubenswrapper[4930]: E1012 06:09:26.136368 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:09:26 crc kubenswrapper[4930]: I1012 06:09:26.148153 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce22649f-fe81-4c02-a26a-15e45e306b82" path="/var/lib/kubelet/pods/ce22649f-fe81-4c02-a26a-15e45e306b82/volumes"
Oct 12 06:09:31 crc kubenswrapper[4930]: I1012 06:09:31.066598 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-bg9r7"]
Oct 12 06:09:31 crc kubenswrapper[4930]: I1012 06:09:31.075938 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-bg9r7"]
Oct 12 06:09:32 crc kubenswrapper[4930]: I1012 06:09:32.043536 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-jj2m5"]
Oct 12 06:09:32 crc kubenswrapper[4930]: I1012 06:09:32.059793 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-jj2m5"]
Oct 12 06:09:32 crc kubenswrapper[4930]: I1012 06:09:32.151863 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="448db83a-c0af-4680-890f-24b8d8da1088" path="/var/lib/kubelet/pods/448db83a-c0af-4680-890f-24b8d8da1088/volumes"
Oct 12 06:09:32 crc kubenswrapper[4930]: I1012 06:09:32.152825 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99ff807c-d810-4ff9-9ed3-3b2d37d3fbed" path="/var/lib/kubelet/pods/99ff807c-d810-4ff9-9ed3-3b2d37d3fbed/volumes"
Oct 12 06:09:35 crc kubenswrapper[4930]: I1012 06:09:35.206348 4930 generic.go:334] "Generic (PLEG): container finished" podID="9727cf23-8270-491f-ba18-218bd73cd0c8" containerID="0732c8792432be68aaf9c6edf9da182f6fdc71d2a0de529ff35f580e08a65bf8" exitCode=0
Oct 12 06:09:35 crc kubenswrapper[4930]: I1012 06:09:35.206495 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t" event={"ID":"9727cf23-8270-491f-ba18-218bd73cd0c8","Type":"ContainerDied","Data":"0732c8792432be68aaf9c6edf9da182f6fdc71d2a0de529ff35f580e08a65bf8"}
Oct 12 06:09:36 crc kubenswrapper[4930]: I1012 06:09:36.799432 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t"
Oct 12 06:09:36 crc kubenswrapper[4930]: I1012 06:09:36.937671 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9727cf23-8270-491f-ba18-218bd73cd0c8-ssh-key\") pod \"9727cf23-8270-491f-ba18-218bd73cd0c8\" (UID: \"9727cf23-8270-491f-ba18-218bd73cd0c8\") "
Oct 12 06:09:36 crc kubenswrapper[4930]: I1012 06:09:36.937781 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9727cf23-8270-491f-ba18-218bd73cd0c8-inventory\") pod \"9727cf23-8270-491f-ba18-218bd73cd0c8\" (UID: \"9727cf23-8270-491f-ba18-218bd73cd0c8\") "
Oct 12 06:09:36 crc kubenswrapper[4930]: I1012 06:09:36.937963 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdbmm\" (UniqueName: \"kubernetes.io/projected/9727cf23-8270-491f-ba18-218bd73cd0c8-kube-api-access-xdbmm\") pod \"9727cf23-8270-491f-ba18-218bd73cd0c8\" (UID: \"9727cf23-8270-491f-ba18-218bd73cd0c8\") "
Oct 12 06:09:36 crc kubenswrapper[4930]: I1012 06:09:36.946107 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9727cf23-8270-491f-ba18-218bd73cd0c8-kube-api-access-xdbmm" (OuterVolumeSpecName: "kube-api-access-xdbmm") pod "9727cf23-8270-491f-ba18-218bd73cd0c8" (UID: "9727cf23-8270-491f-ba18-218bd73cd0c8"). InnerVolumeSpecName "kube-api-access-xdbmm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 06:09:36 crc kubenswrapper[4930]: I1012 06:09:36.973078 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9727cf23-8270-491f-ba18-218bd73cd0c8-inventory" (OuterVolumeSpecName: "inventory") pod "9727cf23-8270-491f-ba18-218bd73cd0c8" (UID: "9727cf23-8270-491f-ba18-218bd73cd0c8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:09:36 crc kubenswrapper[4930]: I1012 06:09:36.983149 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9727cf23-8270-491f-ba18-218bd73cd0c8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9727cf23-8270-491f-ba18-218bd73cd0c8" (UID: "9727cf23-8270-491f-ba18-218bd73cd0c8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:09:37 crc kubenswrapper[4930]: I1012 06:09:37.040212 4930 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9727cf23-8270-491f-ba18-218bd73cd0c8-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 12 06:09:37 crc kubenswrapper[4930]: I1012 06:09:37.040244 4930 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9727cf23-8270-491f-ba18-218bd73cd0c8-inventory\") on node \"crc\" DevicePath \"\""
Oct 12 06:09:37 crc kubenswrapper[4930]: I1012 06:09:37.040254 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdbmm\" (UniqueName: \"kubernetes.io/projected/9727cf23-8270-491f-ba18-218bd73cd0c8-kube-api-access-xdbmm\") on node \"crc\" DevicePath \"\""
Oct 12 06:09:37 crc kubenswrapper[4930]: I1012 06:09:37.238865 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t" event={"ID":"9727cf23-8270-491f-ba18-218bd73cd0c8","Type":"ContainerDied","Data":"287a167bf6ed030a3cd2bff614258e5d9e9c005ab9da9255c29ca9acf0be0a4b"}
Oct 12 06:09:37 crc kubenswrapper[4930]: I1012 06:09:37.238902 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="287a167bf6ed030a3cd2bff614258e5d9e9c005ab9da9255c29ca9acf0be0a4b"
Oct 12 06:09:37 crc kubenswrapper[4930]: I1012 06:09:37.238975 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t"
Oct 12 06:09:37 crc kubenswrapper[4930]: I1012 06:09:37.346542 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm"]
Oct 12 06:09:37 crc kubenswrapper[4930]: E1012 06:09:37.347190 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9727cf23-8270-491f-ba18-218bd73cd0c8" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Oct 12 06:09:37 crc kubenswrapper[4930]: I1012 06:09:37.347220 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="9727cf23-8270-491f-ba18-218bd73cd0c8" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Oct 12 06:09:37 crc kubenswrapper[4930]: I1012 06:09:37.347540 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="9727cf23-8270-491f-ba18-218bd73cd0c8" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Oct 12 06:09:37 crc kubenswrapper[4930]: I1012 06:09:37.348433 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm"
Oct 12 06:09:37 crc kubenswrapper[4930]: I1012 06:09:37.351467 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 12 06:09:37 crc kubenswrapper[4930]: I1012 06:09:37.351826 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 12 06:09:37 crc kubenswrapper[4930]: I1012 06:09:37.352284 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vhlxg"
Oct 12 06:09:37 crc kubenswrapper[4930]: I1012 06:09:37.352361 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 12 06:09:37 crc kubenswrapper[4930]: I1012 06:09:37.394254 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm"]
Oct 12 06:09:37 crc kubenswrapper[4930]: I1012 06:09:37.446428 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8b9bf67-f82f-421d-98df-4f8e95911d5a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm\" (UID: \"d8b9bf67-f82f-421d-98df-4f8e95911d5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm"
Oct 12 06:09:37 crc kubenswrapper[4930]: I1012 06:09:37.446580 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f2gg\" (UniqueName: \"kubernetes.io/projected/d8b9bf67-f82f-421d-98df-4f8e95911d5a-kube-api-access-6f2gg\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm\" (UID: \"d8b9bf67-f82f-421d-98df-4f8e95911d5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm"
Oct 12 06:09:37 crc kubenswrapper[4930]: I1012 06:09:37.446652 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8b9bf67-f82f-421d-98df-4f8e95911d5a-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm\" (UID: \"d8b9bf67-f82f-421d-98df-4f8e95911d5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm"
Oct 12 06:09:37 crc kubenswrapper[4930]: I1012 06:09:37.548617 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8b9bf67-f82f-421d-98df-4f8e95911d5a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm\" (UID: \"d8b9bf67-f82f-421d-98df-4f8e95911d5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm"
Oct 12 06:09:37 crc kubenswrapper[4930]: I1012 06:09:37.548784 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f2gg\" (UniqueName: \"kubernetes.io/projected/d8b9bf67-f82f-421d-98df-4f8e95911d5a-kube-api-access-6f2gg\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm\" (UID: \"d8b9bf67-f82f-421d-98df-4f8e95911d5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm"
Oct 12 06:09:37 crc kubenswrapper[4930]: I1012 06:09:37.548849 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8b9bf67-f82f-421d-98df-4f8e95911d5a-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm\" (UID: \"d8b9bf67-f82f-421d-98df-4f8e95911d5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm"
Oct 12 06:09:37 crc kubenswrapper[4930]: I1012 06:09:37.552497 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8b9bf67-f82f-421d-98df-4f8e95911d5a-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm\" (UID: \"d8b9bf67-f82f-421d-98df-4f8e95911d5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm"
Oct 12 06:09:37 crc kubenswrapper[4930]: I1012 06:09:37.558437 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8b9bf67-f82f-421d-98df-4f8e95911d5a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm\" (UID: \"d8b9bf67-f82f-421d-98df-4f8e95911d5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm"
Oct 12 06:09:37 crc kubenswrapper[4930]: I1012 06:09:37.566341 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f2gg\" (UniqueName: \"kubernetes.io/projected/d8b9bf67-f82f-421d-98df-4f8e95911d5a-kube-api-access-6f2gg\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm\" (UID: \"d8b9bf67-f82f-421d-98df-4f8e95911d5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm"
Oct 12 06:09:37 crc kubenswrapper[4930]: I1012 06:09:37.700070 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm"
Oct 12 06:09:38 crc kubenswrapper[4930]: I1012 06:09:38.273431 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm"]
Oct 12 06:09:39 crc kubenswrapper[4930]: I1012 06:09:39.265394 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm" event={"ID":"d8b9bf67-f82f-421d-98df-4f8e95911d5a","Type":"ContainerStarted","Data":"a3a302cf08df5aac4a511f8851e9f383123d9efe637d065609b23aa281b0ec9e"}
Oct 12 06:09:40 crc kubenswrapper[4930]: I1012 06:09:40.302938 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm" event={"ID":"d8b9bf67-f82f-421d-98df-4f8e95911d5a","Type":"ContainerStarted","Data":"86535d95cd34273ba0c8115a223c14037493a59dbd93b160399dd4cd3c2ceda5"}
Oct 12 06:09:40 crc kubenswrapper[4930]: I1012 06:09:40.326487 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm" podStartSLOduration=2.654296344 podStartE2EDuration="3.326467167s" podCreationTimestamp="2025-10-12 06:09:37 +0000 UTC" firstStartedPulling="2025-10-12 06:09:38.295640601 +0000 UTC m=+1710.837742366" lastFinishedPulling="2025-10-12 06:09:38.967811384 +0000 UTC m=+1711.509913189" observedRunningTime="2025-10-12 06:09:40.324712984 +0000 UTC m=+1712.866814759" watchObservedRunningTime="2025-10-12 06:09:40.326467167 +0000 UTC m=+1712.868568942"
Oct 12 06:09:41 crc kubenswrapper[4930]: I1012 06:09:41.135511 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a"
Oct 12 06:09:41 crc kubenswrapper[4930]: E1012 06:09:41.136036 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:09:53 crc kubenswrapper[4930]: I1012 06:09:53.136214 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a" Oct 12 06:09:53 crc kubenswrapper[4930]: E1012 06:09:53.137282 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:10:07 crc kubenswrapper[4930]: I1012 06:10:07.136006 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a" Oct 12 06:10:07 crc kubenswrapper[4930]: E1012 06:10:07.136983 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:10:18 crc kubenswrapper[4930]: I1012 06:10:18.147854 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a" Oct 12 06:10:18 crc kubenswrapper[4930]: E1012 06:10:18.148884 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:10:25 crc kubenswrapper[4930]: I1012 06:10:25.827634 4930 scope.go:117] "RemoveContainer" containerID="5c5db18e20e927f2b3f8fc81b314b0d8b00783743c0b89453be15b7b032c587b" Oct 12 06:10:25 crc kubenswrapper[4930]: I1012 06:10:25.892706 4930 scope.go:117] "RemoveContainer" containerID="148e80f03a3b8ca1088059a87fb924230d189b4c9191ed927d45cd9215c559c8" Oct 12 06:10:25 crc kubenswrapper[4930]: I1012 06:10:25.984063 4930 scope.go:117] "RemoveContainer" containerID="1b9aba71eeec6c8e02392ccc8b7f58218c5cc9ebe67585f1ff3bd806865a7174" Oct 12 06:10:32 crc kubenswrapper[4930]: I1012 06:10:32.136176 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a" Oct 12 06:10:32 crc kubenswrapper[4930]: E1012 06:10:32.136761 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:10:34 crc kubenswrapper[4930]: I1012 06:10:34.046714 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-94srd"] Oct 12 06:10:34 crc kubenswrapper[4930]: I1012 06:10:34.057187 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-fg84h"] Oct 12 06:10:34 crc kubenswrapper[4930]: I1012 06:10:34.068846 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-9k65c"] Oct 12 06:10:34 crc kubenswrapper[4930]: I1012 06:10:34.080296 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-94srd"] Oct 12 06:10:34 crc kubenswrapper[4930]: I1012 06:10:34.089433 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-9k65c"] Oct 12 06:10:34 crc kubenswrapper[4930]: I1012 06:10:34.096038 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-fg84h"] Oct 12 06:10:34 crc kubenswrapper[4930]: I1012 06:10:34.149240 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="234f0277-c632-443a-a332-07699c63b28a" path="/var/lib/kubelet/pods/234f0277-c632-443a-a332-07699c63b28a/volumes" Oct 12 06:10:34 crc kubenswrapper[4930]: I1012 06:10:34.150113 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6af5c93e-5d4d-4176-9cf0-fcb5197c7774" path="/var/lib/kubelet/pods/6af5c93e-5d4d-4176-9cf0-fcb5197c7774/volumes" Oct 12 06:10:34 crc kubenswrapper[4930]: I1012 06:10:34.150973 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f73d1a72-3045-4bf4-b851-b52907101d74" path="/var/lib/kubelet/pods/f73d1a72-3045-4bf4-b851-b52907101d74/volumes" Oct 12 06:10:44 crc kubenswrapper[4930]: I1012 06:10:44.039413 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-0a55-account-create-jjl4h"] Oct 12 06:10:44 crc kubenswrapper[4930]: I1012 06:10:44.052220 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-0a55-account-create-jjl4h"] Oct 12 06:10:44 crc kubenswrapper[4930]: I1012 06:10:44.136088 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a" Oct 12 06:10:44 crc kubenswrapper[4930]: E1012 06:10:44.136566 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:10:44 crc kubenswrapper[4930]: I1012 06:10:44.148808 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e74a93bc-5b73-4e2b-815a-14398881caf5" path="/var/lib/kubelet/pods/e74a93bc-5b73-4e2b-815a-14398881caf5/volumes" Oct 12 06:10:45 crc kubenswrapper[4930]: I1012 06:10:45.038585 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-9054-account-create-rr9h4"] Oct 12 06:10:45 crc kubenswrapper[4930]: I1012 06:10:45.055342 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-57c6-account-create-26nks"] Oct 12 06:10:45 crc kubenswrapper[4930]: I1012 06:10:45.076788 4930 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-cell0-57c6-account-create-26nks"] Oct 12 06:10:45 crc kubenswrapper[4930]: I1012 06:10:45.081699 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-7d8c9db847-bqfrb" podUID="4ed14594-beb5-4ce3-bf04-4a9299a932be" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Oct 12 06:10:45 crc kubenswrapper[4930]: I1012 06:10:45.090987 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-9054-account-create-rr9h4"] Oct 12 06:10:46 crc kubenswrapper[4930]: I1012 06:10:46.145526 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="656956b7-bd5b-4cab-9db0-53f9776ee51e" path="/var/lib/kubelet/pods/656956b7-bd5b-4cab-9db0-53f9776ee51e/volumes" Oct 12 06:10:46 crc kubenswrapper[4930]: I1012 06:10:46.146138 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74e2cfe9-e552-4e89-b0dc-7af425310799" path="/var/lib/kubelet/pods/74e2cfe9-e552-4e89-b0dc-7af425310799/volumes" Oct 12 06:10:55 crc kubenswrapper[4930]: I1012 06:10:55.136498 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a" Oct 12 06:10:55 crc kubenswrapper[4930]: E1012 06:10:55.137526 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:11:00 crc kubenswrapper[4930]: I1012 06:11:00.277443 4930 generic.go:334] "Generic (PLEG): container finished" podID="d8b9bf67-f82f-421d-98df-4f8e95911d5a" containerID="86535d95cd34273ba0c8115a223c14037493a59dbd93b160399dd4cd3c2ceda5" exitCode=0 Oct 12 06:11:00 crc kubenswrapper[4930]: I1012 06:11:00.278092 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm" event={"ID":"d8b9bf67-f82f-421d-98df-4f8e95911d5a","Type":"ContainerDied","Data":"86535d95cd34273ba0c8115a223c14037493a59dbd93b160399dd4cd3c2ceda5"} Oct 12 06:11:01 crc kubenswrapper[4930]: I1012 06:11:01.764610 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm" Oct 12 06:11:01 crc kubenswrapper[4930]: I1012 06:11:01.902233 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f2gg\" (UniqueName: \"kubernetes.io/projected/d8b9bf67-f82f-421d-98df-4f8e95911d5a-kube-api-access-6f2gg\") pod \"d8b9bf67-f82f-421d-98df-4f8e95911d5a\" (UID: \"d8b9bf67-f82f-421d-98df-4f8e95911d5a\") " Oct 12 06:11:01 crc kubenswrapper[4930]: I1012 06:11:01.902557 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8b9bf67-f82f-421d-98df-4f8e95911d5a-inventory\") pod \"d8b9bf67-f82f-421d-98df-4f8e95911d5a\" (UID: \"d8b9bf67-f82f-421d-98df-4f8e95911d5a\") " Oct 12 06:11:01 crc kubenswrapper[4930]: I1012 06:11:01.902661 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8b9bf67-f82f-421d-98df-4f8e95911d5a-ssh-key\") pod \"d8b9bf67-f82f-421d-98df-4f8e95911d5a\" (UID: \"d8b9bf67-f82f-421d-98df-4f8e95911d5a\") " Oct 12 06:11:01 crc kubenswrapper[4930]: I1012 06:11:01.909554 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8b9bf67-f82f-421d-98df-4f8e95911d5a-kube-api-access-6f2gg" (OuterVolumeSpecName: "kube-api-access-6f2gg") pod "d8b9bf67-f82f-421d-98df-4f8e95911d5a" (UID: "d8b9bf67-f82f-421d-98df-4f8e95911d5a"). InnerVolumeSpecName "kube-api-access-6f2gg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:11:01 crc kubenswrapper[4930]: I1012 06:11:01.928637 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8b9bf67-f82f-421d-98df-4f8e95911d5a-inventory" (OuterVolumeSpecName: "inventory") pod "d8b9bf67-f82f-421d-98df-4f8e95911d5a" (UID: "d8b9bf67-f82f-421d-98df-4f8e95911d5a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:11:01 crc kubenswrapper[4930]: I1012 06:11:01.944761 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8b9bf67-f82f-421d-98df-4f8e95911d5a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d8b9bf67-f82f-421d-98df-4f8e95911d5a" (UID: "d8b9bf67-f82f-421d-98df-4f8e95911d5a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:11:02 crc kubenswrapper[4930]: I1012 06:11:02.005513 4930 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8b9bf67-f82f-421d-98df-4f8e95911d5a-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 06:11:02 crc kubenswrapper[4930]: I1012 06:11:02.005559 4930 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8b9bf67-f82f-421d-98df-4f8e95911d5a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 06:11:02 crc kubenswrapper[4930]: I1012 06:11:02.005573 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f2gg\" (UniqueName: \"kubernetes.io/projected/d8b9bf67-f82f-421d-98df-4f8e95911d5a-kube-api-access-6f2gg\") on node \"crc\" DevicePath \"\"" Oct 12 06:11:02 crc kubenswrapper[4930]: I1012 06:11:02.303860 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm" event={"ID":"d8b9bf67-f82f-421d-98df-4f8e95911d5a","Type":"ContainerDied","Data":"a3a302cf08df5aac4a511f8851e9f383123d9efe637d065609b23aa281b0ec9e"} Oct 12 06:11:02 crc kubenswrapper[4930]: I1012 06:11:02.303922 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3a302cf08df5aac4a511f8851e9f383123d9efe637d065609b23aa281b0ec9e" Oct 12 06:11:02 crc kubenswrapper[4930]: I1012 06:11:02.303960 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm" Oct 12 06:11:02 crc kubenswrapper[4930]: I1012 06:11:02.418321 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5"] Oct 12 06:11:02 crc kubenswrapper[4930]: E1012 06:11:02.419051 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8b9bf67-f82f-421d-98df-4f8e95911d5a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 12 06:11:02 crc kubenswrapper[4930]: I1012 06:11:02.419083 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b9bf67-f82f-421d-98df-4f8e95911d5a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 12 06:11:02 crc kubenswrapper[4930]: I1012 06:11:02.419489 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8b9bf67-f82f-421d-98df-4f8e95911d5a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 12 06:11:02 crc kubenswrapper[4930]: I1012 06:11:02.420691 4930 util.go:30] "No sandbox for pod can be found. 
Oct 12 06:11:02 crc kubenswrapper[4930]: I1012 06:11:02.423534 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vhlxg"
Oct 12 06:11:02 crc kubenswrapper[4930]: I1012 06:11:02.424052 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 12 06:11:02 crc kubenswrapper[4930]: I1012 06:11:02.424236 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 12 06:11:02 crc kubenswrapper[4930]: I1012 06:11:02.424669 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 12 06:11:02 crc kubenswrapper[4930]: I1012 06:11:02.432421 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5"]
Oct 12 06:11:02 crc kubenswrapper[4930]: I1012 06:11:02.619412 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33c2f765-a7aa-4d04-87b3-2f8483e4623a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5\" (UID: \"33c2f765-a7aa-4d04-87b3-2f8483e4623a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5"
Oct 12 06:11:02 crc kubenswrapper[4930]: I1012 06:11:02.619527 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33c2f765-a7aa-4d04-87b3-2f8483e4623a-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5\" (UID: \"33c2f765-a7aa-4d04-87b3-2f8483e4623a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5"
Oct 12 06:11:02 crc kubenswrapper[4930]: I1012 06:11:02.619704 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8glp\" (UniqueName: \"kubernetes.io/projected/33c2f765-a7aa-4d04-87b3-2f8483e4623a-kube-api-access-s8glp\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5\" (UID: \"33c2f765-a7aa-4d04-87b3-2f8483e4623a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5"
Oct 12 06:11:02 crc kubenswrapper[4930]: I1012 06:11:02.722704 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33c2f765-a7aa-4d04-87b3-2f8483e4623a-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5\" (UID: \"33c2f765-a7aa-4d04-87b3-2f8483e4623a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5"
Oct 12 06:11:02 crc kubenswrapper[4930]: I1012 06:11:02.723124 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8glp\" (UniqueName: \"kubernetes.io/projected/33c2f765-a7aa-4d04-87b3-2f8483e4623a-kube-api-access-s8glp\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5\" (UID: \"33c2f765-a7aa-4d04-87b3-2f8483e4623a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5"
Oct 12 06:11:02 crc kubenswrapper[4930]: I1012 06:11:02.723278 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33c2f765-a7aa-4d04-87b3-2f8483e4623a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5\" (UID: \"33c2f765-a7aa-4d04-87b3-2f8483e4623a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5"
Oct 12 06:11:02 crc kubenswrapper[4930]: I1012 06:11:02.729708 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33c2f765-a7aa-4d04-87b3-2f8483e4623a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5\" (UID: \"33c2f765-a7aa-4d04-87b3-2f8483e4623a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5"
Oct 12 06:11:02 crc kubenswrapper[4930]: I1012 06:11:02.730661 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33c2f765-a7aa-4d04-87b3-2f8483e4623a-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5\" (UID: \"33c2f765-a7aa-4d04-87b3-2f8483e4623a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5"
Oct 12 06:11:02 crc kubenswrapper[4930]: I1012 06:11:02.764030 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8glp\" (UniqueName: \"kubernetes.io/projected/33c2f765-a7aa-4d04-87b3-2f8483e4623a-kube-api-access-s8glp\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5\" (UID: \"33c2f765-a7aa-4d04-87b3-2f8483e4623a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5"
Oct 12 06:11:03 crc kubenswrapper[4930]: I1012 06:11:03.061135 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5"
Oct 12 06:11:03 crc kubenswrapper[4930]: W1012 06:11:03.733047 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33c2f765_a7aa_4d04_87b3_2f8483e4623a.slice/crio-0e3ffb99533bc243a028b98cb15a4f6121b920884702d9925a6d833ea5177f61 WatchSource:0}: Error finding container 0e3ffb99533bc243a028b98cb15a4f6121b920884702d9925a6d833ea5177f61: Status 404 returned error can't find the container with id 0e3ffb99533bc243a028b98cb15a4f6121b920884702d9925a6d833ea5177f61
Oct 12 06:11:03 crc kubenswrapper[4930]: I1012 06:11:03.735911 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5"]
Oct 12 06:11:03 crc kubenswrapper[4930]: I1012 06:11:03.736708 4930 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 12 06:11:04 crc kubenswrapper[4930]: I1012 06:11:04.329335 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5" event={"ID":"33c2f765-a7aa-4d04-87b3-2f8483e4623a","Type":"ContainerStarted","Data":"0e3ffb99533bc243a028b98cb15a4f6121b920884702d9925a6d833ea5177f61"}
Oct 12 06:11:05 crc kubenswrapper[4930]: I1012 06:11:05.342230 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5" event={"ID":"33c2f765-a7aa-4d04-87b3-2f8483e4623a","Type":"ContainerStarted","Data":"5204694bd4fe225edfc21fca6dd1006e5d1f3049e29a2c7c55b7678ee3d9ccc7"}
Oct 12 06:11:05 crc kubenswrapper[4930]: I1012 06:11:05.369290 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5" podStartSLOduration=2.836439712 podStartE2EDuration="3.3692631s" podCreationTimestamp="2025-10-12 06:11:02 +0000 UTC" firstStartedPulling="2025-10-12 06:11:03.736294941 +0000 UTC m=+1796.278396746" lastFinishedPulling="2025-10-12 06:11:04.269118359 +0000 UTC m=+1796.811220134" observedRunningTime="2025-10-12 06:11:05.363475387 +0000 UTC m=+1797.905577172" watchObservedRunningTime="2025-10-12 06:11:05.3692631 +0000 UTC m=+1797.911364905"
Oct 12 06:11:09 crc kubenswrapper[4930]: I1012 06:11:09.136241 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a"
Oct 12 06:11:09 crc kubenswrapper[4930]: E1012 06:11:09.137376 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:11:10 crc kubenswrapper[4930]: I1012 06:11:10.407392 4930 generic.go:334] "Generic (PLEG): container finished" podID="33c2f765-a7aa-4d04-87b3-2f8483e4623a" containerID="5204694bd4fe225edfc21fca6dd1006e5d1f3049e29a2c7c55b7678ee3d9ccc7" exitCode=0
Oct 12 06:11:10 crc kubenswrapper[4930]: I1012 06:11:10.407442 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5" event={"ID":"33c2f765-a7aa-4d04-87b3-2f8483e4623a","Type":"ContainerDied","Data":"5204694bd4fe225edfc21fca6dd1006e5d1f3049e29a2c7c55b7678ee3d9ccc7"}
Oct 12 06:11:11 crc kubenswrapper[4930]: I1012 06:11:11.046675 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mqxdr"]
Oct 12 06:11:11 crc kubenswrapper[4930]: I1012 06:11:11.063967 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mqxdr"]
Oct 12 06:11:11 crc kubenswrapper[4930]: I1012 06:11:11.891068 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5"
Oct 12 06:11:11 crc kubenswrapper[4930]: I1012 06:11:11.941668 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8glp\" (UniqueName: \"kubernetes.io/projected/33c2f765-a7aa-4d04-87b3-2f8483e4623a-kube-api-access-s8glp\") pod \"33c2f765-a7aa-4d04-87b3-2f8483e4623a\" (UID: \"33c2f765-a7aa-4d04-87b3-2f8483e4623a\") "
Oct 12 06:11:11 crc kubenswrapper[4930]: I1012 06:11:11.941789 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33c2f765-a7aa-4d04-87b3-2f8483e4623a-ssh-key\") pod \"33c2f765-a7aa-4d04-87b3-2f8483e4623a\" (UID: \"33c2f765-a7aa-4d04-87b3-2f8483e4623a\") "
Oct 12 06:11:11 crc kubenswrapper[4930]: I1012 06:11:11.941840 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33c2f765-a7aa-4d04-87b3-2f8483e4623a-inventory\") pod \"33c2f765-a7aa-4d04-87b3-2f8483e4623a\" (UID: \"33c2f765-a7aa-4d04-87b3-2f8483e4623a\") "
Oct 12 06:11:11 crc kubenswrapper[4930]: I1012 06:11:11.960567 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33c2f765-a7aa-4d04-87b3-2f8483e4623a-kube-api-access-s8glp" (OuterVolumeSpecName: "kube-api-access-s8glp") pod "33c2f765-a7aa-4d04-87b3-2f8483e4623a" (UID: "33c2f765-a7aa-4d04-87b3-2f8483e4623a"). InnerVolumeSpecName "kube-api-access-s8glp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 06:11:11 crc kubenswrapper[4930]: I1012 06:11:11.979344 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c2f765-a7aa-4d04-87b3-2f8483e4623a-inventory" (OuterVolumeSpecName: "inventory") pod "33c2f765-a7aa-4d04-87b3-2f8483e4623a" (UID: "33c2f765-a7aa-4d04-87b3-2f8483e4623a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:11:11 crc kubenswrapper[4930]: I1012 06:11:11.987628 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c2f765-a7aa-4d04-87b3-2f8483e4623a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "33c2f765-a7aa-4d04-87b3-2f8483e4623a" (UID: "33c2f765-a7aa-4d04-87b3-2f8483e4623a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.044886 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8glp\" (UniqueName: \"kubernetes.io/projected/33c2f765-a7aa-4d04-87b3-2f8483e4623a-kube-api-access-s8glp\") on node \"crc\" DevicePath \"\""
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.044936 4930 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33c2f765-a7aa-4d04-87b3-2f8483e4623a-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.044952 4930 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33c2f765-a7aa-4d04-87b3-2f8483e4623a-inventory\") on node \"crc\" DevicePath \"\""
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.148617 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d0cf3ce-d82f-43e3-9542-6cca53e19b42" path="/var/lib/kubelet/pods/9d0cf3ce-d82f-43e3-9542-6cca53e19b42/volumes"
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.432333 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5" event={"ID":"33c2f765-a7aa-4d04-87b3-2f8483e4623a","Type":"ContainerDied","Data":"0e3ffb99533bc243a028b98cb15a4f6121b920884702d9925a6d833ea5177f61"}
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.432404 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5"
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.432420 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e3ffb99533bc243a028b98cb15a4f6121b920884702d9925a6d833ea5177f61"
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.519986 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pckm6"]
Oct 12 06:11:12 crc kubenswrapper[4930]: E1012 06:11:12.520897 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c2f765-a7aa-4d04-87b3-2f8483e4623a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.520922 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="33c2f765-a7aa-4d04-87b3-2f8483e4623a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.521446 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="33c2f765-a7aa-4d04-87b3-2f8483e4623a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.522672 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pckm6"
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.528311 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.528511 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.528639 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.528684 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vhlxg"
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.547754 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pckm6"]
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.656572 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx5mj\" (UniqueName: \"kubernetes.io/projected/f64044f7-a939-48e9-a986-a3a39f4d1a4a-kube-api-access-hx5mj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pckm6\" (UID: \"f64044f7-a939-48e9-a986-a3a39f4d1a4a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pckm6"
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.656804 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f64044f7-a939-48e9-a986-a3a39f4d1a4a-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pckm6\" (UID: \"f64044f7-a939-48e9-a986-a3a39f4d1a4a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pckm6"
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.656865 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f64044f7-a939-48e9-a986-a3a39f4d1a4a-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pckm6\" (UID: \"f64044f7-a939-48e9-a986-a3a39f4d1a4a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pckm6"
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.758554 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx5mj\" (UniqueName: \"kubernetes.io/projected/f64044f7-a939-48e9-a986-a3a39f4d1a4a-kube-api-access-hx5mj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pckm6\" (UID: \"f64044f7-a939-48e9-a986-a3a39f4d1a4a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pckm6"
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.758675 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f64044f7-a939-48e9-a986-a3a39f4d1a4a-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pckm6\" (UID: \"f64044f7-a939-48e9-a986-a3a39f4d1a4a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pckm6"
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.758710 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f64044f7-a939-48e9-a986-a3a39f4d1a4a-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pckm6\" (UID: \"f64044f7-a939-48e9-a986-a3a39f4d1a4a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pckm6"
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.766342 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f64044f7-a939-48e9-a986-a3a39f4d1a4a-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pckm6\" (UID: \"f64044f7-a939-48e9-a986-a3a39f4d1a4a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pckm6"
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.766622 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f64044f7-a939-48e9-a986-a3a39f4d1a4a-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pckm6\" (UID: \"f64044f7-a939-48e9-a986-a3a39f4d1a4a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pckm6"
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.792892 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx5mj\" (UniqueName: \"kubernetes.io/projected/f64044f7-a939-48e9-a986-a3a39f4d1a4a-kube-api-access-hx5mj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pckm6\" (UID: \"f64044f7-a939-48e9-a986-a3a39f4d1a4a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pckm6"
Oct 12 06:11:12 crc kubenswrapper[4930]: I1012 06:11:12.852263 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pckm6"
Oct 12 06:11:13 crc kubenswrapper[4930]: I1012 06:11:13.254971 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pckm6"]
Oct 12 06:11:13 crc kubenswrapper[4930]: I1012 06:11:13.442969 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pckm6" event={"ID":"f64044f7-a939-48e9-a986-a3a39f4d1a4a","Type":"ContainerStarted","Data":"d46c3ff543375732260eff133a6b9051d5625ed04744eb094e28a216c9f71c35"}
Oct 12 06:11:14 crc kubenswrapper[4930]: I1012 06:11:14.458844 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pckm6" event={"ID":"f64044f7-a939-48e9-a986-a3a39f4d1a4a","Type":"ContainerStarted","Data":"6837a00bb80987f968cd7e13059dee1bf66076a2bb8e09924dad9020691fc9ba"}
Oct 12 06:11:14 crc kubenswrapper[4930]: I1012 06:11:14.486821 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pckm6" podStartSLOduration=2.020198283 podStartE2EDuration="2.486784878s" podCreationTimestamp="2025-10-12 06:11:12 +0000 UTC" firstStartedPulling="2025-10-12 06:11:13.26286528 +0000 UTC m=+1805.804967045" lastFinishedPulling="2025-10-12 06:11:13.729451835 +0000 UTC m=+1806.271553640" observedRunningTime="2025-10-12 06:11:14.479136779 +0000 UTC m=+1807.021238614" watchObservedRunningTime="2025-10-12 06:11:14.486784878 +0000 UTC m=+1807.028886673"
Oct 12 06:11:23 crc kubenswrapper[4930]: I1012 06:11:23.136285 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a"
Oct 12 06:11:23 crc kubenswrapper[4930]: E1012 06:11:23.137306 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:11:26 crc kubenswrapper[4930]: I1012 06:11:26.096552 4930 scope.go:117] "RemoveContainer" containerID="d267b4b820256d8a34e75b0920fde5a6f3d90bded0c05c4e00ffd803bfd2294b"
Oct 12 06:11:26 crc kubenswrapper[4930]: I1012 06:11:26.121866 4930 scope.go:117] "RemoveContainer" containerID="16b5bce420b969768bdbae5362224fabb73c667affdefc4eb5fa1d2883c7c259"
Oct 12 06:11:26 crc kubenswrapper[4930]: I1012 06:11:26.213142 4930 scope.go:117] "RemoveContainer" containerID="50d375193956d516aab058383a6effec3937384885e7a547a37888276b68e52f"
Oct 12 06:11:26 crc kubenswrapper[4930]: I1012 06:11:26.268242 4930 scope.go:117] "RemoveContainer" containerID="ecf0ed29a10a370d3b75b2d69566a505fc269b4320c45de8f39f778246bbe12e"
Oct 12 06:11:26 crc kubenswrapper[4930]: I1012 06:11:26.304879 4930 scope.go:117] "RemoveContainer" containerID="4db42ddbeca5bb8a612141328b7ed44d1925c02c297a7927e3d3ef3583ea9ce7"
Oct 12 06:11:26 crc kubenswrapper[4930]: I1012 06:11:26.344841 4930 scope.go:117] "RemoveContainer" containerID="0242ef56b3fb8e9f247c830cac2b07fb839a01174f1f165da28a219d25ed243d"
Oct 12 06:11:26 crc kubenswrapper[4930]: I1012 06:11:26.386391 4930 scope.go:117] "RemoveContainer" containerID="395c709050f36df78f829eedf3e2abbc3b33739a508462fafa1f64340258c9f0"
Oct 12 06:11:34 crc kubenswrapper[4930]: I1012 06:11:34.136386 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a"
Oct 12 06:11:34 crc kubenswrapper[4930]: E1012 06:11:34.138269 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:11:36 crc kubenswrapper[4930]: I1012 06:11:36.054914 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-fzfdt"]
Oct 12 06:11:36 crc kubenswrapper[4930]: I1012 06:11:36.063641 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-fzfdt"]
Oct 12 06:11:36 crc kubenswrapper[4930]: I1012 06:11:36.146014 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="022414f2-80b4-4bc1-81e9-df49ebbaae8e" path="/var/lib/kubelet/pods/022414f2-80b4-4bc1-81e9-df49ebbaae8e/volumes"
Oct 12 06:11:39 crc kubenswrapper[4930]: I1012 06:11:39.040386 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mqwg4"]
Oct 12 06:11:39 crc kubenswrapper[4930]: I1012 06:11:39.058061 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mqwg4"]
Oct 12 06:11:40 crc kubenswrapper[4930]: I1012 06:11:40.151677 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25b0cb62-9d52-4ba1-9c00-3cda68c43da8" path="/var/lib/kubelet/pods/25b0cb62-9d52-4ba1-9c00-3cda68c43da8/volumes"
Oct 12 06:11:49 crc kubenswrapper[4930]: I1012 06:11:49.135908 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a"
Oct 12 06:11:49 crc kubenswrapper[4930]: E1012 06:11:49.136968 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
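Read together, the SyncLoop ADD events in this window show the dataplane jobs being admitted strictly in sequence: download-cache, then configure-network, then validate-network, then install-os, with configure-os following just below, each one only after its predecessor's sandbox was torn down. If it helps to recover that ordering from a saved copy of the journal, a filter like the following works; "kubelet.log" is a placeholder for wherever the log was captured:

import re

# Print EDPM job pods in the order the kubelet admitted them.
ADD = re.compile(r'"SyncLoop ADD" source="api" pods=\["openstack/([^"]+)"\]')

with open("kubelet.log") as f:
    for line in f:
        if (m := ADD.search(line)):
            print(line[:15], m.group(1))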
kubenswrapper[4930]: E1012 06:11:49.136968 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:12:00 crc kubenswrapper[4930]: I1012 06:12:00.003627 4930 generic.go:334] "Generic (PLEG): container finished" podID="f64044f7-a939-48e9-a986-a3a39f4d1a4a" containerID="6837a00bb80987f968cd7e13059dee1bf66076a2bb8e09924dad9020691fc9ba" exitCode=0 Oct 12 06:12:00 crc kubenswrapper[4930]: I1012 06:12:00.003702 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pckm6" event={"ID":"f64044f7-a939-48e9-a986-a3a39f4d1a4a","Type":"ContainerDied","Data":"6837a00bb80987f968cd7e13059dee1bf66076a2bb8e09924dad9020691fc9ba"} Oct 12 06:12:01 crc kubenswrapper[4930]: I1012 06:12:01.603543 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pckm6" Oct 12 06:12:01 crc kubenswrapper[4930]: I1012 06:12:01.797647 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx5mj\" (UniqueName: \"kubernetes.io/projected/f64044f7-a939-48e9-a986-a3a39f4d1a4a-kube-api-access-hx5mj\") pod \"f64044f7-a939-48e9-a986-a3a39f4d1a4a\" (UID: \"f64044f7-a939-48e9-a986-a3a39f4d1a4a\") " Oct 12 06:12:01 crc kubenswrapper[4930]: I1012 06:12:01.797942 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f64044f7-a939-48e9-a986-a3a39f4d1a4a-inventory\") pod \"f64044f7-a939-48e9-a986-a3a39f4d1a4a\" (UID: \"f64044f7-a939-48e9-a986-a3a39f4d1a4a\") " Oct 12 06:12:01 crc kubenswrapper[4930]: I1012 06:12:01.797969 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f64044f7-a939-48e9-a986-a3a39f4d1a4a-ssh-key\") pod \"f64044f7-a939-48e9-a986-a3a39f4d1a4a\" (UID: \"f64044f7-a939-48e9-a986-a3a39f4d1a4a\") " Oct 12 06:12:01 crc kubenswrapper[4930]: I1012 06:12:01.805386 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f64044f7-a939-48e9-a986-a3a39f4d1a4a-kube-api-access-hx5mj" (OuterVolumeSpecName: "kube-api-access-hx5mj") pod "f64044f7-a939-48e9-a986-a3a39f4d1a4a" (UID: "f64044f7-a939-48e9-a986-a3a39f4d1a4a"). InnerVolumeSpecName "kube-api-access-hx5mj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:12:01 crc kubenswrapper[4930]: I1012 06:12:01.833638 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f64044f7-a939-48e9-a986-a3a39f4d1a4a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f64044f7-a939-48e9-a986-a3a39f4d1a4a" (UID: "f64044f7-a939-48e9-a986-a3a39f4d1a4a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:12:01 crc kubenswrapper[4930]: I1012 06:12:01.851631 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f64044f7-a939-48e9-a986-a3a39f4d1a4a-inventory" (OuterVolumeSpecName: "inventory") pod "f64044f7-a939-48e9-a986-a3a39f4d1a4a" (UID: "f64044f7-a939-48e9-a986-a3a39f4d1a4a"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:12:01 crc kubenswrapper[4930]: I1012 06:12:01.900933 4930 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f64044f7-a939-48e9-a986-a3a39f4d1a4a-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 06:12:01 crc kubenswrapper[4930]: I1012 06:12:01.900967 4930 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f64044f7-a939-48e9-a986-a3a39f4d1a4a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 06:12:01 crc kubenswrapper[4930]: I1012 06:12:01.900982 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx5mj\" (UniqueName: \"kubernetes.io/projected/f64044f7-a939-48e9-a986-a3a39f4d1a4a-kube-api-access-hx5mj\") on node \"crc\" DevicePath \"\"" Oct 12 06:12:02 crc kubenswrapper[4930]: I1012 06:12:02.034412 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pckm6" event={"ID":"f64044f7-a939-48e9-a986-a3a39f4d1a4a","Type":"ContainerDied","Data":"d46c3ff543375732260eff133a6b9051d5625ed04744eb094e28a216c9f71c35"} Oct 12 06:12:02 crc kubenswrapper[4930]: I1012 06:12:02.034473 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d46c3ff543375732260eff133a6b9051d5625ed04744eb094e28a216c9f71c35" Oct 12 06:12:02 crc kubenswrapper[4930]: I1012 06:12:02.034536 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pckm6" Oct 12 06:12:02 crc kubenswrapper[4930]: I1012 06:12:02.138700 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a" Oct 12 06:12:02 crc kubenswrapper[4930]: E1012 06:12:02.139051 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:12:02 crc kubenswrapper[4930]: I1012 06:12:02.181493 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm"] Oct 12 06:12:02 crc kubenswrapper[4930]: E1012 06:12:02.182037 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f64044f7-a939-48e9-a986-a3a39f4d1a4a" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 12 06:12:02 crc kubenswrapper[4930]: I1012 06:12:02.182061 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="f64044f7-a939-48e9-a986-a3a39f4d1a4a" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 12 06:12:02 crc kubenswrapper[4930]: I1012 06:12:02.182285 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="f64044f7-a939-48e9-a986-a3a39f4d1a4a" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 12 06:12:02 crc kubenswrapper[4930]: I1012 06:12:02.183083 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm" Oct 12 06:12:02 crc kubenswrapper[4930]: I1012 06:12:02.185840 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 06:12:02 crc kubenswrapper[4930]: I1012 06:12:02.186066 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 06:12:02 crc kubenswrapper[4930]: I1012 06:12:02.186380 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 06:12:02 crc kubenswrapper[4930]: I1012 06:12:02.191546 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm"] Oct 12 06:12:02 crc kubenswrapper[4930]: I1012 06:12:02.192728 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vhlxg" Oct 12 06:12:02 crc kubenswrapper[4930]: I1012 06:12:02.308770 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6nnr\" (UniqueName: \"kubernetes.io/projected/8096b9b1-e514-4928-8e48-88e6519dc35e-kube-api-access-f6nnr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm\" (UID: \"8096b9b1-e514-4928-8e48-88e6519dc35e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm" Oct 12 06:12:02 crc kubenswrapper[4930]: I1012 06:12:02.308995 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8096b9b1-e514-4928-8e48-88e6519dc35e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm\" (UID: \"8096b9b1-e514-4928-8e48-88e6519dc35e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm" Oct 12 06:12:02 crc kubenswrapper[4930]: I1012 06:12:02.309082 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8096b9b1-e514-4928-8e48-88e6519dc35e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm\" (UID: \"8096b9b1-e514-4928-8e48-88e6519dc35e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm" Oct 12 06:12:02 crc kubenswrapper[4930]: I1012 06:12:02.410946 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8096b9b1-e514-4928-8e48-88e6519dc35e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm\" (UID: \"8096b9b1-e514-4928-8e48-88e6519dc35e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm" Oct 12 06:12:02 crc kubenswrapper[4930]: I1012 06:12:02.411063 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8096b9b1-e514-4928-8e48-88e6519dc35e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm\" (UID: \"8096b9b1-e514-4928-8e48-88e6519dc35e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm" Oct 12 06:12:02 crc kubenswrapper[4930]: I1012 06:12:02.411290 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6nnr\" (UniqueName: \"kubernetes.io/projected/8096b9b1-e514-4928-8e48-88e6519dc35e-kube-api-access-f6nnr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm\" 
(UID: \"8096b9b1-e514-4928-8e48-88e6519dc35e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm" Oct 12 06:12:02 crc kubenswrapper[4930]: I1012 06:12:02.417815 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8096b9b1-e514-4928-8e48-88e6519dc35e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm\" (UID: \"8096b9b1-e514-4928-8e48-88e6519dc35e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm" Oct 12 06:12:02 crc kubenswrapper[4930]: I1012 06:12:02.419199 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8096b9b1-e514-4928-8e48-88e6519dc35e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm\" (UID: \"8096b9b1-e514-4928-8e48-88e6519dc35e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm" Oct 12 06:12:02 crc kubenswrapper[4930]: I1012 06:12:02.462024 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6nnr\" (UniqueName: \"kubernetes.io/projected/8096b9b1-e514-4928-8e48-88e6519dc35e-kube-api-access-f6nnr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm\" (UID: \"8096b9b1-e514-4928-8e48-88e6519dc35e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm" Oct 12 06:12:02 crc kubenswrapper[4930]: I1012 06:12:02.504268 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm" Oct 12 06:12:03 crc kubenswrapper[4930]: I1012 06:12:03.060221 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm"] Oct 12 06:12:04 crc kubenswrapper[4930]: I1012 06:12:04.068971 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm" event={"ID":"8096b9b1-e514-4928-8e48-88e6519dc35e","Type":"ContainerStarted","Data":"e31baf9bf86c776b665383b5541899e10121cb53f19296364888c3ec6e00fcbb"} Oct 12 06:12:04 crc kubenswrapper[4930]: I1012 06:12:04.070845 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm" event={"ID":"8096b9b1-e514-4928-8e48-88e6519dc35e","Type":"ContainerStarted","Data":"3505d5149bd05b3e873c0b9b4cf4875bb5492957ac6f9e5d4fcc7ed257af35e1"} Oct 12 06:12:04 crc kubenswrapper[4930]: I1012 06:12:04.090129 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm" podStartSLOduration=1.651227611 podStartE2EDuration="2.090112571s" podCreationTimestamp="2025-10-12 06:12:02 +0000 UTC" firstStartedPulling="2025-10-12 06:12:03.070517027 +0000 UTC m=+1855.612618812" lastFinishedPulling="2025-10-12 06:12:03.509402007 +0000 UTC m=+1856.051503772" observedRunningTime="2025-10-12 06:12:04.086351027 +0000 UTC m=+1856.628452832" watchObservedRunningTime="2025-10-12 06:12:04.090112571 +0000 UTC m=+1856.632214336" Oct 12 06:12:16 crc kubenswrapper[4930]: I1012 06:12:16.135512 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a" Oct 12 06:12:16 crc kubenswrapper[4930]: E1012 06:12:16.137688 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:12:21 crc kubenswrapper[4930]: I1012 06:12:21.047286 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-mnnd5"] Oct 12 06:12:21 crc kubenswrapper[4930]: I1012 06:12:21.054665 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-mnnd5"] Oct 12 06:12:22 crc kubenswrapper[4930]: I1012 06:12:22.154933 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2f80483-5b42-4fa4-8013-f2a8fd535c9e" path="/var/lib/kubelet/pods/c2f80483-5b42-4fa4-8013-f2a8fd535c9e/volumes" Oct 12 06:12:26 crc kubenswrapper[4930]: I1012 06:12:26.556932 4930 scope.go:117] "RemoveContainer" containerID="7c5c4a121c4d9f3987ec65112d4c46c15bea1770075f362a6949b4fd38d1788f" Oct 12 06:12:26 crc kubenswrapper[4930]: I1012 06:12:26.617402 4930 scope.go:117] "RemoveContainer" containerID="11587a9e8bb1e90a07706f5bca9e8051fea00fd934ade62a22227ec2378089af" Oct 12 06:12:26 crc kubenswrapper[4930]: I1012 06:12:26.691841 4930 scope.go:117] "RemoveContainer" containerID="c2e9c3a7da9bb6b0c961c0d94ff115cebf02d88855e7f1e38799f8dde8387931" Oct 12 06:12:28 crc kubenswrapper[4930]: I1012 06:12:28.145920 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a" Oct 12 06:12:28 crc kubenswrapper[4930]: E1012 06:12:28.146682 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:12:42 crc kubenswrapper[4930]: I1012 06:12:42.135871 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a" Oct 12 06:12:42 crc kubenswrapper[4930]: I1012 06:12:42.495412 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerStarted","Data":"1bd8424d852b3aed65544bab3c60df461f0881b0c3e1f0ea5ec7fed246751ec8"} Oct 12 06:13:03 crc kubenswrapper[4930]: I1012 06:13:03.760123 4930 generic.go:334] "Generic (PLEG): container finished" podID="8096b9b1-e514-4928-8e48-88e6519dc35e" containerID="e31baf9bf86c776b665383b5541899e10121cb53f19296364888c3ec6e00fcbb" exitCode=2 Oct 12 06:13:03 crc kubenswrapper[4930]: I1012 06:13:03.760211 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm" event={"ID":"8096b9b1-e514-4928-8e48-88e6519dc35e","Type":"ContainerDied","Data":"e31baf9bf86c776b665383b5541899e10121cb53f19296364888c3ec6e00fcbb"} Oct 12 06:13:05 crc kubenswrapper[4930]: I1012 06:13:05.355009 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm" Oct 12 06:13:05 crc kubenswrapper[4930]: I1012 06:13:05.552239 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6nnr\" (UniqueName: \"kubernetes.io/projected/8096b9b1-e514-4928-8e48-88e6519dc35e-kube-api-access-f6nnr\") pod \"8096b9b1-e514-4928-8e48-88e6519dc35e\" (UID: \"8096b9b1-e514-4928-8e48-88e6519dc35e\") " Oct 12 06:13:05 crc kubenswrapper[4930]: I1012 06:13:05.552494 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8096b9b1-e514-4928-8e48-88e6519dc35e-inventory\") pod \"8096b9b1-e514-4928-8e48-88e6519dc35e\" (UID: \"8096b9b1-e514-4928-8e48-88e6519dc35e\") " Oct 12 06:13:05 crc kubenswrapper[4930]: I1012 06:13:05.552529 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8096b9b1-e514-4928-8e48-88e6519dc35e-ssh-key\") pod \"8096b9b1-e514-4928-8e48-88e6519dc35e\" (UID: \"8096b9b1-e514-4928-8e48-88e6519dc35e\") " Oct 12 06:13:05 crc kubenswrapper[4930]: I1012 06:13:05.565093 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8096b9b1-e514-4928-8e48-88e6519dc35e-kube-api-access-f6nnr" (OuterVolumeSpecName: "kube-api-access-f6nnr") pod "8096b9b1-e514-4928-8e48-88e6519dc35e" (UID: "8096b9b1-e514-4928-8e48-88e6519dc35e"). InnerVolumeSpecName "kube-api-access-f6nnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:13:05 crc kubenswrapper[4930]: I1012 06:13:05.592947 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8096b9b1-e514-4928-8e48-88e6519dc35e-inventory" (OuterVolumeSpecName: "inventory") pod "8096b9b1-e514-4928-8e48-88e6519dc35e" (UID: "8096b9b1-e514-4928-8e48-88e6519dc35e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:13:05 crc kubenswrapper[4930]: I1012 06:13:05.601505 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8096b9b1-e514-4928-8e48-88e6519dc35e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8096b9b1-e514-4928-8e48-88e6519dc35e" (UID: "8096b9b1-e514-4928-8e48-88e6519dc35e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:13:05 crc kubenswrapper[4930]: I1012 06:13:05.655624 4930 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8096b9b1-e514-4928-8e48-88e6519dc35e-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 06:13:05 crc kubenswrapper[4930]: I1012 06:13:05.655687 4930 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8096b9b1-e514-4928-8e48-88e6519dc35e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 06:13:05 crc kubenswrapper[4930]: I1012 06:13:05.655701 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6nnr\" (UniqueName: \"kubernetes.io/projected/8096b9b1-e514-4928-8e48-88e6519dc35e-kube-api-access-f6nnr\") on node \"crc\" DevicePath \"\"" Oct 12 06:13:05 crc kubenswrapper[4930]: I1012 06:13:05.785841 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm" event={"ID":"8096b9b1-e514-4928-8e48-88e6519dc35e","Type":"ContainerDied","Data":"3505d5149bd05b3e873c0b9b4cf4875bb5492957ac6f9e5d4fcc7ed257af35e1"} Oct 12 06:13:05 crc kubenswrapper[4930]: I1012 06:13:05.785878 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3505d5149bd05b3e873c0b9b4cf4875bb5492957ac6f9e5d4fcc7ed257af35e1" Oct 12 06:13:05 crc kubenswrapper[4930]: I1012 06:13:05.785923 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm" Oct 12 06:13:09 crc kubenswrapper[4930]: I1012 06:13:09.671095 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-shqh2"] Oct 12 06:13:09 crc kubenswrapper[4930]: E1012 06:13:09.672333 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8096b9b1-e514-4928-8e48-88e6519dc35e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 12 06:13:09 crc kubenswrapper[4930]: I1012 06:13:09.672362 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="8096b9b1-e514-4928-8e48-88e6519dc35e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 12 06:13:09 crc kubenswrapper[4930]: I1012 06:13:09.672808 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="8096b9b1-e514-4928-8e48-88e6519dc35e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 12 06:13:09 crc kubenswrapper[4930]: I1012 06:13:09.691284 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-shqh2" Oct 12 06:13:09 crc kubenswrapper[4930]: I1012 06:13:09.699528 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-shqh2"] Oct 12 06:13:09 crc kubenswrapper[4930]: I1012 06:13:09.750869 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b69cc4bf-7e25-4d26-a759-82ba9cdf7f52-utilities\") pod \"redhat-marketplace-shqh2\" (UID: \"b69cc4bf-7e25-4d26-a759-82ba9cdf7f52\") " pod="openshift-marketplace/redhat-marketplace-shqh2" Oct 12 06:13:09 crc kubenswrapper[4930]: I1012 06:13:09.751004 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkv7s\" (UniqueName: \"kubernetes.io/projected/b69cc4bf-7e25-4d26-a759-82ba9cdf7f52-kube-api-access-tkv7s\") pod \"redhat-marketplace-shqh2\" (UID: \"b69cc4bf-7e25-4d26-a759-82ba9cdf7f52\") " pod="openshift-marketplace/redhat-marketplace-shqh2" Oct 12 06:13:09 crc kubenswrapper[4930]: I1012 06:13:09.751140 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b69cc4bf-7e25-4d26-a759-82ba9cdf7f52-catalog-content\") pod \"redhat-marketplace-shqh2\" (UID: \"b69cc4bf-7e25-4d26-a759-82ba9cdf7f52\") " pod="openshift-marketplace/redhat-marketplace-shqh2" Oct 12 06:13:09 crc kubenswrapper[4930]: I1012 06:13:09.853765 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkv7s\" (UniqueName: \"kubernetes.io/projected/b69cc4bf-7e25-4d26-a759-82ba9cdf7f52-kube-api-access-tkv7s\") pod \"redhat-marketplace-shqh2\" (UID: \"b69cc4bf-7e25-4d26-a759-82ba9cdf7f52\") " pod="openshift-marketplace/redhat-marketplace-shqh2" Oct 12 06:13:09 crc kubenswrapper[4930]: I1012 06:13:09.853889 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b69cc4bf-7e25-4d26-a759-82ba9cdf7f52-catalog-content\") pod \"redhat-marketplace-shqh2\" (UID: \"b69cc4bf-7e25-4d26-a759-82ba9cdf7f52\") " pod="openshift-marketplace/redhat-marketplace-shqh2" Oct 12 06:13:09 crc kubenswrapper[4930]: I1012 06:13:09.853950 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b69cc4bf-7e25-4d26-a759-82ba9cdf7f52-utilities\") pod \"redhat-marketplace-shqh2\" (UID: \"b69cc4bf-7e25-4d26-a759-82ba9cdf7f52\") " pod="openshift-marketplace/redhat-marketplace-shqh2" Oct 12 06:13:09 crc kubenswrapper[4930]: I1012 06:13:09.854376 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b69cc4bf-7e25-4d26-a759-82ba9cdf7f52-catalog-content\") pod \"redhat-marketplace-shqh2\" (UID: \"b69cc4bf-7e25-4d26-a759-82ba9cdf7f52\") " pod="openshift-marketplace/redhat-marketplace-shqh2" Oct 12 06:13:09 crc kubenswrapper[4930]: I1012 06:13:09.854413 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b69cc4bf-7e25-4d26-a759-82ba9cdf7f52-utilities\") pod \"redhat-marketplace-shqh2\" (UID: \"b69cc4bf-7e25-4d26-a759-82ba9cdf7f52\") " pod="openshift-marketplace/redhat-marketplace-shqh2" Oct 12 06:13:09 crc kubenswrapper[4930]: I1012 06:13:09.878148 4930 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-tkv7s\" (UniqueName: \"kubernetes.io/projected/b69cc4bf-7e25-4d26-a759-82ba9cdf7f52-kube-api-access-tkv7s\") pod \"redhat-marketplace-shqh2\" (UID: \"b69cc4bf-7e25-4d26-a759-82ba9cdf7f52\") " pod="openshift-marketplace/redhat-marketplace-shqh2" Oct 12 06:13:10 crc kubenswrapper[4930]: I1012 06:13:10.011526 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-shqh2" Oct 12 06:13:10 crc kubenswrapper[4930]: I1012 06:13:10.499325 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-shqh2"] Oct 12 06:13:10 crc kubenswrapper[4930]: I1012 06:13:10.849029 4930 generic.go:334] "Generic (PLEG): container finished" podID="b69cc4bf-7e25-4d26-a759-82ba9cdf7f52" containerID="b3007c7f77e046389369eea2a0b7497e752ab9de71cdd6c6638e22cc7c614bed" exitCode=0 Oct 12 06:13:10 crc kubenswrapper[4930]: I1012 06:13:10.849141 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-shqh2" event={"ID":"b69cc4bf-7e25-4d26-a759-82ba9cdf7f52","Type":"ContainerDied","Data":"b3007c7f77e046389369eea2a0b7497e752ab9de71cdd6c6638e22cc7c614bed"} Oct 12 06:13:10 crc kubenswrapper[4930]: I1012 06:13:10.849371 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-shqh2" event={"ID":"b69cc4bf-7e25-4d26-a759-82ba9cdf7f52","Type":"ContainerStarted","Data":"d802f4b65173ad30a9422934d514980837aa78aa6c2063de4ce578d32e283651"} Oct 12 06:13:12 crc kubenswrapper[4930]: I1012 06:13:12.876129 4930 generic.go:334] "Generic (PLEG): container finished" podID="b69cc4bf-7e25-4d26-a759-82ba9cdf7f52" containerID="942c87da266126c8ba5ed31e69c0986336c75639bce462e6daca49498dae91cf" exitCode=0 Oct 12 06:13:12 crc kubenswrapper[4930]: I1012 06:13:12.876244 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-shqh2" event={"ID":"b69cc4bf-7e25-4d26-a759-82ba9cdf7f52","Type":"ContainerDied","Data":"942c87da266126c8ba5ed31e69c0986336c75639bce462e6daca49498dae91cf"} Oct 12 06:13:13 crc kubenswrapper[4930]: I1012 06:13:13.042829 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt"] Oct 12 06:13:13 crc kubenswrapper[4930]: I1012 06:13:13.044723 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt" Oct 12 06:13:13 crc kubenswrapper[4930]: I1012 06:13:13.048799 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 06:13:13 crc kubenswrapper[4930]: I1012 06:13:13.053871 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 06:13:13 crc kubenswrapper[4930]: I1012 06:13:13.054624 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44xr5\" (UniqueName: \"kubernetes.io/projected/eebc8efc-b160-4a75-a213-74fcf9c2595e-kube-api-access-44xr5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt\" (UID: \"eebc8efc-b160-4a75-a213-74fcf9c2595e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt" Oct 12 06:13:13 crc kubenswrapper[4930]: I1012 06:13:13.054983 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eebc8efc-b160-4a75-a213-74fcf9c2595e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt\" (UID: \"eebc8efc-b160-4a75-a213-74fcf9c2595e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt" Oct 12 06:13:13 crc kubenswrapper[4930]: I1012 06:13:13.055194 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 06:13:13 crc kubenswrapper[4930]: I1012 06:13:13.055229 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eebc8efc-b160-4a75-a213-74fcf9c2595e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt\" (UID: \"eebc8efc-b160-4a75-a213-74fcf9c2595e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt" Oct 12 06:13:13 crc kubenswrapper[4930]: I1012 06:13:13.058618 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt"] Oct 12 06:13:13 crc kubenswrapper[4930]: I1012 06:13:13.059029 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vhlxg" Oct 12 06:13:13 crc kubenswrapper[4930]: I1012 06:13:13.160147 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eebc8efc-b160-4a75-a213-74fcf9c2595e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt\" (UID: \"eebc8efc-b160-4a75-a213-74fcf9c2595e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt" Oct 12 06:13:13 crc kubenswrapper[4930]: I1012 06:13:13.160275 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44xr5\" (UniqueName: \"kubernetes.io/projected/eebc8efc-b160-4a75-a213-74fcf9c2595e-kube-api-access-44xr5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt\" (UID: \"eebc8efc-b160-4a75-a213-74fcf9c2595e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt" Oct 12 06:13:13 crc kubenswrapper[4930]: I1012 06:13:13.160602 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eebc8efc-b160-4a75-a213-74fcf9c2595e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt\" 
(UID: \"eebc8efc-b160-4a75-a213-74fcf9c2595e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt" Oct 12 06:13:13 crc kubenswrapper[4930]: I1012 06:13:13.169555 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eebc8efc-b160-4a75-a213-74fcf9c2595e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt\" (UID: \"eebc8efc-b160-4a75-a213-74fcf9c2595e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt" Oct 12 06:13:13 crc kubenswrapper[4930]: I1012 06:13:13.169585 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eebc8efc-b160-4a75-a213-74fcf9c2595e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt\" (UID: \"eebc8efc-b160-4a75-a213-74fcf9c2595e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt" Oct 12 06:13:13 crc kubenswrapper[4930]: I1012 06:13:13.201015 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44xr5\" (UniqueName: \"kubernetes.io/projected/eebc8efc-b160-4a75-a213-74fcf9c2595e-kube-api-access-44xr5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt\" (UID: \"eebc8efc-b160-4a75-a213-74fcf9c2595e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt" Oct 12 06:13:13 crc kubenswrapper[4930]: I1012 06:13:13.395223 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt" Oct 12 06:13:13 crc kubenswrapper[4930]: I1012 06:13:13.887143 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-shqh2" event={"ID":"b69cc4bf-7e25-4d26-a759-82ba9cdf7f52","Type":"ContainerStarted","Data":"099fc532c2a3f28d2e064fd4939a609688c3d6454ebc04ff4d7dca2287a8600c"} Oct 12 06:13:13 crc kubenswrapper[4930]: I1012 06:13:13.919268 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-shqh2" podStartSLOduration=2.485219622 podStartE2EDuration="4.919253218s" podCreationTimestamp="2025-10-12 06:13:09 +0000 UTC" firstStartedPulling="2025-10-12 06:13:10.852262722 +0000 UTC m=+1923.394364497" lastFinishedPulling="2025-10-12 06:13:13.286296318 +0000 UTC m=+1925.828398093" observedRunningTime="2025-10-12 06:13:13.909984798 +0000 UTC m=+1926.452086563" watchObservedRunningTime="2025-10-12 06:13:13.919253218 +0000 UTC m=+1926.461354983" Oct 12 06:13:13 crc kubenswrapper[4930]: I1012 06:13:13.999180 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt"] Oct 12 06:13:14 crc kubenswrapper[4930]: W1012 06:13:14.002294 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeebc8efc_b160_4a75_a213_74fcf9c2595e.slice/crio-db2d923b9d6e284d4e62a09778e31e724eda8f702a71c510683c3bb3d974e827 WatchSource:0}: Error finding container db2d923b9d6e284d4e62a09778e31e724eda8f702a71c510683c3bb3d974e827: Status 404 returned error can't find the container with id db2d923b9d6e284d4e62a09778e31e724eda8f702a71c510683c3bb3d974e827 Oct 12 06:13:14 crc kubenswrapper[4930]: I1012 06:13:14.898496 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt" 
event={"ID":"eebc8efc-b160-4a75-a213-74fcf9c2595e","Type":"ContainerStarted","Data":"9b246dc801fc7aca4aa9bae8a4b118c7ff0475852c7eb163f78b9130079177e4"} Oct 12 06:13:14 crc kubenswrapper[4930]: I1012 06:13:14.898909 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt" event={"ID":"eebc8efc-b160-4a75-a213-74fcf9c2595e","Type":"ContainerStarted","Data":"db2d923b9d6e284d4e62a09778e31e724eda8f702a71c510683c3bb3d974e827"} Oct 12 06:13:14 crc kubenswrapper[4930]: I1012 06:13:14.919640 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt" podStartSLOduration=1.448340593 podStartE2EDuration="1.919612395s" podCreationTimestamp="2025-10-12 06:13:13 +0000 UTC" firstStartedPulling="2025-10-12 06:13:14.005145577 +0000 UTC m=+1926.547247342" lastFinishedPulling="2025-10-12 06:13:14.476417379 +0000 UTC m=+1927.018519144" observedRunningTime="2025-10-12 06:13:14.915407721 +0000 UTC m=+1927.457509516" watchObservedRunningTime="2025-10-12 06:13:14.919612395 +0000 UTC m=+1927.461714200" Oct 12 06:13:19 crc kubenswrapper[4930]: I1012 06:13:19.836877 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pmn98"] Oct 12 06:13:19 crc kubenswrapper[4930]: I1012 06:13:19.839670 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pmn98" Oct 12 06:13:19 crc kubenswrapper[4930]: I1012 06:13:19.853360 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pmn98"] Oct 12 06:13:20 crc kubenswrapper[4930]: I1012 06:13:20.002108 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff973a4e-9b08-424a-9a6b-ff8220afa0a4-catalog-content\") pod \"community-operators-pmn98\" (UID: \"ff973a4e-9b08-424a-9a6b-ff8220afa0a4\") " pod="openshift-marketplace/community-operators-pmn98" Oct 12 06:13:20 crc kubenswrapper[4930]: I1012 06:13:20.002287 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff973a4e-9b08-424a-9a6b-ff8220afa0a4-utilities\") pod \"community-operators-pmn98\" (UID: \"ff973a4e-9b08-424a-9a6b-ff8220afa0a4\") " pod="openshift-marketplace/community-operators-pmn98" Oct 12 06:13:20 crc kubenswrapper[4930]: I1012 06:13:20.002306 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp696\" (UniqueName: \"kubernetes.io/projected/ff973a4e-9b08-424a-9a6b-ff8220afa0a4-kube-api-access-dp696\") pod \"community-operators-pmn98\" (UID: \"ff973a4e-9b08-424a-9a6b-ff8220afa0a4\") " pod="openshift-marketplace/community-operators-pmn98" Oct 12 06:13:20 crc kubenswrapper[4930]: I1012 06:13:20.011673 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-shqh2" Oct 12 06:13:20 crc kubenswrapper[4930]: I1012 06:13:20.012842 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-shqh2" Oct 12 06:13:20 crc kubenswrapper[4930]: I1012 06:13:20.059535 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-shqh2" Oct 12 06:13:20 crc kubenswrapper[4930]: I1012 
06:13:20.104512 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff973a4e-9b08-424a-9a6b-ff8220afa0a4-utilities\") pod \"community-operators-pmn98\" (UID: \"ff973a4e-9b08-424a-9a6b-ff8220afa0a4\") " pod="openshift-marketplace/community-operators-pmn98" Oct 12 06:13:20 crc kubenswrapper[4930]: I1012 06:13:20.104568 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp696\" (UniqueName: \"kubernetes.io/projected/ff973a4e-9b08-424a-9a6b-ff8220afa0a4-kube-api-access-dp696\") pod \"community-operators-pmn98\" (UID: \"ff973a4e-9b08-424a-9a6b-ff8220afa0a4\") " pod="openshift-marketplace/community-operators-pmn98" Oct 12 06:13:20 crc kubenswrapper[4930]: I1012 06:13:20.104620 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff973a4e-9b08-424a-9a6b-ff8220afa0a4-catalog-content\") pod \"community-operators-pmn98\" (UID: \"ff973a4e-9b08-424a-9a6b-ff8220afa0a4\") " pod="openshift-marketplace/community-operators-pmn98" Oct 12 06:13:20 crc kubenswrapper[4930]: I1012 06:13:20.105204 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff973a4e-9b08-424a-9a6b-ff8220afa0a4-utilities\") pod \"community-operators-pmn98\" (UID: \"ff973a4e-9b08-424a-9a6b-ff8220afa0a4\") " pod="openshift-marketplace/community-operators-pmn98" Oct 12 06:13:20 crc kubenswrapper[4930]: I1012 06:13:20.105222 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff973a4e-9b08-424a-9a6b-ff8220afa0a4-catalog-content\") pod \"community-operators-pmn98\" (UID: \"ff973a4e-9b08-424a-9a6b-ff8220afa0a4\") " pod="openshift-marketplace/community-operators-pmn98" Oct 12 06:13:20 crc kubenswrapper[4930]: I1012 06:13:20.135888 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp696\" (UniqueName: \"kubernetes.io/projected/ff973a4e-9b08-424a-9a6b-ff8220afa0a4-kube-api-access-dp696\") pod \"community-operators-pmn98\" (UID: \"ff973a4e-9b08-424a-9a6b-ff8220afa0a4\") " pod="openshift-marketplace/community-operators-pmn98" Oct 12 06:13:20 crc kubenswrapper[4930]: I1012 06:13:20.174765 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pmn98" Oct 12 06:13:20 crc kubenswrapper[4930]: I1012 06:13:20.723604 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pmn98"] Oct 12 06:13:20 crc kubenswrapper[4930]: I1012 06:13:20.995647 4930 generic.go:334] "Generic (PLEG): container finished" podID="ff973a4e-9b08-424a-9a6b-ff8220afa0a4" containerID="a36cd7737086f8dcc590680056dd5e473a244911470bf143f2aa107569f3c780" exitCode=0 Oct 12 06:13:20 crc kubenswrapper[4930]: I1012 06:13:20.995979 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmn98" event={"ID":"ff973a4e-9b08-424a-9a6b-ff8220afa0a4","Type":"ContainerDied","Data":"a36cd7737086f8dcc590680056dd5e473a244911470bf143f2aa107569f3c780"} Oct 12 06:13:20 crc kubenswrapper[4930]: I1012 06:13:20.998293 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmn98" event={"ID":"ff973a4e-9b08-424a-9a6b-ff8220afa0a4","Type":"ContainerStarted","Data":"20a8f78756720ef741b4098fd7fcd2e44ecca789961af09cad8bef22a7002386"} Oct 12 06:13:21 crc kubenswrapper[4930]: I1012 06:13:21.059828 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-shqh2" Oct 12 06:13:22 crc kubenswrapper[4930]: I1012 06:13:22.013473 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmn98" event={"ID":"ff973a4e-9b08-424a-9a6b-ff8220afa0a4","Type":"ContainerStarted","Data":"6d1fddb02a0933c881ac670693a28ac648cc43b1b13450d84ce2ae59f78088f2"} Oct 12 06:13:24 crc kubenswrapper[4930]: I1012 06:13:24.036858 4930 generic.go:334] "Generic (PLEG): container finished" podID="ff973a4e-9b08-424a-9a6b-ff8220afa0a4" containerID="6d1fddb02a0933c881ac670693a28ac648cc43b1b13450d84ce2ae59f78088f2" exitCode=0 Oct 12 06:13:24 crc kubenswrapper[4930]: I1012 06:13:24.036967 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmn98" event={"ID":"ff973a4e-9b08-424a-9a6b-ff8220afa0a4","Type":"ContainerDied","Data":"6d1fddb02a0933c881ac670693a28ac648cc43b1b13450d84ce2ae59f78088f2"} Oct 12 06:13:24 crc kubenswrapper[4930]: I1012 06:13:24.636713 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-shqh2"] Oct 12 06:13:24 crc kubenswrapper[4930]: I1012 06:13:24.636976 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-shqh2" podUID="b69cc4bf-7e25-4d26-a759-82ba9cdf7f52" containerName="registry-server" containerID="cri-o://099fc532c2a3f28d2e064fd4939a609688c3d6454ebc04ff4d7dca2287a8600c" gracePeriod=2 Oct 12 06:13:25 crc kubenswrapper[4930]: I1012 06:13:25.053060 4930 generic.go:334] "Generic (PLEG): container finished" podID="b69cc4bf-7e25-4d26-a759-82ba9cdf7f52" containerID="099fc532c2a3f28d2e064fd4939a609688c3d6454ebc04ff4d7dca2287a8600c" exitCode=0 Oct 12 06:13:25 crc kubenswrapper[4930]: I1012 06:13:25.053104 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-shqh2" event={"ID":"b69cc4bf-7e25-4d26-a759-82ba9cdf7f52","Type":"ContainerDied","Data":"099fc532c2a3f28d2e064fd4939a609688c3d6454ebc04ff4d7dca2287a8600c"} Oct 12 06:13:25 crc kubenswrapper[4930]: I1012 06:13:25.055761 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmn98" 
event={"ID":"ff973a4e-9b08-424a-9a6b-ff8220afa0a4","Type":"ContainerStarted","Data":"582b68829d1ab4e7b1e3b7ec8ae701ac841a793a79c119eaf853c227f2bc3af0"} Oct 12 06:13:25 crc kubenswrapper[4930]: I1012 06:13:25.081499 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pmn98" podStartSLOduration=2.567861375 podStartE2EDuration="6.081482861s" podCreationTimestamp="2025-10-12 06:13:19 +0000 UTC" firstStartedPulling="2025-10-12 06:13:20.998217604 +0000 UTC m=+1933.540319369" lastFinishedPulling="2025-10-12 06:13:24.51183907 +0000 UTC m=+1937.053940855" observedRunningTime="2025-10-12 06:13:25.076837156 +0000 UTC m=+1937.618938931" watchObservedRunningTime="2025-10-12 06:13:25.081482861 +0000 UTC m=+1937.623584626" Oct 12 06:13:25 crc kubenswrapper[4930]: I1012 06:13:25.149420 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-shqh2" Oct 12 06:13:25 crc kubenswrapper[4930]: I1012 06:13:25.320245 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b69cc4bf-7e25-4d26-a759-82ba9cdf7f52-catalog-content\") pod \"b69cc4bf-7e25-4d26-a759-82ba9cdf7f52\" (UID: \"b69cc4bf-7e25-4d26-a759-82ba9cdf7f52\") " Oct 12 06:13:25 crc kubenswrapper[4930]: I1012 06:13:25.320812 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkv7s\" (UniqueName: \"kubernetes.io/projected/b69cc4bf-7e25-4d26-a759-82ba9cdf7f52-kube-api-access-tkv7s\") pod \"b69cc4bf-7e25-4d26-a759-82ba9cdf7f52\" (UID: \"b69cc4bf-7e25-4d26-a759-82ba9cdf7f52\") " Oct 12 06:13:25 crc kubenswrapper[4930]: I1012 06:13:25.320990 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b69cc4bf-7e25-4d26-a759-82ba9cdf7f52-utilities\") pod \"b69cc4bf-7e25-4d26-a759-82ba9cdf7f52\" (UID: \"b69cc4bf-7e25-4d26-a759-82ba9cdf7f52\") " Oct 12 06:13:25 crc kubenswrapper[4930]: I1012 06:13:25.321679 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b69cc4bf-7e25-4d26-a759-82ba9cdf7f52-utilities" (OuterVolumeSpecName: "utilities") pod "b69cc4bf-7e25-4d26-a759-82ba9cdf7f52" (UID: "b69cc4bf-7e25-4d26-a759-82ba9cdf7f52"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:13:25 crc kubenswrapper[4930]: I1012 06:13:25.326914 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b69cc4bf-7e25-4d26-a759-82ba9cdf7f52-kube-api-access-tkv7s" (OuterVolumeSpecName: "kube-api-access-tkv7s") pod "b69cc4bf-7e25-4d26-a759-82ba9cdf7f52" (UID: "b69cc4bf-7e25-4d26-a759-82ba9cdf7f52"). InnerVolumeSpecName "kube-api-access-tkv7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:13:25 crc kubenswrapper[4930]: I1012 06:13:25.341875 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b69cc4bf-7e25-4d26-a759-82ba9cdf7f52-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b69cc4bf-7e25-4d26-a759-82ba9cdf7f52" (UID: "b69cc4bf-7e25-4d26-a759-82ba9cdf7f52"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:13:25 crc kubenswrapper[4930]: I1012 06:13:25.423163 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b69cc4bf-7e25-4d26-a759-82ba9cdf7f52-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 06:13:25 crc kubenswrapper[4930]: I1012 06:13:25.423207 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b69cc4bf-7e25-4d26-a759-82ba9cdf7f52-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 06:13:25 crc kubenswrapper[4930]: I1012 06:13:25.423228 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkv7s\" (UniqueName: \"kubernetes.io/projected/b69cc4bf-7e25-4d26-a759-82ba9cdf7f52-kube-api-access-tkv7s\") on node \"crc\" DevicePath \"\"" Oct 12 06:13:26 crc kubenswrapper[4930]: I1012 06:13:26.069448 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-shqh2" event={"ID":"b69cc4bf-7e25-4d26-a759-82ba9cdf7f52","Type":"ContainerDied","Data":"d802f4b65173ad30a9422934d514980837aa78aa6c2063de4ce578d32e283651"} Oct 12 06:13:26 crc kubenswrapper[4930]: I1012 06:13:26.069502 4930 scope.go:117] "RemoveContainer" containerID="099fc532c2a3f28d2e064fd4939a609688c3d6454ebc04ff4d7dca2287a8600c" Oct 12 06:13:26 crc kubenswrapper[4930]: I1012 06:13:26.069532 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-shqh2" Oct 12 06:13:26 crc kubenswrapper[4930]: I1012 06:13:26.107847 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-shqh2"] Oct 12 06:13:26 crc kubenswrapper[4930]: I1012 06:13:26.122035 4930 scope.go:117] "RemoveContainer" containerID="942c87da266126c8ba5ed31e69c0986336c75639bce462e6daca49498dae91cf" Oct 12 06:13:26 crc kubenswrapper[4930]: I1012 06:13:26.126712 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-shqh2"] Oct 12 06:13:26 crc kubenswrapper[4930]: I1012 06:13:26.144889 4930 scope.go:117] "RemoveContainer" containerID="b3007c7f77e046389369eea2a0b7497e752ab9de71cdd6c6638e22cc7c614bed" Oct 12 06:13:26 crc kubenswrapper[4930]: I1012 06:13:26.151325 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b69cc4bf-7e25-4d26-a759-82ba9cdf7f52" path="/var/lib/kubelet/pods/b69cc4bf-7e25-4d26-a759-82ba9cdf7f52/volumes" Oct 12 06:13:30 crc kubenswrapper[4930]: I1012 06:13:30.175373 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pmn98" Oct 12 06:13:30 crc kubenswrapper[4930]: I1012 06:13:30.176118 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pmn98" Oct 12 06:13:30 crc kubenswrapper[4930]: I1012 06:13:30.238044 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pmn98" Oct 12 06:13:31 crc kubenswrapper[4930]: I1012 06:13:31.188503 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pmn98" Oct 12 06:13:33 crc kubenswrapper[4930]: I1012 06:13:33.837969 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pmn98"] Oct 12 06:13:33 crc kubenswrapper[4930]: I1012 06:13:33.838755 4930 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/community-operators-pmn98" podUID="ff973a4e-9b08-424a-9a6b-ff8220afa0a4" containerName="registry-server" containerID="cri-o://582b68829d1ab4e7b1e3b7ec8ae701ac841a793a79c119eaf853c227f2bc3af0" gracePeriod=2 Oct 12 06:13:34 crc kubenswrapper[4930]: I1012 06:13:34.155792 4930 generic.go:334] "Generic (PLEG): container finished" podID="ff973a4e-9b08-424a-9a6b-ff8220afa0a4" containerID="582b68829d1ab4e7b1e3b7ec8ae701ac841a793a79c119eaf853c227f2bc3af0" exitCode=0 Oct 12 06:13:34 crc kubenswrapper[4930]: I1012 06:13:34.178966 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmn98" event={"ID":"ff973a4e-9b08-424a-9a6b-ff8220afa0a4","Type":"ContainerDied","Data":"582b68829d1ab4e7b1e3b7ec8ae701ac841a793a79c119eaf853c227f2bc3af0"} Oct 12 06:13:34 crc kubenswrapper[4930]: I1012 06:13:34.324756 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pmn98" Oct 12 06:13:34 crc kubenswrapper[4930]: I1012 06:13:34.418285 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff973a4e-9b08-424a-9a6b-ff8220afa0a4-catalog-content\") pod \"ff973a4e-9b08-424a-9a6b-ff8220afa0a4\" (UID: \"ff973a4e-9b08-424a-9a6b-ff8220afa0a4\") " Oct 12 06:13:34 crc kubenswrapper[4930]: I1012 06:13:34.418422 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff973a4e-9b08-424a-9a6b-ff8220afa0a4-utilities\") pod \"ff973a4e-9b08-424a-9a6b-ff8220afa0a4\" (UID: \"ff973a4e-9b08-424a-9a6b-ff8220afa0a4\") " Oct 12 06:13:34 crc kubenswrapper[4930]: I1012 06:13:34.418551 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp696\" (UniqueName: \"kubernetes.io/projected/ff973a4e-9b08-424a-9a6b-ff8220afa0a4-kube-api-access-dp696\") pod \"ff973a4e-9b08-424a-9a6b-ff8220afa0a4\" (UID: \"ff973a4e-9b08-424a-9a6b-ff8220afa0a4\") " Oct 12 06:13:34 crc kubenswrapper[4930]: I1012 06:13:34.419905 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff973a4e-9b08-424a-9a6b-ff8220afa0a4-utilities" (OuterVolumeSpecName: "utilities") pod "ff973a4e-9b08-424a-9a6b-ff8220afa0a4" (UID: "ff973a4e-9b08-424a-9a6b-ff8220afa0a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:13:34 crc kubenswrapper[4930]: I1012 06:13:34.425832 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff973a4e-9b08-424a-9a6b-ff8220afa0a4-kube-api-access-dp696" (OuterVolumeSpecName: "kube-api-access-dp696") pod "ff973a4e-9b08-424a-9a6b-ff8220afa0a4" (UID: "ff973a4e-9b08-424a-9a6b-ff8220afa0a4"). InnerVolumeSpecName "kube-api-access-dp696". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:13:34 crc kubenswrapper[4930]: I1012 06:13:34.480813 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff973a4e-9b08-424a-9a6b-ff8220afa0a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff973a4e-9b08-424a-9a6b-ff8220afa0a4" (UID: "ff973a4e-9b08-424a-9a6b-ff8220afa0a4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:13:34 crc kubenswrapper[4930]: I1012 06:13:34.520700 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff973a4e-9b08-424a-9a6b-ff8220afa0a4-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 06:13:34 crc kubenswrapper[4930]: I1012 06:13:34.520845 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp696\" (UniqueName: \"kubernetes.io/projected/ff973a4e-9b08-424a-9a6b-ff8220afa0a4-kube-api-access-dp696\") on node \"crc\" DevicePath \"\"" Oct 12 06:13:34 crc kubenswrapper[4930]: I1012 06:13:34.520863 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff973a4e-9b08-424a-9a6b-ff8220afa0a4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 06:13:35 crc kubenswrapper[4930]: I1012 06:13:35.166153 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmn98" event={"ID":"ff973a4e-9b08-424a-9a6b-ff8220afa0a4","Type":"ContainerDied","Data":"20a8f78756720ef741b4098fd7fcd2e44ecca789961af09cad8bef22a7002386"} Oct 12 06:13:35 crc kubenswrapper[4930]: I1012 06:13:35.166257 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pmn98" Oct 12 06:13:35 crc kubenswrapper[4930]: I1012 06:13:35.166403 4930 scope.go:117] "RemoveContainer" containerID="582b68829d1ab4e7b1e3b7ec8ae701ac841a793a79c119eaf853c227f2bc3af0" Oct 12 06:13:35 crc kubenswrapper[4930]: I1012 06:13:35.187353 4930 scope.go:117] "RemoveContainer" containerID="6d1fddb02a0933c881ac670693a28ac648cc43b1b13450d84ce2ae59f78088f2" Oct 12 06:13:35 crc kubenswrapper[4930]: I1012 06:13:35.212521 4930 scope.go:117] "RemoveContainer" containerID="a36cd7737086f8dcc590680056dd5e473a244911470bf143f2aa107569f3c780" Oct 12 06:13:35 crc kubenswrapper[4930]: I1012 06:13:35.214638 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pmn98"] Oct 12 06:13:35 crc kubenswrapper[4930]: I1012 06:13:35.223263 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pmn98"] Oct 12 06:13:36 crc kubenswrapper[4930]: I1012 06:13:36.150371 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff973a4e-9b08-424a-9a6b-ff8220afa0a4" path="/var/lib/kubelet/pods/ff973a4e-9b08-424a-9a6b-ff8220afa0a4/volumes" Oct 12 06:14:07 crc kubenswrapper[4930]: I1012 06:14:07.573563 4930 generic.go:334] "Generic (PLEG): container finished" podID="eebc8efc-b160-4a75-a213-74fcf9c2595e" containerID="9b246dc801fc7aca4aa9bae8a4b118c7ff0475852c7eb163f78b9130079177e4" exitCode=0 Oct 12 06:14:07 crc kubenswrapper[4930]: I1012 06:14:07.573608 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt" event={"ID":"eebc8efc-b160-4a75-a213-74fcf9c2595e","Type":"ContainerDied","Data":"9b246dc801fc7aca4aa9bae8a4b118c7ff0475852c7eb163f78b9130079177e4"} Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.142238 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.234413 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eebc8efc-b160-4a75-a213-74fcf9c2595e-inventory\") pod \"eebc8efc-b160-4a75-a213-74fcf9c2595e\" (UID: \"eebc8efc-b160-4a75-a213-74fcf9c2595e\") " Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.234518 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44xr5\" (UniqueName: \"kubernetes.io/projected/eebc8efc-b160-4a75-a213-74fcf9c2595e-kube-api-access-44xr5\") pod \"eebc8efc-b160-4a75-a213-74fcf9c2595e\" (UID: \"eebc8efc-b160-4a75-a213-74fcf9c2595e\") " Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.234556 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eebc8efc-b160-4a75-a213-74fcf9c2595e-ssh-key\") pod \"eebc8efc-b160-4a75-a213-74fcf9c2595e\" (UID: \"eebc8efc-b160-4a75-a213-74fcf9c2595e\") " Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.243334 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eebc8efc-b160-4a75-a213-74fcf9c2595e-kube-api-access-44xr5" (OuterVolumeSpecName: "kube-api-access-44xr5") pod "eebc8efc-b160-4a75-a213-74fcf9c2595e" (UID: "eebc8efc-b160-4a75-a213-74fcf9c2595e"). InnerVolumeSpecName "kube-api-access-44xr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.275395 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eebc8efc-b160-4a75-a213-74fcf9c2595e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "eebc8efc-b160-4a75-a213-74fcf9c2595e" (UID: "eebc8efc-b160-4a75-a213-74fcf9c2595e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.288843 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eebc8efc-b160-4a75-a213-74fcf9c2595e-inventory" (OuterVolumeSpecName: "inventory") pod "eebc8efc-b160-4a75-a213-74fcf9c2595e" (UID: "eebc8efc-b160-4a75-a213-74fcf9c2595e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.337264 4930 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eebc8efc-b160-4a75-a213-74fcf9c2595e-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.337314 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44xr5\" (UniqueName: \"kubernetes.io/projected/eebc8efc-b160-4a75-a213-74fcf9c2595e-kube-api-access-44xr5\") on node \"crc\" DevicePath \"\"" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.337336 4930 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eebc8efc-b160-4a75-a213-74fcf9c2595e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.598637 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt" event={"ID":"eebc8efc-b160-4a75-a213-74fcf9c2595e","Type":"ContainerDied","Data":"db2d923b9d6e284d4e62a09778e31e724eda8f702a71c510683c3bb3d974e827"} Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.598704 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db2d923b9d6e284d4e62a09778e31e724eda8f702a71c510683c3bb3d974e827" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.598719 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.735918 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-xksvf"] Oct 12 06:14:09 crc kubenswrapper[4930]: E1012 06:14:09.736448 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff973a4e-9b08-424a-9a6b-ff8220afa0a4" containerName="registry-server" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.736480 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff973a4e-9b08-424a-9a6b-ff8220afa0a4" containerName="registry-server" Oct 12 06:14:09 crc kubenswrapper[4930]: E1012 06:14:09.736502 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eebc8efc-b160-4a75-a213-74fcf9c2595e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.736515 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="eebc8efc-b160-4a75-a213-74fcf9c2595e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 12 06:14:09 crc kubenswrapper[4930]: E1012 06:14:09.736550 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b69cc4bf-7e25-4d26-a759-82ba9cdf7f52" containerName="extract-content" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.736561 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="b69cc4bf-7e25-4d26-a759-82ba9cdf7f52" containerName="extract-content" Oct 12 06:14:09 crc kubenswrapper[4930]: E1012 06:14:09.736587 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b69cc4bf-7e25-4d26-a759-82ba9cdf7f52" containerName="registry-server" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.736596 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="b69cc4bf-7e25-4d26-a759-82ba9cdf7f52" containerName="registry-server" Oct 12 06:14:09 crc kubenswrapper[4930]: E1012 06:14:09.736615 4930 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="ff973a4e-9b08-424a-9a6b-ff8220afa0a4" containerName="extract-utilities" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.736628 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff973a4e-9b08-424a-9a6b-ff8220afa0a4" containerName="extract-utilities" Oct 12 06:14:09 crc kubenswrapper[4930]: E1012 06:14:09.736660 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff973a4e-9b08-424a-9a6b-ff8220afa0a4" containerName="extract-content" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.736670 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff973a4e-9b08-424a-9a6b-ff8220afa0a4" containerName="extract-content" Oct 12 06:14:09 crc kubenswrapper[4930]: E1012 06:14:09.736683 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b69cc4bf-7e25-4d26-a759-82ba9cdf7f52" containerName="extract-utilities" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.736692 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="b69cc4bf-7e25-4d26-a759-82ba9cdf7f52" containerName="extract-utilities" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.737015 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="b69cc4bf-7e25-4d26-a759-82ba9cdf7f52" containerName="registry-server" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.737074 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff973a4e-9b08-424a-9a6b-ff8220afa0a4" containerName="registry-server" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.737103 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="eebc8efc-b160-4a75-a213-74fcf9c2595e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.738080 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xksvf" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.740053 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.745123 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.745391 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vhlxg" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.745701 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-xksvf"] Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.747433 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.855909 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79x95\" (UniqueName: \"kubernetes.io/projected/0b9f3033-c01f-46bf-9d12-3e60310ec6f3-kube-api-access-79x95\") pod \"ssh-known-hosts-edpm-deployment-xksvf\" (UID: \"0b9f3033-c01f-46bf-9d12-3e60310ec6f3\") " pod="openstack/ssh-known-hosts-edpm-deployment-xksvf" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.855973 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0b9f3033-c01f-46bf-9d12-3e60310ec6f3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-xksvf\" (UID: \"0b9f3033-c01f-46bf-9d12-3e60310ec6f3\") " pod="openstack/ssh-known-hosts-edpm-deployment-xksvf" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.856017 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b9f3033-c01f-46bf-9d12-3e60310ec6f3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-xksvf\" (UID: \"0b9f3033-c01f-46bf-9d12-3e60310ec6f3\") " pod="openstack/ssh-known-hosts-edpm-deployment-xksvf" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.958532 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79x95\" (UniqueName: \"kubernetes.io/projected/0b9f3033-c01f-46bf-9d12-3e60310ec6f3-kube-api-access-79x95\") pod \"ssh-known-hosts-edpm-deployment-xksvf\" (UID: \"0b9f3033-c01f-46bf-9d12-3e60310ec6f3\") " pod="openstack/ssh-known-hosts-edpm-deployment-xksvf" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.958621 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0b9f3033-c01f-46bf-9d12-3e60310ec6f3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-xksvf\" (UID: \"0b9f3033-c01f-46bf-9d12-3e60310ec6f3\") " pod="openstack/ssh-known-hosts-edpm-deployment-xksvf" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.958682 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b9f3033-c01f-46bf-9d12-3e60310ec6f3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-xksvf\" (UID: \"0b9f3033-c01f-46bf-9d12-3e60310ec6f3\") " pod="openstack/ssh-known-hosts-edpm-deployment-xksvf" Oct 12 06:14:09 crc 
kubenswrapper[4930]: I1012 06:14:09.966884 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b9f3033-c01f-46bf-9d12-3e60310ec6f3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-xksvf\" (UID: \"0b9f3033-c01f-46bf-9d12-3e60310ec6f3\") " pod="openstack/ssh-known-hosts-edpm-deployment-xksvf" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.967064 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0b9f3033-c01f-46bf-9d12-3e60310ec6f3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-xksvf\" (UID: \"0b9f3033-c01f-46bf-9d12-3e60310ec6f3\") " pod="openstack/ssh-known-hosts-edpm-deployment-xksvf" Oct 12 06:14:09 crc kubenswrapper[4930]: I1012 06:14:09.988844 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79x95\" (UniqueName: \"kubernetes.io/projected/0b9f3033-c01f-46bf-9d12-3e60310ec6f3-kube-api-access-79x95\") pod \"ssh-known-hosts-edpm-deployment-xksvf\" (UID: \"0b9f3033-c01f-46bf-9d12-3e60310ec6f3\") " pod="openstack/ssh-known-hosts-edpm-deployment-xksvf" Oct 12 06:14:10 crc kubenswrapper[4930]: I1012 06:14:10.060262 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xksvf" Oct 12 06:14:10 crc kubenswrapper[4930]: I1012 06:14:10.634262 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-xksvf"] Oct 12 06:14:10 crc kubenswrapper[4930]: W1012 06:14:10.638214 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b9f3033_c01f_46bf_9d12_3e60310ec6f3.slice/crio-8aa3ade66b17f4f83e17f7aff537a607b04e012407896c698dfe729c64e27378 WatchSource:0}: Error finding container 8aa3ade66b17f4f83e17f7aff537a607b04e012407896c698dfe729c64e27378: Status 404 returned error can't find the container with id 8aa3ade66b17f4f83e17f7aff537a607b04e012407896c698dfe729c64e27378 Oct 12 06:14:11 crc kubenswrapper[4930]: I1012 06:14:11.620078 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xksvf" event={"ID":"0b9f3033-c01f-46bf-9d12-3e60310ec6f3","Type":"ContainerStarted","Data":"56a0af05cf3f546732119ee355ecf663848e597431e421d81331acbec20a3853"} Oct 12 06:14:11 crc kubenswrapper[4930]: I1012 06:14:11.620723 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xksvf" event={"ID":"0b9f3033-c01f-46bf-9d12-3e60310ec6f3","Type":"ContainerStarted","Data":"8aa3ade66b17f4f83e17f7aff537a607b04e012407896c698dfe729c64e27378"} Oct 12 06:14:19 crc kubenswrapper[4930]: I1012 06:14:19.719083 4930 generic.go:334] "Generic (PLEG): container finished" podID="0b9f3033-c01f-46bf-9d12-3e60310ec6f3" containerID="56a0af05cf3f546732119ee355ecf663848e597431e421d81331acbec20a3853" exitCode=0 Oct 12 06:14:19 crc kubenswrapper[4930]: I1012 06:14:19.719348 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xksvf" event={"ID":"0b9f3033-c01f-46bf-9d12-3e60310ec6f3","Type":"ContainerDied","Data":"56a0af05cf3f546732119ee355ecf663848e597431e421d81331acbec20a3853"} Oct 12 06:14:21 crc kubenswrapper[4930]: I1012 06:14:21.268142 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xksvf" Oct 12 06:14:21 crc kubenswrapper[4930]: I1012 06:14:21.411352 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79x95\" (UniqueName: \"kubernetes.io/projected/0b9f3033-c01f-46bf-9d12-3e60310ec6f3-kube-api-access-79x95\") pod \"0b9f3033-c01f-46bf-9d12-3e60310ec6f3\" (UID: \"0b9f3033-c01f-46bf-9d12-3e60310ec6f3\") " Oct 12 06:14:21 crc kubenswrapper[4930]: I1012 06:14:21.411487 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0b9f3033-c01f-46bf-9d12-3e60310ec6f3-inventory-0\") pod \"0b9f3033-c01f-46bf-9d12-3e60310ec6f3\" (UID: \"0b9f3033-c01f-46bf-9d12-3e60310ec6f3\") " Oct 12 06:14:21 crc kubenswrapper[4930]: I1012 06:14:21.411624 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b9f3033-c01f-46bf-9d12-3e60310ec6f3-ssh-key-openstack-edpm-ipam\") pod \"0b9f3033-c01f-46bf-9d12-3e60310ec6f3\" (UID: \"0b9f3033-c01f-46bf-9d12-3e60310ec6f3\") " Oct 12 06:14:21 crc kubenswrapper[4930]: I1012 06:14:21.430823 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b9f3033-c01f-46bf-9d12-3e60310ec6f3-kube-api-access-79x95" (OuterVolumeSpecName: "kube-api-access-79x95") pod "0b9f3033-c01f-46bf-9d12-3e60310ec6f3" (UID: "0b9f3033-c01f-46bf-9d12-3e60310ec6f3"). InnerVolumeSpecName "kube-api-access-79x95". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:14:21 crc kubenswrapper[4930]: I1012 06:14:21.446203 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b9f3033-c01f-46bf-9d12-3e60310ec6f3-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "0b9f3033-c01f-46bf-9d12-3e60310ec6f3" (UID: "0b9f3033-c01f-46bf-9d12-3e60310ec6f3"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:14:21 crc kubenswrapper[4930]: I1012 06:14:21.464304 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b9f3033-c01f-46bf-9d12-3e60310ec6f3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0b9f3033-c01f-46bf-9d12-3e60310ec6f3" (UID: "0b9f3033-c01f-46bf-9d12-3e60310ec6f3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:14:21 crc kubenswrapper[4930]: I1012 06:14:21.513931 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79x95\" (UniqueName: \"kubernetes.io/projected/0b9f3033-c01f-46bf-9d12-3e60310ec6f3-kube-api-access-79x95\") on node \"crc\" DevicePath \"\"" Oct 12 06:14:21 crc kubenswrapper[4930]: I1012 06:14:21.513964 4930 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0b9f3033-c01f-46bf-9d12-3e60310ec6f3-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 12 06:14:21 crc kubenswrapper[4930]: I1012 06:14:21.513974 4930 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b9f3033-c01f-46bf-9d12-3e60310ec6f3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 12 06:14:21 crc kubenswrapper[4930]: I1012 06:14:21.746699 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xksvf" event={"ID":"0b9f3033-c01f-46bf-9d12-3e60310ec6f3","Type":"ContainerDied","Data":"8aa3ade66b17f4f83e17f7aff537a607b04e012407896c698dfe729c64e27378"} Oct 12 06:14:21 crc kubenswrapper[4930]: I1012 06:14:21.747142 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8aa3ade66b17f4f83e17f7aff537a607b04e012407896c698dfe729c64e27378" Oct 12 06:14:21 crc kubenswrapper[4930]: I1012 06:14:21.746807 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xksvf" Oct 12 06:14:21 crc kubenswrapper[4930]: I1012 06:14:21.847252 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-nnxtv"] Oct 12 06:14:21 crc kubenswrapper[4930]: E1012 06:14:21.849394 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9f3033-c01f-46bf-9d12-3e60310ec6f3" containerName="ssh-known-hosts-edpm-deployment" Oct 12 06:14:21 crc kubenswrapper[4930]: I1012 06:14:21.849685 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9f3033-c01f-46bf-9d12-3e60310ec6f3" containerName="ssh-known-hosts-edpm-deployment" Oct 12 06:14:21 crc kubenswrapper[4930]: I1012 06:14:21.850879 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b9f3033-c01f-46bf-9d12-3e60310ec6f3" containerName="ssh-known-hosts-edpm-deployment" Oct 12 06:14:21 crc kubenswrapper[4930]: I1012 06:14:21.854081 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nnxtv" Oct 12 06:14:21 crc kubenswrapper[4930]: I1012 06:14:21.873309 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vhlxg" Oct 12 06:14:21 crc kubenswrapper[4930]: I1012 06:14:21.873685 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-nnxtv"] Oct 12 06:14:21 crc kubenswrapper[4930]: I1012 06:14:21.873689 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 06:14:21 crc kubenswrapper[4930]: I1012 06:14:21.873768 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 06:14:21 crc kubenswrapper[4930]: I1012 06:14:21.873779 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 06:14:22 crc kubenswrapper[4930]: I1012 06:14:22.025308 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d23402d4-c8f9-4e04-9972-f40037dadec9-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nnxtv\" (UID: \"d23402d4-c8f9-4e04-9972-f40037dadec9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nnxtv" Oct 12 06:14:22 crc kubenswrapper[4930]: I1012 06:14:22.025506 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d23402d4-c8f9-4e04-9972-f40037dadec9-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nnxtv\" (UID: \"d23402d4-c8f9-4e04-9972-f40037dadec9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nnxtv" Oct 12 06:14:22 crc kubenswrapper[4930]: I1012 06:14:22.025642 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbhgr\" (UniqueName: \"kubernetes.io/projected/d23402d4-c8f9-4e04-9972-f40037dadec9-kube-api-access-dbhgr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nnxtv\" (UID: \"d23402d4-c8f9-4e04-9972-f40037dadec9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nnxtv" Oct 12 06:14:22 crc kubenswrapper[4930]: I1012 06:14:22.127599 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d23402d4-c8f9-4e04-9972-f40037dadec9-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nnxtv\" (UID: \"d23402d4-c8f9-4e04-9972-f40037dadec9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nnxtv" Oct 12 06:14:22 crc kubenswrapper[4930]: I1012 06:14:22.127669 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d23402d4-c8f9-4e04-9972-f40037dadec9-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nnxtv\" (UID: \"d23402d4-c8f9-4e04-9972-f40037dadec9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nnxtv" Oct 12 06:14:22 crc kubenswrapper[4930]: I1012 06:14:22.127729 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbhgr\" (UniqueName: \"kubernetes.io/projected/d23402d4-c8f9-4e04-9972-f40037dadec9-kube-api-access-dbhgr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nnxtv\" (UID: \"d23402d4-c8f9-4e04-9972-f40037dadec9\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nnxtv" Oct 12 06:14:22 crc kubenswrapper[4930]: I1012 06:14:22.133312 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d23402d4-c8f9-4e04-9972-f40037dadec9-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nnxtv\" (UID: \"d23402d4-c8f9-4e04-9972-f40037dadec9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nnxtv" Oct 12 06:14:22 crc kubenswrapper[4930]: I1012 06:14:22.133672 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d23402d4-c8f9-4e04-9972-f40037dadec9-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nnxtv\" (UID: \"d23402d4-c8f9-4e04-9972-f40037dadec9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nnxtv" Oct 12 06:14:22 crc kubenswrapper[4930]: I1012 06:14:22.155559 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbhgr\" (UniqueName: \"kubernetes.io/projected/d23402d4-c8f9-4e04-9972-f40037dadec9-kube-api-access-dbhgr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nnxtv\" (UID: \"d23402d4-c8f9-4e04-9972-f40037dadec9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nnxtv" Oct 12 06:14:22 crc kubenswrapper[4930]: I1012 06:14:22.198404 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nnxtv" Oct 12 06:14:22 crc kubenswrapper[4930]: I1012 06:14:22.852512 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-nnxtv"] Oct 12 06:14:22 crc kubenswrapper[4930]: W1012 06:14:22.857899 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd23402d4_c8f9_4e04_9972_f40037dadec9.slice/crio-6b7a1706b6b19d6d6c116cb0d8ae34029aa77f4c85874d9196a3687345a6aedb WatchSource:0}: Error finding container 6b7a1706b6b19d6d6c116cb0d8ae34029aa77f4c85874d9196a3687345a6aedb: Status 404 returned error can't find the container with id 6b7a1706b6b19d6d6c116cb0d8ae34029aa77f4c85874d9196a3687345a6aedb Oct 12 06:14:23 crc kubenswrapper[4930]: I1012 06:14:23.774841 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nnxtv" event={"ID":"d23402d4-c8f9-4e04-9972-f40037dadec9","Type":"ContainerStarted","Data":"6b7a1706b6b19d6d6c116cb0d8ae34029aa77f4c85874d9196a3687345a6aedb"} Oct 12 06:14:24 crc kubenswrapper[4930]: I1012 06:14:24.787130 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nnxtv" event={"ID":"d23402d4-c8f9-4e04-9972-f40037dadec9","Type":"ContainerStarted","Data":"9f77bf2d348d6c675653271d2fab2290b37a97ced5c5885c3c5da0fd1e295299"} Oct 12 06:14:24 crc kubenswrapper[4930]: I1012 06:14:24.814485 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nnxtv" podStartSLOduration=3.162023119 podStartE2EDuration="3.814466112s" podCreationTimestamp="2025-10-12 06:14:21 +0000 UTC" firstStartedPulling="2025-10-12 06:14:22.861106871 +0000 UTC m=+1995.403208626" lastFinishedPulling="2025-10-12 06:14:23.513549814 +0000 UTC m=+1996.055651619" observedRunningTime="2025-10-12 06:14:24.805815797 +0000 UTC m=+1997.347917562" watchObservedRunningTime="2025-10-12 06:14:24.814466112 +0000 UTC 
m=+1997.356567867" Oct 12 06:14:33 crc kubenswrapper[4930]: I1012 06:14:33.918364 4930 generic.go:334] "Generic (PLEG): container finished" podID="d23402d4-c8f9-4e04-9972-f40037dadec9" containerID="9f77bf2d348d6c675653271d2fab2290b37a97ced5c5885c3c5da0fd1e295299" exitCode=0 Oct 12 06:14:33 crc kubenswrapper[4930]: I1012 06:14:33.918539 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nnxtv" event={"ID":"d23402d4-c8f9-4e04-9972-f40037dadec9","Type":"ContainerDied","Data":"9f77bf2d348d6c675653271d2fab2290b37a97ced5c5885c3c5da0fd1e295299"} Oct 12 06:14:35 crc kubenswrapper[4930]: I1012 06:14:35.518960 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nnxtv" Oct 12 06:14:35 crc kubenswrapper[4930]: I1012 06:14:35.558677 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d23402d4-c8f9-4e04-9972-f40037dadec9-ssh-key\") pod \"d23402d4-c8f9-4e04-9972-f40037dadec9\" (UID: \"d23402d4-c8f9-4e04-9972-f40037dadec9\") " Oct 12 06:14:35 crc kubenswrapper[4930]: I1012 06:14:35.558860 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d23402d4-c8f9-4e04-9972-f40037dadec9-inventory\") pod \"d23402d4-c8f9-4e04-9972-f40037dadec9\" (UID: \"d23402d4-c8f9-4e04-9972-f40037dadec9\") " Oct 12 06:14:35 crc kubenswrapper[4930]: I1012 06:14:35.558945 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbhgr\" (UniqueName: \"kubernetes.io/projected/d23402d4-c8f9-4e04-9972-f40037dadec9-kube-api-access-dbhgr\") pod \"d23402d4-c8f9-4e04-9972-f40037dadec9\" (UID: \"d23402d4-c8f9-4e04-9972-f40037dadec9\") " Oct 12 06:14:35 crc kubenswrapper[4930]: I1012 06:14:35.566947 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d23402d4-c8f9-4e04-9972-f40037dadec9-kube-api-access-dbhgr" (OuterVolumeSpecName: "kube-api-access-dbhgr") pod "d23402d4-c8f9-4e04-9972-f40037dadec9" (UID: "d23402d4-c8f9-4e04-9972-f40037dadec9"). InnerVolumeSpecName "kube-api-access-dbhgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:14:35 crc kubenswrapper[4930]: I1012 06:14:35.603004 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d23402d4-c8f9-4e04-9972-f40037dadec9-inventory" (OuterVolumeSpecName: "inventory") pod "d23402d4-c8f9-4e04-9972-f40037dadec9" (UID: "d23402d4-c8f9-4e04-9972-f40037dadec9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:14:35 crc kubenswrapper[4930]: I1012 06:14:35.610709 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d23402d4-c8f9-4e04-9972-f40037dadec9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d23402d4-c8f9-4e04-9972-f40037dadec9" (UID: "d23402d4-c8f9-4e04-9972-f40037dadec9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:14:35 crc kubenswrapper[4930]: I1012 06:14:35.661970 4930 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d23402d4-c8f9-4e04-9972-f40037dadec9-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 06:14:35 crc kubenswrapper[4930]: I1012 06:14:35.662003 4930 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d23402d4-c8f9-4e04-9972-f40037dadec9-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 06:14:35 crc kubenswrapper[4930]: I1012 06:14:35.662019 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbhgr\" (UniqueName: \"kubernetes.io/projected/d23402d4-c8f9-4e04-9972-f40037dadec9-kube-api-access-dbhgr\") on node \"crc\" DevicePath \"\"" Oct 12 06:14:35 crc kubenswrapper[4930]: I1012 06:14:35.945894 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nnxtv" event={"ID":"d23402d4-c8f9-4e04-9972-f40037dadec9","Type":"ContainerDied","Data":"6b7a1706b6b19d6d6c116cb0d8ae34029aa77f4c85874d9196a3687345a6aedb"} Oct 12 06:14:35 crc kubenswrapper[4930]: I1012 06:14:35.946306 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b7a1706b6b19d6d6c116cb0d8ae34029aa77f4c85874d9196a3687345a6aedb" Oct 12 06:14:35 crc kubenswrapper[4930]: I1012 06:14:35.946152 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nnxtv" Oct 12 06:14:36 crc kubenswrapper[4930]: I1012 06:14:36.056687 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt"] Oct 12 06:14:36 crc kubenswrapper[4930]: E1012 06:14:36.057330 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d23402d4-c8f9-4e04-9972-f40037dadec9" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 12 06:14:36 crc kubenswrapper[4930]: I1012 06:14:36.057355 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="d23402d4-c8f9-4e04-9972-f40037dadec9" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 12 06:14:36 crc kubenswrapper[4930]: I1012 06:14:36.057611 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="d23402d4-c8f9-4e04-9972-f40037dadec9" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 12 06:14:36 crc kubenswrapper[4930]: I1012 06:14:36.058590 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt" Oct 12 06:14:36 crc kubenswrapper[4930]: I1012 06:14:36.062302 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 06:14:36 crc kubenswrapper[4930]: I1012 06:14:36.062343 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 06:14:36 crc kubenswrapper[4930]: I1012 06:14:36.063234 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 06:14:36 crc kubenswrapper[4930]: I1012 06:14:36.064728 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vhlxg" Oct 12 06:14:36 crc kubenswrapper[4930]: I1012 06:14:36.070161 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb8bl\" (UniqueName: \"kubernetes.io/projected/5afc7c35-49e0-45d7-a3fd-ab6584abe8a7-kube-api-access-mb8bl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt\" (UID: \"5afc7c35-49e0-45d7-a3fd-ab6584abe8a7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt" Oct 12 06:14:36 crc kubenswrapper[4930]: I1012 06:14:36.070355 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5afc7c35-49e0-45d7-a3fd-ab6584abe8a7-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt\" (UID: \"5afc7c35-49e0-45d7-a3fd-ab6584abe8a7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt" Oct 12 06:14:36 crc kubenswrapper[4930]: I1012 06:14:36.070538 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5afc7c35-49e0-45d7-a3fd-ab6584abe8a7-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt\" (UID: \"5afc7c35-49e0-45d7-a3fd-ab6584abe8a7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt" Oct 12 06:14:36 crc kubenswrapper[4930]: I1012 06:14:36.073179 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt"] Oct 12 06:14:36 crc kubenswrapper[4930]: I1012 06:14:36.174692 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5afc7c35-49e0-45d7-a3fd-ab6584abe8a7-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt\" (UID: \"5afc7c35-49e0-45d7-a3fd-ab6584abe8a7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt" Oct 12 06:14:36 crc kubenswrapper[4930]: I1012 06:14:36.174774 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5afc7c35-49e0-45d7-a3fd-ab6584abe8a7-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt\" (UID: \"5afc7c35-49e0-45d7-a3fd-ab6584abe8a7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt" Oct 12 06:14:36 crc kubenswrapper[4930]: I1012 06:14:36.174874 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb8bl\" (UniqueName: \"kubernetes.io/projected/5afc7c35-49e0-45d7-a3fd-ab6584abe8a7-kube-api-access-mb8bl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt\" (UID: 
\"5afc7c35-49e0-45d7-a3fd-ab6584abe8a7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt" Oct 12 06:14:36 crc kubenswrapper[4930]: I1012 06:14:36.178769 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5afc7c35-49e0-45d7-a3fd-ab6584abe8a7-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt\" (UID: \"5afc7c35-49e0-45d7-a3fd-ab6584abe8a7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt" Oct 12 06:14:36 crc kubenswrapper[4930]: I1012 06:14:36.179303 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5afc7c35-49e0-45d7-a3fd-ab6584abe8a7-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt\" (UID: \"5afc7c35-49e0-45d7-a3fd-ab6584abe8a7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt" Oct 12 06:14:36 crc kubenswrapper[4930]: I1012 06:14:36.196492 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb8bl\" (UniqueName: \"kubernetes.io/projected/5afc7c35-49e0-45d7-a3fd-ab6584abe8a7-kube-api-access-mb8bl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt\" (UID: \"5afc7c35-49e0-45d7-a3fd-ab6584abe8a7\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt" Oct 12 06:14:36 crc kubenswrapper[4930]: I1012 06:14:36.382982 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt" Oct 12 06:14:36 crc kubenswrapper[4930]: I1012 06:14:36.980553 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt"] Oct 12 06:14:36 crc kubenswrapper[4930]: W1012 06:14:36.993697 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5afc7c35_49e0_45d7_a3fd_ab6584abe8a7.slice/crio-5bc05fbb1465591178cbed4a13ff6f899b1205d67f0316db87997f444c27febb WatchSource:0}: Error finding container 5bc05fbb1465591178cbed4a13ff6f899b1205d67f0316db87997f444c27febb: Status 404 returned error can't find the container with id 5bc05fbb1465591178cbed4a13ff6f899b1205d67f0316db87997f444c27febb Oct 12 06:14:37 crc kubenswrapper[4930]: I1012 06:14:37.969509 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt" event={"ID":"5afc7c35-49e0-45d7-a3fd-ab6584abe8a7","Type":"ContainerStarted","Data":"240d3d556c33f9481a7479174a30052f090994cc04b7e9a95a4d67d9c980b053"} Oct 12 06:14:37 crc kubenswrapper[4930]: I1012 06:14:37.969845 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt" event={"ID":"5afc7c35-49e0-45d7-a3fd-ab6584abe8a7","Type":"ContainerStarted","Data":"5bc05fbb1465591178cbed4a13ff6f899b1205d67f0316db87997f444c27febb"} Oct 12 06:14:38 crc kubenswrapper[4930]: I1012 06:14:38.004548 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt" podStartSLOduration=1.319822892 podStartE2EDuration="2.004526259s" podCreationTimestamp="2025-10-12 06:14:36 +0000 UTC" firstStartedPulling="2025-10-12 06:14:37.00073116 +0000 UTC m=+2009.542832985" lastFinishedPulling="2025-10-12 06:14:37.685434577 +0000 UTC m=+2010.227536352" observedRunningTime="2025-10-12 06:14:37.989715931 +0000 UTC m=+2010.531817736" 
watchObservedRunningTime="2025-10-12 06:14:38.004526259 +0000 UTC m=+2010.546628034" Oct 12 06:14:50 crc kubenswrapper[4930]: I1012 06:14:50.100077 4930 generic.go:334] "Generic (PLEG): container finished" podID="5afc7c35-49e0-45d7-a3fd-ab6584abe8a7" containerID="240d3d556c33f9481a7479174a30052f090994cc04b7e9a95a4d67d9c980b053" exitCode=0 Oct 12 06:14:50 crc kubenswrapper[4930]: I1012 06:14:50.100353 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt" event={"ID":"5afc7c35-49e0-45d7-a3fd-ab6584abe8a7","Type":"ContainerDied","Data":"240d3d556c33f9481a7479174a30052f090994cc04b7e9a95a4d67d9c980b053"} Oct 12 06:14:51 crc kubenswrapper[4930]: I1012 06:14:51.611026 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt" Oct 12 06:14:51 crc kubenswrapper[4930]: I1012 06:14:51.738829 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5afc7c35-49e0-45d7-a3fd-ab6584abe8a7-inventory\") pod \"5afc7c35-49e0-45d7-a3fd-ab6584abe8a7\" (UID: \"5afc7c35-49e0-45d7-a3fd-ab6584abe8a7\") " Oct 12 06:14:51 crc kubenswrapper[4930]: I1012 06:14:51.739010 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5afc7c35-49e0-45d7-a3fd-ab6584abe8a7-ssh-key\") pod \"5afc7c35-49e0-45d7-a3fd-ab6584abe8a7\" (UID: \"5afc7c35-49e0-45d7-a3fd-ab6584abe8a7\") " Oct 12 06:14:51 crc kubenswrapper[4930]: I1012 06:14:51.739155 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mb8bl\" (UniqueName: \"kubernetes.io/projected/5afc7c35-49e0-45d7-a3fd-ab6584abe8a7-kube-api-access-mb8bl\") pod \"5afc7c35-49e0-45d7-a3fd-ab6584abe8a7\" (UID: \"5afc7c35-49e0-45d7-a3fd-ab6584abe8a7\") " Oct 12 06:14:51 crc kubenswrapper[4930]: I1012 06:14:51.748979 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5afc7c35-49e0-45d7-a3fd-ab6584abe8a7-kube-api-access-mb8bl" (OuterVolumeSpecName: "kube-api-access-mb8bl") pod "5afc7c35-49e0-45d7-a3fd-ab6584abe8a7" (UID: "5afc7c35-49e0-45d7-a3fd-ab6584abe8a7"). InnerVolumeSpecName "kube-api-access-mb8bl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:14:51 crc kubenswrapper[4930]: I1012 06:14:51.768345 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5afc7c35-49e0-45d7-a3fd-ab6584abe8a7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5afc7c35-49e0-45d7-a3fd-ab6584abe8a7" (UID: "5afc7c35-49e0-45d7-a3fd-ab6584abe8a7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:14:51 crc kubenswrapper[4930]: I1012 06:14:51.786937 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5afc7c35-49e0-45d7-a3fd-ab6584abe8a7-inventory" (OuterVolumeSpecName: "inventory") pod "5afc7c35-49e0-45d7-a3fd-ab6584abe8a7" (UID: "5afc7c35-49e0-45d7-a3fd-ab6584abe8a7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:14:51 crc kubenswrapper[4930]: I1012 06:14:51.842053 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mb8bl\" (UniqueName: \"kubernetes.io/projected/5afc7c35-49e0-45d7-a3fd-ab6584abe8a7-kube-api-access-mb8bl\") on node \"crc\" DevicePath \"\"" Oct 12 06:14:51 crc kubenswrapper[4930]: I1012 06:14:51.842091 4930 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5afc7c35-49e0-45d7-a3fd-ab6584abe8a7-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 06:14:51 crc kubenswrapper[4930]: I1012 06:14:51.842106 4930 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5afc7c35-49e0-45d7-a3fd-ab6584abe8a7-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.126621 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt" event={"ID":"5afc7c35-49e0-45d7-a3fd-ab6584abe8a7","Type":"ContainerDied","Data":"5bc05fbb1465591178cbed4a13ff6f899b1205d67f0316db87997f444c27febb"} Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.127043 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bc05fbb1465591178cbed4a13ff6f899b1205d67f0316db87997f444c27febb" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.126703 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.244288 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9"] Oct 12 06:14:52 crc kubenswrapper[4930]: E1012 06:14:52.244700 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5afc7c35-49e0-45d7-a3fd-ab6584abe8a7" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.244717 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="5afc7c35-49e0-45d7-a3fd-ab6584abe8a7" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.244918 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="5afc7c35-49e0-45d7-a3fd-ab6584abe8a7" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.245922 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.257642 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.257842 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.257842 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.257941 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.258015 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vhlxg" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.258166 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.258182 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.258305 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.268589 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9"] Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.360972 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.361066 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.361162 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.361200 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.361255 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.361295 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzkg2\" (UniqueName: \"kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-kube-api-access-lzkg2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.361370 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.361429 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.361503 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.361570 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.361631 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: 
\"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.361767 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.362840 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.363384 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.465547 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.465614 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.465633 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.465656 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc 
kubenswrapper[4930]: I1012 06:14:52.465682 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.465723 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.465757 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.465780 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.465797 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzkg2\" (UniqueName: \"kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-kube-api-access-lzkg2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.465836 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.465881 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.465914 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.465945 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.465964 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.470301 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.472608 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.472608 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.473826 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.474855 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 
06:14:52.475049 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.475200 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.475814 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.475841 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.476270 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.476513 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.481043 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.476041 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: 
\"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.492889 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzkg2\" (UniqueName: \"kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-kube-api-access-lzkg2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:52 crc kubenswrapper[4930]: I1012 06:14:52.574526 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:14:53 crc kubenswrapper[4930]: W1012 06:14:53.168223 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91938dad_6cac_4246_94d7_d93214ae2a5d.slice/crio-7caa1b115e41c072ff64595dfbb10bf46f96c6c27685941e076985b522cdf572 WatchSource:0}: Error finding container 7caa1b115e41c072ff64595dfbb10bf46f96c6c27685941e076985b522cdf572: Status 404 returned error can't find the container with id 7caa1b115e41c072ff64595dfbb10bf46f96c6c27685941e076985b522cdf572 Oct 12 06:14:53 crc kubenswrapper[4930]: I1012 06:14:53.179484 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9"] Oct 12 06:14:54 crc kubenswrapper[4930]: I1012 06:14:54.150390 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" event={"ID":"91938dad-6cac-4246-94d7-d93214ae2a5d","Type":"ContainerStarted","Data":"759a13dac69d92e810bc46e91094475766a8229def36b9b19e19cac330c42ec9"} Oct 12 06:14:54 crc kubenswrapper[4930]: I1012 06:14:54.151124 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" event={"ID":"91938dad-6cac-4246-94d7-d93214ae2a5d","Type":"ContainerStarted","Data":"7caa1b115e41c072ff64595dfbb10bf46f96c6c27685941e076985b522cdf572"} Oct 12 06:14:54 crc kubenswrapper[4930]: I1012 06:14:54.182986 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" podStartSLOduration=1.626737945 podStartE2EDuration="2.182966467s" podCreationTimestamp="2025-10-12 06:14:52 +0000 UTC" firstStartedPulling="2025-10-12 06:14:53.170761729 +0000 UTC m=+2025.712863514" lastFinishedPulling="2025-10-12 06:14:53.726990271 +0000 UTC m=+2026.269092036" observedRunningTime="2025-10-12 06:14:54.169280897 +0000 UTC m=+2026.711382672" watchObservedRunningTime="2025-10-12 06:14:54.182966467 +0000 UTC m=+2026.725068232" Oct 12 06:15:00 crc kubenswrapper[4930]: I1012 06:15:00.149041 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337495-8dkz5"] Oct 12 06:15:00 crc kubenswrapper[4930]: I1012 06:15:00.155582 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337495-8dkz5" Oct 12 06:15:00 crc kubenswrapper[4930]: I1012 06:15:00.157016 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337495-8dkz5"] Oct 12 06:15:00 crc kubenswrapper[4930]: I1012 06:15:00.159184 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 12 06:15:00 crc kubenswrapper[4930]: I1012 06:15:00.159429 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 12 06:15:00 crc kubenswrapper[4930]: I1012 06:15:00.348903 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14f6608c-f601-4659-aef3-4764a8727ae8-config-volume\") pod \"collect-profiles-29337495-8dkz5\" (UID: \"14f6608c-f601-4659-aef3-4764a8727ae8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337495-8dkz5" Oct 12 06:15:00 crc kubenswrapper[4930]: I1012 06:15:00.348982 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9nrb\" (UniqueName: \"kubernetes.io/projected/14f6608c-f601-4659-aef3-4764a8727ae8-kube-api-access-q9nrb\") pod \"collect-profiles-29337495-8dkz5\" (UID: \"14f6608c-f601-4659-aef3-4764a8727ae8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337495-8dkz5" Oct 12 06:15:00 crc kubenswrapper[4930]: I1012 06:15:00.349070 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14f6608c-f601-4659-aef3-4764a8727ae8-secret-volume\") pod \"collect-profiles-29337495-8dkz5\" (UID: \"14f6608c-f601-4659-aef3-4764a8727ae8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337495-8dkz5" Oct 12 06:15:00 crc kubenswrapper[4930]: I1012 06:15:00.450734 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14f6608c-f601-4659-aef3-4764a8727ae8-config-volume\") pod \"collect-profiles-29337495-8dkz5\" (UID: \"14f6608c-f601-4659-aef3-4764a8727ae8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337495-8dkz5" Oct 12 06:15:00 crc kubenswrapper[4930]: I1012 06:15:00.450812 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9nrb\" (UniqueName: \"kubernetes.io/projected/14f6608c-f601-4659-aef3-4764a8727ae8-kube-api-access-q9nrb\") pod \"collect-profiles-29337495-8dkz5\" (UID: \"14f6608c-f601-4659-aef3-4764a8727ae8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337495-8dkz5" Oct 12 06:15:00 crc kubenswrapper[4930]: I1012 06:15:00.450890 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14f6608c-f601-4659-aef3-4764a8727ae8-secret-volume\") pod \"collect-profiles-29337495-8dkz5\" (UID: \"14f6608c-f601-4659-aef3-4764a8727ae8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337495-8dkz5" Oct 12 06:15:00 crc kubenswrapper[4930]: I1012 06:15:00.451997 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14f6608c-f601-4659-aef3-4764a8727ae8-config-volume\") pod 
\"collect-profiles-29337495-8dkz5\" (UID: \"14f6608c-f601-4659-aef3-4764a8727ae8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337495-8dkz5" Oct 12 06:15:00 crc kubenswrapper[4930]: I1012 06:15:00.456306 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14f6608c-f601-4659-aef3-4764a8727ae8-secret-volume\") pod \"collect-profiles-29337495-8dkz5\" (UID: \"14f6608c-f601-4659-aef3-4764a8727ae8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337495-8dkz5" Oct 12 06:15:00 crc kubenswrapper[4930]: I1012 06:15:00.483386 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9nrb\" (UniqueName: \"kubernetes.io/projected/14f6608c-f601-4659-aef3-4764a8727ae8-kube-api-access-q9nrb\") pod \"collect-profiles-29337495-8dkz5\" (UID: \"14f6608c-f601-4659-aef3-4764a8727ae8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337495-8dkz5" Oct 12 06:15:00 crc kubenswrapper[4930]: I1012 06:15:00.777397 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337495-8dkz5" Oct 12 06:15:01 crc kubenswrapper[4930]: I1012 06:15:01.308259 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337495-8dkz5"] Oct 12 06:15:01 crc kubenswrapper[4930]: W1012 06:15:01.315175 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14f6608c_f601_4659_aef3_4764a8727ae8.slice/crio-2bae46291b1fcd8ac4813394001e437528ce40228efcd7d7cda6a5d0d6eae0c0 WatchSource:0}: Error finding container 2bae46291b1fcd8ac4813394001e437528ce40228efcd7d7cda6a5d0d6eae0c0: Status 404 returned error can't find the container with id 2bae46291b1fcd8ac4813394001e437528ce40228efcd7d7cda6a5d0d6eae0c0 Oct 12 06:15:02 crc kubenswrapper[4930]: I1012 06:15:02.222307 4930 generic.go:334] "Generic (PLEG): container finished" podID="14f6608c-f601-4659-aef3-4764a8727ae8" containerID="7a9a7f19b821a58a80c8eca69cb140dbf1a2febb15cb6baedec0eb613e93c210" exitCode=0 Oct 12 06:15:02 crc kubenswrapper[4930]: I1012 06:15:02.222368 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337495-8dkz5" event={"ID":"14f6608c-f601-4659-aef3-4764a8727ae8","Type":"ContainerDied","Data":"7a9a7f19b821a58a80c8eca69cb140dbf1a2febb15cb6baedec0eb613e93c210"} Oct 12 06:15:02 crc kubenswrapper[4930]: I1012 06:15:02.222432 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337495-8dkz5" event={"ID":"14f6608c-f601-4659-aef3-4764a8727ae8","Type":"ContainerStarted","Data":"2bae46291b1fcd8ac4813394001e437528ce40228efcd7d7cda6a5d0d6eae0c0"} Oct 12 06:15:03 crc kubenswrapper[4930]: I1012 06:15:03.601313 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337495-8dkz5" Oct 12 06:15:03 crc kubenswrapper[4930]: I1012 06:15:03.669593 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:15:03 crc kubenswrapper[4930]: I1012 06:15:03.669648 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:15:03 crc kubenswrapper[4930]: I1012 06:15:03.726253 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14f6608c-f601-4659-aef3-4764a8727ae8-secret-volume\") pod \"14f6608c-f601-4659-aef3-4764a8727ae8\" (UID: \"14f6608c-f601-4659-aef3-4764a8727ae8\") " Oct 12 06:15:03 crc kubenswrapper[4930]: I1012 06:15:03.726558 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9nrb\" (UniqueName: \"kubernetes.io/projected/14f6608c-f601-4659-aef3-4764a8727ae8-kube-api-access-q9nrb\") pod \"14f6608c-f601-4659-aef3-4764a8727ae8\" (UID: \"14f6608c-f601-4659-aef3-4764a8727ae8\") " Oct 12 06:15:03 crc kubenswrapper[4930]: I1012 06:15:03.726707 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14f6608c-f601-4659-aef3-4764a8727ae8-config-volume\") pod \"14f6608c-f601-4659-aef3-4764a8727ae8\" (UID: \"14f6608c-f601-4659-aef3-4764a8727ae8\") " Oct 12 06:15:03 crc kubenswrapper[4930]: I1012 06:15:03.727901 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14f6608c-f601-4659-aef3-4764a8727ae8-config-volume" (OuterVolumeSpecName: "config-volume") pod "14f6608c-f601-4659-aef3-4764a8727ae8" (UID: "14f6608c-f601-4659-aef3-4764a8727ae8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:15:03 crc kubenswrapper[4930]: I1012 06:15:03.728686 4930 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/14f6608c-f601-4659-aef3-4764a8727ae8-config-volume\") on node \"crc\" DevicePath \"\"" Oct 12 06:15:03 crc kubenswrapper[4930]: I1012 06:15:03.731557 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14f6608c-f601-4659-aef3-4764a8727ae8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "14f6608c-f601-4659-aef3-4764a8727ae8" (UID: "14f6608c-f601-4659-aef3-4764a8727ae8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:15:03 crc kubenswrapper[4930]: I1012 06:15:03.733343 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14f6608c-f601-4659-aef3-4764a8727ae8-kube-api-access-q9nrb" (OuterVolumeSpecName: "kube-api-access-q9nrb") pod "14f6608c-f601-4659-aef3-4764a8727ae8" (UID: "14f6608c-f601-4659-aef3-4764a8727ae8"). InnerVolumeSpecName "kube-api-access-q9nrb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:15:03 crc kubenswrapper[4930]: I1012 06:15:03.831979 4930 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/14f6608c-f601-4659-aef3-4764a8727ae8-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 12 06:15:03 crc kubenswrapper[4930]: I1012 06:15:03.832039 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9nrb\" (UniqueName: \"kubernetes.io/projected/14f6608c-f601-4659-aef3-4764a8727ae8-kube-api-access-q9nrb\") on node \"crc\" DevicePath \"\"" Oct 12 06:15:04 crc kubenswrapper[4930]: I1012 06:15:04.259977 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337495-8dkz5" event={"ID":"14f6608c-f601-4659-aef3-4764a8727ae8","Type":"ContainerDied","Data":"2bae46291b1fcd8ac4813394001e437528ce40228efcd7d7cda6a5d0d6eae0c0"} Oct 12 06:15:04 crc kubenswrapper[4930]: I1012 06:15:04.260388 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bae46291b1fcd8ac4813394001e437528ce40228efcd7d7cda6a5d0d6eae0c0" Oct 12 06:15:04 crc kubenswrapper[4930]: I1012 06:15:04.260524 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337495-8dkz5" Oct 12 06:15:04 crc kubenswrapper[4930]: I1012 06:15:04.686496 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337450-9g4rr"] Oct 12 06:15:04 crc kubenswrapper[4930]: I1012 06:15:04.693449 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337450-9g4rr"] Oct 12 06:15:06 crc kubenswrapper[4930]: I1012 06:15:06.153290 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74be97dd-4d16-40ed-87e4-b707eccf422e" path="/var/lib/kubelet/pods/74be97dd-4d16-40ed-87e4-b707eccf422e/volumes" Oct 12 06:15:26 crc kubenswrapper[4930]: I1012 06:15:26.936939 4930 scope.go:117] "RemoveContainer" containerID="03c67c5984d37c63a32d182586fc0c79ea9235d37261509a22f89f4ee0167502" Oct 12 06:15:33 crc kubenswrapper[4930]: I1012 06:15:33.670148 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:15:33 crc kubenswrapper[4930]: I1012 06:15:33.671103 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:15:37 crc kubenswrapper[4930]: I1012 06:15:37.625309 4930 generic.go:334] "Generic (PLEG): container finished" podID="91938dad-6cac-4246-94d7-d93214ae2a5d" containerID="759a13dac69d92e810bc46e91094475766a8229def36b9b19e19cac330c42ec9" exitCode=0 Oct 12 06:15:37 crc kubenswrapper[4930]: I1012 06:15:37.625358 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" event={"ID":"91938dad-6cac-4246-94d7-d93214ae2a5d","Type":"ContainerDied","Data":"759a13dac69d92e810bc46e91094475766a8229def36b9b19e19cac330c42ec9"} Oct 12 
06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.187825 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.296988 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-libvirt-combined-ca-bundle\") pod \"91938dad-6cac-4246-94d7-d93214ae2a5d\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.297429 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-bootstrap-combined-ca-bundle\") pod \"91938dad-6cac-4246-94d7-d93214ae2a5d\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.297630 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-nova-combined-ca-bundle\") pod \"91938dad-6cac-4246-94d7-d93214ae2a5d\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.297828 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-telemetry-combined-ca-bundle\") pod \"91938dad-6cac-4246-94d7-d93214ae2a5d\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.298030 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-ssh-key\") pod \"91938dad-6cac-4246-94d7-d93214ae2a5d\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.298220 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-ovn-combined-ca-bundle\") pod \"91938dad-6cac-4246-94d7-d93214ae2a5d\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.298391 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzkg2\" (UniqueName: \"kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-kube-api-access-lzkg2\") pod \"91938dad-6cac-4246-94d7-d93214ae2a5d\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.298553 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-neutron-metadata-combined-ca-bundle\") pod \"91938dad-6cac-4246-94d7-d93214ae2a5d\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.298771 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-repo-setup-combined-ca-bundle\") pod \"91938dad-6cac-4246-94d7-d93214ae2a5d\" (UID: 
\"91938dad-6cac-4246-94d7-d93214ae2a5d\") " Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.298987 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"91938dad-6cac-4246-94d7-d93214ae2a5d\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.299160 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-inventory\") pod \"91938dad-6cac-4246-94d7-d93214ae2a5d\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.299324 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"91938dad-6cac-4246-94d7-d93214ae2a5d\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.299503 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"91938dad-6cac-4246-94d7-d93214ae2a5d\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.299783 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"91938dad-6cac-4246-94d7-d93214ae2a5d\" (UID: \"91938dad-6cac-4246-94d7-d93214ae2a5d\") " Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.307696 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "91938dad-6cac-4246-94d7-d93214ae2a5d" (UID: "91938dad-6cac-4246-94d7-d93214ae2a5d"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.308605 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "91938dad-6cac-4246-94d7-d93214ae2a5d" (UID: "91938dad-6cac-4246-94d7-d93214ae2a5d"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.308827 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "91938dad-6cac-4246-94d7-d93214ae2a5d" (UID: "91938dad-6cac-4246-94d7-d93214ae2a5d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.308936 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "91938dad-6cac-4246-94d7-d93214ae2a5d" (UID: "91938dad-6cac-4246-94d7-d93214ae2a5d"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.309409 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "91938dad-6cac-4246-94d7-d93214ae2a5d" (UID: "91938dad-6cac-4246-94d7-d93214ae2a5d"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.309550 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "91938dad-6cac-4246-94d7-d93214ae2a5d" (UID: "91938dad-6cac-4246-94d7-d93214ae2a5d"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.309952 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "91938dad-6cac-4246-94d7-d93214ae2a5d" (UID: "91938dad-6cac-4246-94d7-d93214ae2a5d"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.310690 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "91938dad-6cac-4246-94d7-d93214ae2a5d" (UID: "91938dad-6cac-4246-94d7-d93214ae2a5d"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.310807 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-kube-api-access-lzkg2" (OuterVolumeSpecName: "kube-api-access-lzkg2") pod "91938dad-6cac-4246-94d7-d93214ae2a5d" (UID: "91938dad-6cac-4246-94d7-d93214ae2a5d"). InnerVolumeSpecName "kube-api-access-lzkg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.311557 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "91938dad-6cac-4246-94d7-d93214ae2a5d" (UID: "91938dad-6cac-4246-94d7-d93214ae2a5d"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.315647 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "91938dad-6cac-4246-94d7-d93214ae2a5d" (UID: "91938dad-6cac-4246-94d7-d93214ae2a5d"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.323997 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "91938dad-6cac-4246-94d7-d93214ae2a5d" (UID: "91938dad-6cac-4246-94d7-d93214ae2a5d"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.341851 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "91938dad-6cac-4246-94d7-d93214ae2a5d" (UID: "91938dad-6cac-4246-94d7-d93214ae2a5d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.362059 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-inventory" (OuterVolumeSpecName: "inventory") pod "91938dad-6cac-4246-94d7-d93214ae2a5d" (UID: "91938dad-6cac-4246-94d7-d93214ae2a5d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.401752 4930 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.401783 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzkg2\" (UniqueName: \"kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-kube-api-access-lzkg2\") on node \"crc\" DevicePath \"\"" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.401797 4930 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.401810 4930 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.401824 4930 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.401837 4930 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.401849 4930 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.401861 4930 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.401875 4930 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/91938dad-6cac-4246-94d7-d93214ae2a5d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.401887 4930 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.401898 4930 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.401910 4930 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.401920 4930 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.401930 4930 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/91938dad-6cac-4246-94d7-d93214ae2a5d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.651932 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" event={"ID":"91938dad-6cac-4246-94d7-d93214ae2a5d","Type":"ContainerDied","Data":"7caa1b115e41c072ff64595dfbb10bf46f96c6c27685941e076985b522cdf572"} Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.651982 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7caa1b115e41c072ff64595dfbb10bf46f96c6c27685941e076985b522cdf572" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.652048 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.798388 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl"] Oct 12 06:15:39 crc kubenswrapper[4930]: E1012 06:15:39.799088 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91938dad-6cac-4246-94d7-d93214ae2a5d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.799181 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="91938dad-6cac-4246-94d7-d93214ae2a5d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 12 06:15:39 crc kubenswrapper[4930]: E1012 06:15:39.799336 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14f6608c-f601-4659-aef3-4764a8727ae8" containerName="collect-profiles" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.799424 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="14f6608c-f601-4659-aef3-4764a8727ae8" containerName="collect-profiles" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.799773 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="14f6608c-f601-4659-aef3-4764a8727ae8" containerName="collect-profiles" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.799884 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="91938dad-6cac-4246-94d7-d93214ae2a5d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.803394 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.808403 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vhlxg" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.808920 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.811496 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.812049 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.813257 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.825122 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl"] Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.912456 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntvm4\" (UniqueName: \"kubernetes.io/projected/1d14f488-c958-4022-b1db-a1161afad246-kube-api-access-ntvm4\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h52cl\" (UID: \"1d14f488-c958-4022-b1db-a1161afad246\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.912875 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d14f488-c958-4022-b1db-a1161afad246-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h52cl\" (UID: \"1d14f488-c958-4022-b1db-a1161afad246\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.913152 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d14f488-c958-4022-b1db-a1161afad246-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h52cl\" (UID: \"1d14f488-c958-4022-b1db-a1161afad246\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.913410 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1d14f488-c958-4022-b1db-a1161afad246-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h52cl\" (UID: \"1d14f488-c958-4022-b1db-a1161afad246\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl" Oct 12 06:15:39 crc kubenswrapper[4930]: I1012 06:15:39.913778 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d14f488-c958-4022-b1db-a1161afad246-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h52cl\" (UID: \"1d14f488-c958-4022-b1db-a1161afad246\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl" Oct 12 06:15:40 crc kubenswrapper[4930]: I1012 06:15:40.017002 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntvm4\" 
(UniqueName: \"kubernetes.io/projected/1d14f488-c958-4022-b1db-a1161afad246-kube-api-access-ntvm4\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h52cl\" (UID: \"1d14f488-c958-4022-b1db-a1161afad246\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl" Oct 12 06:15:40 crc kubenswrapper[4930]: I1012 06:15:40.017071 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d14f488-c958-4022-b1db-a1161afad246-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h52cl\" (UID: \"1d14f488-c958-4022-b1db-a1161afad246\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl" Oct 12 06:15:40 crc kubenswrapper[4930]: I1012 06:15:40.017979 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d14f488-c958-4022-b1db-a1161afad246-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h52cl\" (UID: \"1d14f488-c958-4022-b1db-a1161afad246\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl" Oct 12 06:15:40 crc kubenswrapper[4930]: I1012 06:15:40.018140 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1d14f488-c958-4022-b1db-a1161afad246-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h52cl\" (UID: \"1d14f488-c958-4022-b1db-a1161afad246\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl" Oct 12 06:15:40 crc kubenswrapper[4930]: I1012 06:15:40.018322 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d14f488-c958-4022-b1db-a1161afad246-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h52cl\" (UID: \"1d14f488-c958-4022-b1db-a1161afad246\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl" Oct 12 06:15:40 crc kubenswrapper[4930]: I1012 06:15:40.020502 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1d14f488-c958-4022-b1db-a1161afad246-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h52cl\" (UID: \"1d14f488-c958-4022-b1db-a1161afad246\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl" Oct 12 06:15:40 crc kubenswrapper[4930]: I1012 06:15:40.024071 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d14f488-c958-4022-b1db-a1161afad246-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h52cl\" (UID: \"1d14f488-c958-4022-b1db-a1161afad246\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl" Oct 12 06:15:40 crc kubenswrapper[4930]: I1012 06:15:40.025421 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d14f488-c958-4022-b1db-a1161afad246-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h52cl\" (UID: \"1d14f488-c958-4022-b1db-a1161afad246\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl" Oct 12 06:15:40 crc kubenswrapper[4930]: I1012 06:15:40.026681 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d14f488-c958-4022-b1db-a1161afad246-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h52cl\" (UID: \"1d14f488-c958-4022-b1db-a1161afad246\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl" Oct 12 06:15:40 crc kubenswrapper[4930]: I1012 06:15:40.045996 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntvm4\" (UniqueName: \"kubernetes.io/projected/1d14f488-c958-4022-b1db-a1161afad246-kube-api-access-ntvm4\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h52cl\" (UID: \"1d14f488-c958-4022-b1db-a1161afad246\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl" Oct 12 06:15:40 crc kubenswrapper[4930]: I1012 06:15:40.126510 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl" Oct 12 06:15:40 crc kubenswrapper[4930]: I1012 06:15:40.733424 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl"] Oct 12 06:15:41 crc kubenswrapper[4930]: I1012 06:15:41.711032 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl" event={"ID":"1d14f488-c958-4022-b1db-a1161afad246","Type":"ContainerStarted","Data":"9498ace4b020124237a3ef9fb8bdc3505b98f8cf395c81861da2d2d4bce9a6bf"} Oct 12 06:15:41 crc kubenswrapper[4930]: I1012 06:15:41.711714 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl" event={"ID":"1d14f488-c958-4022-b1db-a1161afad246","Type":"ContainerStarted","Data":"6ea9ddfe7c81408e69daeca8a478152ef8c05fb80450f1bbf88e6f3af5cf4049"} Oct 12 06:15:41 crc kubenswrapper[4930]: I1012 06:15:41.747104 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl" podStartSLOduration=2.252688459 podStartE2EDuration="2.747085728s" podCreationTimestamp="2025-10-12 06:15:39 +0000 UTC" firstStartedPulling="2025-10-12 06:15:40.735176537 +0000 UTC m=+2073.277278332" lastFinishedPulling="2025-10-12 06:15:41.229573836 +0000 UTC m=+2073.771675601" observedRunningTime="2025-10-12 06:15:41.740951786 +0000 UTC m=+2074.283053551" watchObservedRunningTime="2025-10-12 06:15:41.747085728 +0000 UTC m=+2074.289187483" Oct 12 06:15:58 crc kubenswrapper[4930]: I1012 06:15:58.346509 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t4g8g"] Oct 12 06:15:58 crc kubenswrapper[4930]: I1012 06:15:58.349567 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t4g8g"
Oct 12 06:15:58 crc kubenswrapper[4930]: I1012 06:15:58.358022 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t4g8g"]
Oct 12 06:15:58 crc kubenswrapper[4930]: I1012 06:15:58.479602 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd7fdf88-1168-498e-a63a-f4b942911d18-utilities\") pod \"redhat-operators-t4g8g\" (UID: \"dd7fdf88-1168-498e-a63a-f4b942911d18\") " pod="openshift-marketplace/redhat-operators-t4g8g"
Oct 12 06:15:58 crc kubenswrapper[4930]: I1012 06:15:58.480608 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtm6l\" (UniqueName: \"kubernetes.io/projected/dd7fdf88-1168-498e-a63a-f4b942911d18-kube-api-access-dtm6l\") pod \"redhat-operators-t4g8g\" (UID: \"dd7fdf88-1168-498e-a63a-f4b942911d18\") " pod="openshift-marketplace/redhat-operators-t4g8g"
Oct 12 06:15:58 crc kubenswrapper[4930]: I1012 06:15:58.481117 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd7fdf88-1168-498e-a63a-f4b942911d18-catalog-content\") pod \"redhat-operators-t4g8g\" (UID: \"dd7fdf88-1168-498e-a63a-f4b942911d18\") " pod="openshift-marketplace/redhat-operators-t4g8g"
Oct 12 06:15:58 crc kubenswrapper[4930]: I1012 06:15:58.583140 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd7fdf88-1168-498e-a63a-f4b942911d18-catalog-content\") pod \"redhat-operators-t4g8g\" (UID: \"dd7fdf88-1168-498e-a63a-f4b942911d18\") " pod="openshift-marketplace/redhat-operators-t4g8g"
Oct 12 06:15:58 crc kubenswrapper[4930]: I1012 06:15:58.583255 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd7fdf88-1168-498e-a63a-f4b942911d18-utilities\") pod \"redhat-operators-t4g8g\" (UID: \"dd7fdf88-1168-498e-a63a-f4b942911d18\") " pod="openshift-marketplace/redhat-operators-t4g8g"
Oct 12 06:15:58 crc kubenswrapper[4930]: I1012 06:15:58.583377 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtm6l\" (UniqueName: \"kubernetes.io/projected/dd7fdf88-1168-498e-a63a-f4b942911d18-kube-api-access-dtm6l\") pod \"redhat-operators-t4g8g\" (UID: \"dd7fdf88-1168-498e-a63a-f4b942911d18\") " pod="openshift-marketplace/redhat-operators-t4g8g"
Oct 12 06:15:58 crc kubenswrapper[4930]: I1012 06:15:58.583791 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd7fdf88-1168-498e-a63a-f4b942911d18-catalog-content\") pod \"redhat-operators-t4g8g\" (UID: \"dd7fdf88-1168-498e-a63a-f4b942911d18\") " pod="openshift-marketplace/redhat-operators-t4g8g"
Oct 12 06:15:58 crc kubenswrapper[4930]: I1012 06:15:58.584332 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd7fdf88-1168-498e-a63a-f4b942911d18-utilities\") pod \"redhat-operators-t4g8g\" (UID: \"dd7fdf88-1168-498e-a63a-f4b942911d18\") " pod="openshift-marketplace/redhat-operators-t4g8g"
Oct 12 06:15:58 crc kubenswrapper[4930]: I1012 06:15:58.629115 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtm6l\" (UniqueName: \"kubernetes.io/projected/dd7fdf88-1168-498e-a63a-f4b942911d18-kube-api-access-dtm6l\") pod \"redhat-operators-t4g8g\" (UID: \"dd7fdf88-1168-498e-a63a-f4b942911d18\") " pod="openshift-marketplace/redhat-operators-t4g8g"
Oct 12 06:15:58 crc kubenswrapper[4930]: I1012 06:15:58.688991 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t4g8g"
Oct 12 06:15:59 crc kubenswrapper[4930]: I1012 06:15:59.157307 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t4g8g"]
Oct 12 06:15:59 crc kubenswrapper[4930]: W1012 06:15:59.170270 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd7fdf88_1168_498e_a63a_f4b942911d18.slice/crio-1375ae50cef48fa58ca6155853b7fedcb8781631e8cdbc2a56c7331dbacd8f37 WatchSource:0}: Error finding container 1375ae50cef48fa58ca6155853b7fedcb8781631e8cdbc2a56c7331dbacd8f37: Status 404 returned error can't find the container with id 1375ae50cef48fa58ca6155853b7fedcb8781631e8cdbc2a56c7331dbacd8f37
Oct 12 06:15:59 crc kubenswrapper[4930]: I1012 06:15:59.959293 4930 generic.go:334] "Generic (PLEG): container finished" podID="dd7fdf88-1168-498e-a63a-f4b942911d18" containerID="85fb2707e1ad437e89a244d480764f20d17a27de69a11f27a49be8dfd73e569e" exitCode=0
Oct 12 06:15:59 crc kubenswrapper[4930]: I1012 06:15:59.959485 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4g8g" event={"ID":"dd7fdf88-1168-498e-a63a-f4b942911d18","Type":"ContainerDied","Data":"85fb2707e1ad437e89a244d480764f20d17a27de69a11f27a49be8dfd73e569e"}
Oct 12 06:15:59 crc kubenswrapper[4930]: I1012 06:15:59.959562 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4g8g" event={"ID":"dd7fdf88-1168-498e-a63a-f4b942911d18","Type":"ContainerStarted","Data":"1375ae50cef48fa58ca6155853b7fedcb8781631e8cdbc2a56c7331dbacd8f37"}
Oct 12 06:16:00 crc kubenswrapper[4930]: I1012 06:16:00.974211 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4g8g" event={"ID":"dd7fdf88-1168-498e-a63a-f4b942911d18","Type":"ContainerStarted","Data":"5d610f5bbb219dcc027a2d42ff6ff45bfe8509c041271c7cdb5b68829b07a42c"}
Oct 12 06:16:03 crc kubenswrapper[4930]: I1012 06:16:03.669948 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 12 06:16:03 crc kubenswrapper[4930]: I1012 06:16:03.671681 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 12 06:16:03 crc kubenswrapper[4930]: I1012 06:16:03.671953 4930 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf"
Oct 12 06:16:03 crc kubenswrapper[4930]: I1012 06:16:03.673087 4930 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1bd8424d852b3aed65544bab3c60df461f0881b0c3e1f0ea5ec7fed246751ec8"} pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 12 06:16:03 crc kubenswrapper[4930]: I1012 06:16:03.673340 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" containerID="cri-o://1bd8424d852b3aed65544bab3c60df461f0881b0c3e1f0ea5ec7fed246751ec8" gracePeriod=600
Oct 12 06:16:04 crc kubenswrapper[4930]: I1012 06:16:04.010241 4930 generic.go:334] "Generic (PLEG): container finished" podID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerID="1bd8424d852b3aed65544bab3c60df461f0881b0c3e1f0ea5ec7fed246751ec8" exitCode=0
Oct 12 06:16:04 crc kubenswrapper[4930]: I1012 06:16:04.010311 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerDied","Data":"1bd8424d852b3aed65544bab3c60df461f0881b0c3e1f0ea5ec7fed246751ec8"}
Oct 12 06:16:04 crc kubenswrapper[4930]: I1012 06:16:04.010948 4930 scope.go:117] "RemoveContainer" containerID="32f5a30ef6dd4a307d4229972f1b9582e9cf02f46ecd5b40412f5857437dcf0a"
Oct 12 06:16:04 crc kubenswrapper[4930]: I1012 06:16:04.014659 4930 generic.go:334] "Generic (PLEG): container finished" podID="dd7fdf88-1168-498e-a63a-f4b942911d18" containerID="5d610f5bbb219dcc027a2d42ff6ff45bfe8509c041271c7cdb5b68829b07a42c" exitCode=0
Oct 12 06:16:04 crc kubenswrapper[4930]: I1012 06:16:04.014699 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4g8g" event={"ID":"dd7fdf88-1168-498e-a63a-f4b942911d18","Type":"ContainerDied","Data":"5d610f5bbb219dcc027a2d42ff6ff45bfe8509c041271c7cdb5b68829b07a42c"}
Oct 12 06:16:04 crc kubenswrapper[4930]: I1012 06:16:04.025957 4930 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 12 06:16:05 crc kubenswrapper[4930]: I1012 06:16:05.027496 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerStarted","Data":"c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9"}
Oct 12 06:16:05 crc kubenswrapper[4930]: I1012 06:16:05.031123 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4g8g" event={"ID":"dd7fdf88-1168-498e-a63a-f4b942911d18","Type":"ContainerStarted","Data":"794dc283677d941c61e5816d927369378132e687335c59d6a0420ca291edd7de"}
Oct 12 06:16:05 crc kubenswrapper[4930]: I1012 06:16:05.082790 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t4g8g" podStartSLOduration=2.495092617 podStartE2EDuration="7.08275903s" podCreationTimestamp="2025-10-12 06:15:58 +0000 UTC" firstStartedPulling="2025-10-12 06:15:59.965946177 +0000 UTC m=+2092.508047952" lastFinishedPulling="2025-10-12 06:16:04.5536126 +0000 UTC m=+2097.095714365" observedRunningTime="2025-10-12 06:16:05.065904452 +0000 UTC m=+2097.608006237" watchObservedRunningTime="2025-10-12 06:16:05.08275903 +0000 UTC m=+2097.624860835"
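
The sequence above — liveness failure at 06:16:03, a kill with gracePeriod=600, ContainerDied at 06:16:04, ContainerStarted at 06:16:05 — is the kubelet restarting machine-config-daemon after its health endpoint refused connections. As a hedged illustration (the URL comes from the log; the 1s timeout and the 2xx/3xx success rule are assumed probe defaults, not shown here), the check behind those prober.go entries is essentially:

    // Minimal sketch of the HTTP liveness check the log shows failing.
    // The endpoint is taken from the log output; the timeout is assumed.
    package main

    import (
    	"fmt"
    	"net/http"
    	"time"
    )

    func main() {
    	client := &http.Client{Timeout: 1 * time.Second}
    	resp, err := client.Get("http://127.0.0.1:8798/health")
    	if err != nil {
    		// "connect: connection refused" in the log surfaces here.
    		fmt.Println("probe failure:", err)
    		return
    	}
    	defer resp.Body.Close()
    	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
    		fmt.Println("probe success:", resp.Status)
    	} else {
    		fmt.Println("probe failure: status", resp.Status)
    	}
    }
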
Oct 12 06:16:06 crc kubenswrapper[4930]: I1012 06:16:06.009863 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gqftp"]
Oct 12 06:16:06 crc kubenswrapper[4930]: I1012 06:16:06.016107 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gqftp"
Oct 12 06:16:06 crc kubenswrapper[4930]: I1012 06:16:06.028848 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gqftp"]
Oct 12 06:16:06 crc kubenswrapper[4930]: I1012 06:16:06.159635 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f749aa11-fbfe-4089-b331-268e4a081b70-catalog-content\") pod \"certified-operators-gqftp\" (UID: \"f749aa11-fbfe-4089-b331-268e4a081b70\") " pod="openshift-marketplace/certified-operators-gqftp"
Oct 12 06:16:06 crc kubenswrapper[4930]: I1012 06:16:06.159726 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c8hg\" (UniqueName: \"kubernetes.io/projected/f749aa11-fbfe-4089-b331-268e4a081b70-kube-api-access-8c8hg\") pod \"certified-operators-gqftp\" (UID: \"f749aa11-fbfe-4089-b331-268e4a081b70\") " pod="openshift-marketplace/certified-operators-gqftp"
Oct 12 06:16:06 crc kubenswrapper[4930]: I1012 06:16:06.159778 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f749aa11-fbfe-4089-b331-268e4a081b70-utilities\") pod \"certified-operators-gqftp\" (UID: \"f749aa11-fbfe-4089-b331-268e4a081b70\") " pod="openshift-marketplace/certified-operators-gqftp"
Oct 12 06:16:06 crc kubenswrapper[4930]: I1012 06:16:06.261901 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f749aa11-fbfe-4089-b331-268e4a081b70-catalog-content\") pod \"certified-operators-gqftp\" (UID: \"f749aa11-fbfe-4089-b331-268e4a081b70\") " pod="openshift-marketplace/certified-operators-gqftp"
Oct 12 06:16:06 crc kubenswrapper[4930]: I1012 06:16:06.261985 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c8hg\" (UniqueName: \"kubernetes.io/projected/f749aa11-fbfe-4089-b331-268e4a081b70-kube-api-access-8c8hg\") pod \"certified-operators-gqftp\" (UID: \"f749aa11-fbfe-4089-b331-268e4a081b70\") " pod="openshift-marketplace/certified-operators-gqftp"
Oct 12 06:16:06 crc kubenswrapper[4930]: I1012 06:16:06.262021 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f749aa11-fbfe-4089-b331-268e4a081b70-utilities\") pod \"certified-operators-gqftp\" (UID: \"f749aa11-fbfe-4089-b331-268e4a081b70\") " pod="openshift-marketplace/certified-operators-gqftp"
Oct 12 06:16:06 crc kubenswrapper[4930]: I1012 06:16:06.262435 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f749aa11-fbfe-4089-b331-268e4a081b70-catalog-content\") pod \"certified-operators-gqftp\" (UID: \"f749aa11-fbfe-4089-b331-268e4a081b70\") " pod="openshift-marketplace/certified-operators-gqftp"
Oct 12 06:16:06 crc kubenswrapper[4930]: I1012 06:16:06.262483 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f749aa11-fbfe-4089-b331-268e4a081b70-utilities\") pod \"certified-operators-gqftp\" (UID: \"f749aa11-fbfe-4089-b331-268e4a081b70\") " pod="openshift-marketplace/certified-operators-gqftp"
Oct 12 06:16:06 crc kubenswrapper[4930]: I1012 06:16:06.288309 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c8hg\" (UniqueName: \"kubernetes.io/projected/f749aa11-fbfe-4089-b331-268e4a081b70-kube-api-access-8c8hg\") pod \"certified-operators-gqftp\" (UID: \"f749aa11-fbfe-4089-b331-268e4a081b70\") " pod="openshift-marketplace/certified-operators-gqftp"
Oct 12 06:16:06 crc kubenswrapper[4930]: I1012 06:16:06.391950 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gqftp"
Oct 12 06:16:06 crc kubenswrapper[4930]: I1012 06:16:06.920012 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gqftp"]
Oct 12 06:16:07 crc kubenswrapper[4930]: I1012 06:16:07.058412 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqftp" event={"ID":"f749aa11-fbfe-4089-b331-268e4a081b70","Type":"ContainerStarted","Data":"34a83eb739d96df3cbc6fe121fc4bbdf90e36660308f51eb80b54da86847fa9a"}
Oct 12 06:16:08 crc kubenswrapper[4930]: I1012 06:16:08.072915 4930 generic.go:334] "Generic (PLEG): container finished" podID="f749aa11-fbfe-4089-b331-268e4a081b70" containerID="4d87db0ee0f50acb1ffe3c5edbb5fbf598108855d93d690e1b44dfb5ee1bcbaa" exitCode=0
Oct 12 06:16:08 crc kubenswrapper[4930]: I1012 06:16:08.073032 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqftp" event={"ID":"f749aa11-fbfe-4089-b331-268e4a081b70","Type":"ContainerDied","Data":"4d87db0ee0f50acb1ffe3c5edbb5fbf598108855d93d690e1b44dfb5ee1bcbaa"}
Oct 12 06:16:08 crc kubenswrapper[4930]: I1012 06:16:08.690304 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t4g8g"
Oct 12 06:16:08 crc kubenswrapper[4930]: I1012 06:16:08.690723 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t4g8g"
Oct 12 06:16:09 crc kubenswrapper[4930]: I1012 06:16:09.087868 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqftp" event={"ID":"f749aa11-fbfe-4089-b331-268e4a081b70","Type":"ContainerStarted","Data":"bbe0d59f79b9550cf820ad6fd2c77d792e7fa5dc8ae353733bd79ad56bb2af15"}
Oct 12 06:16:09 crc kubenswrapper[4930]: I1012 06:16:09.783630 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t4g8g" podUID="dd7fdf88-1168-498e-a63a-f4b942911d18" containerName="registry-server" probeResult="failure" output=<
Oct 12 06:16:09 crc kubenswrapper[4930]: timeout: failed to connect service ":50051" within 1s
Oct 12 06:16:09 crc kubenswrapper[4930]: >
Oct 12 06:16:11 crc kubenswrapper[4930]: I1012 06:16:11.114872 4930 generic.go:334] "Generic (PLEG): container finished" podID="f749aa11-fbfe-4089-b331-268e4a081b70" containerID="bbe0d59f79b9550cf820ad6fd2c77d792e7fa5dc8ae353733bd79ad56bb2af15" exitCode=0
Oct 12 06:16:11 crc kubenswrapper[4930]: I1012 06:16:11.114983 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqftp" event={"ID":"f749aa11-fbfe-4089-b331-268e4a081b70","Type":"ContainerDied","Data":"bbe0d59f79b9550cf820ad6fd2c77d792e7fa5dc8ae353733bd79ad56bb2af15"}
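
The startup probe failures for both registry pods have the same shape: the registry-server container is up, but nothing is listening on :50051 yet while the catalog loads. The output format ("timeout: failed to connect service \":50051\" within 1s") matches what a grpc_health_probe-style exec probe prints; assuming that is what these probes run (the pod spec is not part of this log), the failing step is a connect-with-deadline, sketched here with the gRPC layer omitted:

    // Sketch of the connect-with-deadline check behind the startup probe
    // failures in this log. Only the TCP connect step is reproduced; the
    // real probe presumably also issues a gRPC health-check RPC.
    package main

    import (
    	"fmt"
    	"net"
    	"os"
    	"time"
    )

    func main() {
    	conn, err := net.DialTimeout("tcp", "127.0.0.1:50051", 1*time.Second)
    	if err != nil {
    		fmt.Printf("timeout: failed to connect service %q within 1s: %v\n", ":50051", err)
    		os.Exit(1) // a non-zero exit is what the kubelet records as a probe failure
    	}
    	conn.Close()
    	fmt.Println("service reachable")
    }
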
Oct 12 06:16:12 crc kubenswrapper[4930]: I1012 06:16:12.127992 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqftp" event={"ID":"f749aa11-fbfe-4089-b331-268e4a081b70","Type":"ContainerStarted","Data":"ed38e6838f94203c6af2abe4e015edb3e8567337f0b755cfdb233401baa6e9f7"}
Oct 12 06:16:12 crc kubenswrapper[4930]: I1012 06:16:12.157001 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gqftp" podStartSLOduration=3.622445647 podStartE2EDuration="7.156982637s" podCreationTimestamp="2025-10-12 06:16:05 +0000 UTC" firstStartedPulling="2025-10-12 06:16:08.075491925 +0000 UTC m=+2100.617593710" lastFinishedPulling="2025-10-12 06:16:11.610028895 +0000 UTC m=+2104.152130700" observedRunningTime="2025-10-12 06:16:12.155635263 +0000 UTC m=+2104.697737058" watchObservedRunningTime="2025-10-12 06:16:12.156982637 +0000 UTC m=+2104.699084412"
Oct 12 06:16:16 crc kubenswrapper[4930]: I1012 06:16:16.392234 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gqftp"
Oct 12 06:16:16 crc kubenswrapper[4930]: I1012 06:16:16.392957 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gqftp"
Oct 12 06:16:17 crc kubenswrapper[4930]: I1012 06:16:17.474494 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-gqftp" podUID="f749aa11-fbfe-4089-b331-268e4a081b70" containerName="registry-server" probeResult="failure" output=<
Oct 12 06:16:17 crc kubenswrapper[4930]: timeout: failed to connect service ":50051" within 1s
Oct 12 06:16:17 crc kubenswrapper[4930]: >
Oct 12 06:16:19 crc kubenswrapper[4930]: I1012 06:16:19.736812 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t4g8g" podUID="dd7fdf88-1168-498e-a63a-f4b942911d18" containerName="registry-server" probeResult="failure" output=<
Oct 12 06:16:19 crc kubenswrapper[4930]: timeout: failed to connect service ":50051" within 1s
Oct 12 06:16:19 crc kubenswrapper[4930]: >
Oct 12 06:16:26 crc kubenswrapper[4930]: I1012 06:16:26.462026 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gqftp"
Oct 12 06:16:26 crc kubenswrapper[4930]: I1012 06:16:26.554755 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gqftp"
Oct 12 06:16:26 crc kubenswrapper[4930]: I1012 06:16:26.716842 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gqftp"]
Oct 12 06:16:28 crc kubenswrapper[4930]: I1012 06:16:28.335991 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gqftp" podUID="f749aa11-fbfe-4089-b331-268e4a081b70" containerName="registry-server" containerID="cri-o://ed38e6838f94203c6af2abe4e015edb3e8567337f0b755cfdb233401baa6e9f7" gracePeriod=2
Oct 12 06:16:28 crc kubenswrapper[4930]: I1012 06:16:28.786684 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t4g8g"
Oct 12 06:16:28 crc kubenswrapper[4930]: I1012 06:16:28.855310 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t4g8g"
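
The pod_startup_latency_tracker entry above is internally consistent if podStartSLOduration is read as the end-to-end startup time minus the time spent pulling images (an interpretation of the field names; the tracker's definition is not stated in this log). Using the monotonic m= clock values:

    pull time = 2104.152130700 - 2100.617593710 = 3.534536990 s   (lastFinishedPulling - firstStartedPulling)
    SLO       = 7.156982637   - 3.534536990     = 3.622445647 s   (podStartE2EDuration - pull time)

which matches podStartSLOduration=3.622445647 exactly. The earlier redhat-operators-t4g8g entry checks out the same way: 7.08275903 - 4.587666413 = 2.495092617.
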
Oct 12 06:16:28 crc kubenswrapper[4930]: I1012 06:16:28.897697 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gqftp"
Oct 12 06:16:28 crc kubenswrapper[4930]: I1012 06:16:28.995486 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f749aa11-fbfe-4089-b331-268e4a081b70-utilities\") pod \"f749aa11-fbfe-4089-b331-268e4a081b70\" (UID: \"f749aa11-fbfe-4089-b331-268e4a081b70\") "
Oct 12 06:16:28 crc kubenswrapper[4930]: I1012 06:16:28.996188 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c8hg\" (UniqueName: \"kubernetes.io/projected/f749aa11-fbfe-4089-b331-268e4a081b70-kube-api-access-8c8hg\") pod \"f749aa11-fbfe-4089-b331-268e4a081b70\" (UID: \"f749aa11-fbfe-4089-b331-268e4a081b70\") "
Oct 12 06:16:28 crc kubenswrapper[4930]: I1012 06:16:28.996327 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f749aa11-fbfe-4089-b331-268e4a081b70-catalog-content\") pod \"f749aa11-fbfe-4089-b331-268e4a081b70\" (UID: \"f749aa11-fbfe-4089-b331-268e4a081b70\") "
Oct 12 06:16:28 crc kubenswrapper[4930]: I1012 06:16:28.997131 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f749aa11-fbfe-4089-b331-268e4a081b70-utilities" (OuterVolumeSpecName: "utilities") pod "f749aa11-fbfe-4089-b331-268e4a081b70" (UID: "f749aa11-fbfe-4089-b331-268e4a081b70"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 06:16:29 crc kubenswrapper[4930]: I1012 06:16:29.004156 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f749aa11-fbfe-4089-b331-268e4a081b70-kube-api-access-8c8hg" (OuterVolumeSpecName: "kube-api-access-8c8hg") pod "f749aa11-fbfe-4089-b331-268e4a081b70" (UID: "f749aa11-fbfe-4089-b331-268e4a081b70"). InnerVolumeSpecName "kube-api-access-8c8hg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 06:16:29 crc kubenswrapper[4930]: I1012 06:16:29.043189 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f749aa11-fbfe-4089-b331-268e4a081b70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f749aa11-fbfe-4089-b331-268e4a081b70" (UID: "f749aa11-fbfe-4089-b331-268e4a081b70"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 06:16:29 crc kubenswrapper[4930]: I1012 06:16:29.099333 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c8hg\" (UniqueName: \"kubernetes.io/projected/f749aa11-fbfe-4089-b331-268e4a081b70-kube-api-access-8c8hg\") on node \"crc\" DevicePath \"\""
Oct 12 06:16:29 crc kubenswrapper[4930]: I1012 06:16:29.099376 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f749aa11-fbfe-4089-b331-268e4a081b70-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 12 06:16:29 crc kubenswrapper[4930]: I1012 06:16:29.099386 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f749aa11-fbfe-4089-b331-268e4a081b70-utilities\") on node \"crc\" DevicePath \"\""
Oct 12 06:16:29 crc kubenswrapper[4930]: I1012 06:16:29.118099 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t4g8g"]
Oct 12 06:16:29 crc kubenswrapper[4930]: I1012 06:16:29.353830 4930 generic.go:334] "Generic (PLEG): container finished" podID="f749aa11-fbfe-4089-b331-268e4a081b70" containerID="ed38e6838f94203c6af2abe4e015edb3e8567337f0b755cfdb233401baa6e9f7" exitCode=0
Oct 12 06:16:29 crc kubenswrapper[4930]: I1012 06:16:29.355336 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gqftp"
Oct 12 06:16:29 crc kubenswrapper[4930]: I1012 06:16:29.356951 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqftp" event={"ID":"f749aa11-fbfe-4089-b331-268e4a081b70","Type":"ContainerDied","Data":"ed38e6838f94203c6af2abe4e015edb3e8567337f0b755cfdb233401baa6e9f7"}
Oct 12 06:16:29 crc kubenswrapper[4930]: I1012 06:16:29.357039 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gqftp" event={"ID":"f749aa11-fbfe-4089-b331-268e4a081b70","Type":"ContainerDied","Data":"34a83eb739d96df3cbc6fe121fc4bbdf90e36660308f51eb80b54da86847fa9a"}
Oct 12 06:16:29 crc kubenswrapper[4930]: I1012 06:16:29.357074 4930 scope.go:117] "RemoveContainer" containerID="ed38e6838f94203c6af2abe4e015edb3e8567337f0b755cfdb233401baa6e9f7"
Oct 12 06:16:29 crc kubenswrapper[4930]: I1012 06:16:29.403617 4930 scope.go:117] "RemoveContainer" containerID="bbe0d59f79b9550cf820ad6fd2c77d792e7fa5dc8ae353733bd79ad56bb2af15"
Oct 12 06:16:29 crc kubenswrapper[4930]: I1012 06:16:29.418943 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gqftp"]
Oct 12 06:16:29 crc kubenswrapper[4930]: I1012 06:16:29.430584 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gqftp"]
Oct 12 06:16:29 crc kubenswrapper[4930]: I1012 06:16:29.446289 4930 scope.go:117] "RemoveContainer" containerID="4d87db0ee0f50acb1ffe3c5edbb5fbf598108855d93d690e1b44dfb5ee1bcbaa"
Oct 12 06:16:29 crc kubenswrapper[4930]: I1012 06:16:29.508107 4930 scope.go:117] "RemoveContainer" containerID="ed38e6838f94203c6af2abe4e015edb3e8567337f0b755cfdb233401baa6e9f7"
Oct 12 06:16:29 crc kubenswrapper[4930]: E1012 06:16:29.508896 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed38e6838f94203c6af2abe4e015edb3e8567337f0b755cfdb233401baa6e9f7\": container with ID starting with ed38e6838f94203c6af2abe4e015edb3e8567337f0b755cfdb233401baa6e9f7 not found: ID does not exist" containerID="ed38e6838f94203c6af2abe4e015edb3e8567337f0b755cfdb233401baa6e9f7"
Oct 12 06:16:29 crc kubenswrapper[4930]: I1012 06:16:29.508937 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed38e6838f94203c6af2abe4e015edb3e8567337f0b755cfdb233401baa6e9f7"} err="failed to get container status \"ed38e6838f94203c6af2abe4e015edb3e8567337f0b755cfdb233401baa6e9f7\": rpc error: code = NotFound desc = could not find container \"ed38e6838f94203c6af2abe4e015edb3e8567337f0b755cfdb233401baa6e9f7\": container with ID starting with ed38e6838f94203c6af2abe4e015edb3e8567337f0b755cfdb233401baa6e9f7 not found: ID does not exist"
Oct 12 06:16:29 crc kubenswrapper[4930]: I1012 06:16:29.509146 4930 scope.go:117] "RemoveContainer" containerID="bbe0d59f79b9550cf820ad6fd2c77d792e7fa5dc8ae353733bd79ad56bb2af15"
Oct 12 06:16:29 crc kubenswrapper[4930]: E1012 06:16:29.509840 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbe0d59f79b9550cf820ad6fd2c77d792e7fa5dc8ae353733bd79ad56bb2af15\": container with ID starting with bbe0d59f79b9550cf820ad6fd2c77d792e7fa5dc8ae353733bd79ad56bb2af15 not found: ID does not exist" containerID="bbe0d59f79b9550cf820ad6fd2c77d792e7fa5dc8ae353733bd79ad56bb2af15"
Oct 12 06:16:29 crc kubenswrapper[4930]: I1012 06:16:29.509867 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbe0d59f79b9550cf820ad6fd2c77d792e7fa5dc8ae353733bd79ad56bb2af15"} err="failed to get container status \"bbe0d59f79b9550cf820ad6fd2c77d792e7fa5dc8ae353733bd79ad56bb2af15\": rpc error: code = NotFound desc = could not find container \"bbe0d59f79b9550cf820ad6fd2c77d792e7fa5dc8ae353733bd79ad56bb2af15\": container with ID starting with bbe0d59f79b9550cf820ad6fd2c77d792e7fa5dc8ae353733bd79ad56bb2af15 not found: ID does not exist"
Oct 12 06:16:29 crc kubenswrapper[4930]: I1012 06:16:29.509884 4930 scope.go:117] "RemoveContainer" containerID="4d87db0ee0f50acb1ffe3c5edbb5fbf598108855d93d690e1b44dfb5ee1bcbaa"
Oct 12 06:16:29 crc kubenswrapper[4930]: E1012 06:16:29.510220 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d87db0ee0f50acb1ffe3c5edbb5fbf598108855d93d690e1b44dfb5ee1bcbaa\": container with ID starting with 4d87db0ee0f50acb1ffe3c5edbb5fbf598108855d93d690e1b44dfb5ee1bcbaa not found: ID does not exist" containerID="4d87db0ee0f50acb1ffe3c5edbb5fbf598108855d93d690e1b44dfb5ee1bcbaa"
Oct 12 06:16:29 crc kubenswrapper[4930]: I1012 06:16:29.510266 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d87db0ee0f50acb1ffe3c5edbb5fbf598108855d93d690e1b44dfb5ee1bcbaa"} err="failed to get container status \"4d87db0ee0f50acb1ffe3c5edbb5fbf598108855d93d690e1b44dfb5ee1bcbaa\": rpc error: code = NotFound desc = could not find container \"4d87db0ee0f50acb1ffe3c5edbb5fbf598108855d93d690e1b44dfb5ee1bcbaa\": container with ID starting with 4d87db0ee0f50acb1ffe3c5edbb5fbf598108855d93d690e1b44dfb5ee1bcbaa not found: ID does not exist"
Oct 12 06:16:30 crc kubenswrapper[4930]: I1012 06:16:30.155341 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f749aa11-fbfe-4089-b331-268e4a081b70" path="/var/lib/kubelet/pods/f749aa11-fbfe-4089-b331-268e4a081b70/volumes"
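
The paired "ContainerStatus from runtime service failed" / "DeleteContainer returned error" lines above are the kubelet asking CRI-O about containers it has already removed; the gRPC NotFound code means the deletion is already complete rather than failed. A sketch of that classification using the standard gRPC status helpers (the surrounding retry framing is illustrative, not kubelet source):

    // Classifying a CRI "NotFound" error like the ones logged above.
    package main

    import (
    	"fmt"

    	"google.golang.org/grpc/codes"
    	"google.golang.org/grpc/status"
    )

    // alreadyGone reports whether err is a gRPC NotFound, i.e. the
    // container no longer exists on the runtime side.
    func alreadyGone(err error) bool {
    	if err == nil {
    		return false
    	}
    	if s, ok := status.FromError(err); ok {
    		return s.Code() == codes.NotFound
    	}
    	return false
    }

    func main() {
    	// Stand-in for the error a ContainerStatus call would return.
    	err := status.Error(codes.NotFound, `could not find container "<id>"`)
    	if alreadyGone(err) {
    		fmt.Println("container already removed; treating delete as complete")
    	} else if err != nil {
    		fmt.Println("transient error, retry:", err)
    	}
    }
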
Oct 12 06:16:30 crc kubenswrapper[4930]: I1012 06:16:30.373224 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t4g8g" podUID="dd7fdf88-1168-498e-a63a-f4b942911d18" containerName="registry-server" containerID="cri-o://794dc283677d941c61e5816d927369378132e687335c59d6a0420ca291edd7de" gracePeriod=2
Oct 12 06:16:30 crc kubenswrapper[4930]: I1012 06:16:30.933188 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t4g8g"
Oct 12 06:16:31 crc kubenswrapper[4930]: I1012 06:16:31.062918 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd7fdf88-1168-498e-a63a-f4b942911d18-utilities\") pod \"dd7fdf88-1168-498e-a63a-f4b942911d18\" (UID: \"dd7fdf88-1168-498e-a63a-f4b942911d18\") "
Oct 12 06:16:31 crc kubenswrapper[4930]: I1012 06:16:31.063068 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd7fdf88-1168-498e-a63a-f4b942911d18-catalog-content\") pod \"dd7fdf88-1168-498e-a63a-f4b942911d18\" (UID: \"dd7fdf88-1168-498e-a63a-f4b942911d18\") "
Oct 12 06:16:31 crc kubenswrapper[4930]: I1012 06:16:31.063153 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtm6l\" (UniqueName: \"kubernetes.io/projected/dd7fdf88-1168-498e-a63a-f4b942911d18-kube-api-access-dtm6l\") pod \"dd7fdf88-1168-498e-a63a-f4b942911d18\" (UID: \"dd7fdf88-1168-498e-a63a-f4b942911d18\") "
Oct 12 06:16:31 crc kubenswrapper[4930]: I1012 06:16:31.064581 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd7fdf88-1168-498e-a63a-f4b942911d18-utilities" (OuterVolumeSpecName: "utilities") pod "dd7fdf88-1168-498e-a63a-f4b942911d18" (UID: "dd7fdf88-1168-498e-a63a-f4b942911d18"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 06:16:31 crc kubenswrapper[4930]: I1012 06:16:31.070957 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd7fdf88-1168-498e-a63a-f4b942911d18-kube-api-access-dtm6l" (OuterVolumeSpecName: "kube-api-access-dtm6l") pod "dd7fdf88-1168-498e-a63a-f4b942911d18" (UID: "dd7fdf88-1168-498e-a63a-f4b942911d18"). InnerVolumeSpecName "kube-api-access-dtm6l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 06:16:31 crc kubenswrapper[4930]: I1012 06:16:31.166403 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd7fdf88-1168-498e-a63a-f4b942911d18-utilities\") on node \"crc\" DevicePath \"\""
Oct 12 06:16:31 crc kubenswrapper[4930]: I1012 06:16:31.166456 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtm6l\" (UniqueName: \"kubernetes.io/projected/dd7fdf88-1168-498e-a63a-f4b942911d18-kube-api-access-dtm6l\") on node \"crc\" DevicePath \"\""
Oct 12 06:16:31 crc kubenswrapper[4930]: I1012 06:16:31.166569 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd7fdf88-1168-498e-a63a-f4b942911d18-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd7fdf88-1168-498e-a63a-f4b942911d18" (UID: "dd7fdf88-1168-498e-a63a-f4b942911d18"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 06:16:31 crc kubenswrapper[4930]: I1012 06:16:31.269051 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd7fdf88-1168-498e-a63a-f4b942911d18-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 12 06:16:31 crc kubenswrapper[4930]: I1012 06:16:31.388583 4930 generic.go:334] "Generic (PLEG): container finished" podID="dd7fdf88-1168-498e-a63a-f4b942911d18" containerID="794dc283677d941c61e5816d927369378132e687335c59d6a0420ca291edd7de" exitCode=0
Oct 12 06:16:31 crc kubenswrapper[4930]: I1012 06:16:31.388635 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4g8g" event={"ID":"dd7fdf88-1168-498e-a63a-f4b942911d18","Type":"ContainerDied","Data":"794dc283677d941c61e5816d927369378132e687335c59d6a0420ca291edd7de"}
Oct 12 06:16:31 crc kubenswrapper[4930]: I1012 06:16:31.388703 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t4g8g"
Oct 12 06:16:31 crc kubenswrapper[4930]: I1012 06:16:31.389564 4930 scope.go:117] "RemoveContainer" containerID="794dc283677d941c61e5816d927369378132e687335c59d6a0420ca291edd7de"
Oct 12 06:16:31 crc kubenswrapper[4930]: I1012 06:16:31.389523 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4g8g" event={"ID":"dd7fdf88-1168-498e-a63a-f4b942911d18","Type":"ContainerDied","Data":"1375ae50cef48fa58ca6155853b7fedcb8781631e8cdbc2a56c7331dbacd8f37"}
Oct 12 06:16:31 crc kubenswrapper[4930]: I1012 06:16:31.444685 4930 scope.go:117] "RemoveContainer" containerID="5d610f5bbb219dcc027a2d42ff6ff45bfe8509c041271c7cdb5b68829b07a42c"
Oct 12 06:16:31 crc kubenswrapper[4930]: I1012 06:16:31.460668 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t4g8g"]
Oct 12 06:16:31 crc kubenswrapper[4930]: I1012 06:16:31.473553 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t4g8g"]
Oct 12 06:16:31 crc kubenswrapper[4930]: I1012 06:16:31.474119 4930 scope.go:117] "RemoveContainer" containerID="85fb2707e1ad437e89a244d480764f20d17a27de69a11f27a49be8dfd73e569e"
Oct 12 06:16:31 crc kubenswrapper[4930]: I1012 06:16:31.531187 4930 scope.go:117] "RemoveContainer" containerID="794dc283677d941c61e5816d927369378132e687335c59d6a0420ca291edd7de"
Oct 12 06:16:31 crc kubenswrapper[4930]: E1012 06:16:31.531710 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"794dc283677d941c61e5816d927369378132e687335c59d6a0420ca291edd7de\": container with ID starting with 794dc283677d941c61e5816d927369378132e687335c59d6a0420ca291edd7de not found: ID does not exist" containerID="794dc283677d941c61e5816d927369378132e687335c59d6a0420ca291edd7de"
Oct 12 06:16:31 crc kubenswrapper[4930]: I1012 06:16:31.531783 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"794dc283677d941c61e5816d927369378132e687335c59d6a0420ca291edd7de"} err="failed to get container status \"794dc283677d941c61e5816d927369378132e687335c59d6a0420ca291edd7de\": rpc error: code = NotFound desc = could not find container \"794dc283677d941c61e5816d927369378132e687335c59d6a0420ca291edd7de\": container with ID starting with 794dc283677d941c61e5816d927369378132e687335c59d6a0420ca291edd7de not found: ID does not exist"
Oct 12 06:16:31 crc kubenswrapper[4930]: I1012 06:16:31.531816 4930 scope.go:117] "RemoveContainer" containerID="5d610f5bbb219dcc027a2d42ff6ff45bfe8509c041271c7cdb5b68829b07a42c"
Oct 12 06:16:31 crc kubenswrapper[4930]: E1012 06:16:31.532376 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d610f5bbb219dcc027a2d42ff6ff45bfe8509c041271c7cdb5b68829b07a42c\": container with ID starting with 5d610f5bbb219dcc027a2d42ff6ff45bfe8509c041271c7cdb5b68829b07a42c not found: ID does not exist" containerID="5d610f5bbb219dcc027a2d42ff6ff45bfe8509c041271c7cdb5b68829b07a42c"
Oct 12 06:16:31 crc kubenswrapper[4930]: I1012 06:16:31.532437 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d610f5bbb219dcc027a2d42ff6ff45bfe8509c041271c7cdb5b68829b07a42c"} err="failed to get container status \"5d610f5bbb219dcc027a2d42ff6ff45bfe8509c041271c7cdb5b68829b07a42c\": rpc error: code = NotFound desc = could not find container \"5d610f5bbb219dcc027a2d42ff6ff45bfe8509c041271c7cdb5b68829b07a42c\": container with ID starting with 5d610f5bbb219dcc027a2d42ff6ff45bfe8509c041271c7cdb5b68829b07a42c not found: ID does not exist"
Oct 12 06:16:31 crc kubenswrapper[4930]: I1012 06:16:31.532469 4930 scope.go:117] "RemoveContainer" containerID="85fb2707e1ad437e89a244d480764f20d17a27de69a11f27a49be8dfd73e569e"
Oct 12 06:16:31 crc kubenswrapper[4930]: E1012 06:16:31.532854 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85fb2707e1ad437e89a244d480764f20d17a27de69a11f27a49be8dfd73e569e\": container with ID starting with 85fb2707e1ad437e89a244d480764f20d17a27de69a11f27a49be8dfd73e569e not found: ID does not exist" containerID="85fb2707e1ad437e89a244d480764f20d17a27de69a11f27a49be8dfd73e569e"
Oct 12 06:16:31 crc kubenswrapper[4930]: I1012 06:16:31.532904 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85fb2707e1ad437e89a244d480764f20d17a27de69a11f27a49be8dfd73e569e"} err="failed to get container status \"85fb2707e1ad437e89a244d480764f20d17a27de69a11f27a49be8dfd73e569e\": rpc error: code = NotFound desc = could not find container \"85fb2707e1ad437e89a244d480764f20d17a27de69a11f27a49be8dfd73e569e\": container with ID starting with 85fb2707e1ad437e89a244d480764f20d17a27de69a11f27a49be8dfd73e569e not found: ID does not exist"
Oct 12 06:16:32 crc kubenswrapper[4930]: I1012 06:16:32.161626 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd7fdf88-1168-498e-a63a-f4b942911d18" path="/var/lib/kubelet/pods/dd7fdf88-1168-498e-a63a-f4b942911d18/volumes"
Oct 12 06:16:56 crc kubenswrapper[4930]: I1012 06:16:56.695148 4930 generic.go:334] "Generic (PLEG): container finished" podID="1d14f488-c958-4022-b1db-a1161afad246" containerID="9498ace4b020124237a3ef9fb8bdc3505b98f8cf395c81861da2d2d4bce9a6bf" exitCode=0
Oct 12 06:16:56 crc kubenswrapper[4930]: I1012 06:16:56.695311 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl" event={"ID":"1d14f488-c958-4022-b1db-a1161afad246","Type":"ContainerDied","Data":"9498ace4b020124237a3ef9fb8bdc3505b98f8cf395c81861da2d2d4bce9a6bf"}
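
Throughout these mount and unmount entries, the UniqueName prefixes identify the volume plugins in play: kubernetes.io/empty-dir for utilities and catalog-content, and kubernetes.io/projected for the kube-api-access-* service-account token volume. A hedged reconstruction of the volume set those entries imply, using the k8s.io/api/core/v1 types (everything beyond the names and plugin kinds is an assumption; the actual pod spec is not in this log):

    // Reconstruction of the catalog pod volumes implied by the log.
    package main

    import (
    	"fmt"

    	corev1 "k8s.io/api/core/v1"
    )

    func main() {
    	volumes := []corev1.Volume{
    		{Name: "utilities", VolumeSource: corev1.VolumeSource{
    			EmptyDir: &corev1.EmptyDirVolumeSource{}}},
    		{Name: "catalog-content", VolumeSource: corev1.VolumeSource{
    			EmptyDir: &corev1.EmptyDirVolumeSource{}}},
    		// kube-api-access-* is injected by the API server as a projected
    		// volume (token + CA bundle + namespace); shown empty for brevity.
    		{Name: "kube-api-access-8c8hg", VolumeSource: corev1.VolumeSource{
    			Projected: &corev1.ProjectedVolumeSource{}}},
    	}
    	for _, v := range volumes {
    		fmt.Println(v.Name)
    	}
    }
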
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.208793 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.306462 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1d14f488-c958-4022-b1db-a1161afad246-ovncontroller-config-0\") pod \"1d14f488-c958-4022-b1db-a1161afad246\" (UID: \"1d14f488-c958-4022-b1db-a1161afad246\") "
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.306562 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntvm4\" (UniqueName: \"kubernetes.io/projected/1d14f488-c958-4022-b1db-a1161afad246-kube-api-access-ntvm4\") pod \"1d14f488-c958-4022-b1db-a1161afad246\" (UID: \"1d14f488-c958-4022-b1db-a1161afad246\") "
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.306690 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d14f488-c958-4022-b1db-a1161afad246-inventory\") pod \"1d14f488-c958-4022-b1db-a1161afad246\" (UID: \"1d14f488-c958-4022-b1db-a1161afad246\") "
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.306902 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d14f488-c958-4022-b1db-a1161afad246-ssh-key\") pod \"1d14f488-c958-4022-b1db-a1161afad246\" (UID: \"1d14f488-c958-4022-b1db-a1161afad246\") "
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.306947 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d14f488-c958-4022-b1db-a1161afad246-ovn-combined-ca-bundle\") pod \"1d14f488-c958-4022-b1db-a1161afad246\" (UID: \"1d14f488-c958-4022-b1db-a1161afad246\") "
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.313183 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d14f488-c958-4022-b1db-a1161afad246-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "1d14f488-c958-4022-b1db-a1161afad246" (UID: "1d14f488-c958-4022-b1db-a1161afad246"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.318018 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d14f488-c958-4022-b1db-a1161afad246-kube-api-access-ntvm4" (OuterVolumeSpecName: "kube-api-access-ntvm4") pod "1d14f488-c958-4022-b1db-a1161afad246" (UID: "1d14f488-c958-4022-b1db-a1161afad246"). InnerVolumeSpecName "kube-api-access-ntvm4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.337146 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d14f488-c958-4022-b1db-a1161afad246-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "1d14f488-c958-4022-b1db-a1161afad246" (UID: "1d14f488-c958-4022-b1db-a1161afad246"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.339988 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d14f488-c958-4022-b1db-a1161afad246-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1d14f488-c958-4022-b1db-a1161afad246" (UID: "1d14f488-c958-4022-b1db-a1161afad246"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.340324 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d14f488-c958-4022-b1db-a1161afad246-inventory" (OuterVolumeSpecName: "inventory") pod "1d14f488-c958-4022-b1db-a1161afad246" (UID: "1d14f488-c958-4022-b1db-a1161afad246"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.409878 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntvm4\" (UniqueName: \"kubernetes.io/projected/1d14f488-c958-4022-b1db-a1161afad246-kube-api-access-ntvm4\") on node \"crc\" DevicePath \"\""
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.409910 4930 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d14f488-c958-4022-b1db-a1161afad246-inventory\") on node \"crc\" DevicePath \"\""
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.409919 4930 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d14f488-c958-4022-b1db-a1161afad246-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.409928 4930 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d14f488-c958-4022-b1db-a1161afad246-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.409937 4930 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1d14f488-c958-4022-b1db-a1161afad246-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.719390 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl" event={"ID":"1d14f488-c958-4022-b1db-a1161afad246","Type":"ContainerDied","Data":"6ea9ddfe7c81408e69daeca8a478152ef8c05fb80450f1bbf88e6f3af5cf4049"}
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.719451 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ea9ddfe7c81408e69daeca8a478152ef8c05fb80450f1bbf88e6f3af5cf4049"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.719536 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h52cl"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.871187 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8"]
Oct 12 06:16:58 crc kubenswrapper[4930]: E1012 06:16:58.871646 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f749aa11-fbfe-4089-b331-268e4a081b70" containerName="registry-server"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.871673 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="f749aa11-fbfe-4089-b331-268e4a081b70" containerName="registry-server"
Oct 12 06:16:58 crc kubenswrapper[4930]: E1012 06:16:58.871705 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f749aa11-fbfe-4089-b331-268e4a081b70" containerName="extract-utilities"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.871714 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="f749aa11-fbfe-4089-b331-268e4a081b70" containerName="extract-utilities"
Oct 12 06:16:58 crc kubenswrapper[4930]: E1012 06:16:58.871724 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd7fdf88-1168-498e-a63a-f4b942911d18" containerName="extract-utilities"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.871731 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd7fdf88-1168-498e-a63a-f4b942911d18" containerName="extract-utilities"
Oct 12 06:16:58 crc kubenswrapper[4930]: E1012 06:16:58.871771 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f749aa11-fbfe-4089-b331-268e4a081b70" containerName="extract-content"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.871779 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="f749aa11-fbfe-4089-b331-268e4a081b70" containerName="extract-content"
Oct 12 06:16:58 crc kubenswrapper[4930]: E1012 06:16:58.871793 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd7fdf88-1168-498e-a63a-f4b942911d18" containerName="registry-server"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.871801 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd7fdf88-1168-498e-a63a-f4b942911d18" containerName="registry-server"
Oct 12 06:16:58 crc kubenswrapper[4930]: E1012 06:16:58.871821 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd7fdf88-1168-498e-a63a-f4b942911d18" containerName="extract-content"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.871827 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd7fdf88-1168-498e-a63a-f4b942911d18" containerName="extract-content"
Oct 12 06:16:58 crc kubenswrapper[4930]: E1012 06:16:58.871840 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d14f488-c958-4022-b1db-a1161afad246" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.871847 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d14f488-c958-4022-b1db-a1161afad246" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.872145 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd7fdf88-1168-498e-a63a-f4b942911d18" containerName="registry-server"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.872162 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="f749aa11-fbfe-4089-b331-268e4a081b70" containerName="registry-server"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.872182 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d14f488-c958-4022-b1db-a1161afad246" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.873141 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.877273 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8"]
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.881155 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.881430 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.881549 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vhlxg"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.881633 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.881691 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.882694 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.919151 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8\" (UID: \"31615b46-4290-46db-993e-3e5afa29c3f6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.919501 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jj92\" (UniqueName: \"kubernetes.io/projected/31615b46-4290-46db-993e-3e5afa29c3f6-kube-api-access-4jj92\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8\" (UID: \"31615b46-4290-46db-993e-3e5afa29c3f6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.919661 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8\" (UID: \"31615b46-4290-46db-993e-3e5afa29c3f6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.919850 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8\" (UID: \"31615b46-4290-46db-993e-3e5afa29c3f6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.920462 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8\" (UID: \"31615b46-4290-46db-993e-3e5afa29c3f6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8"
Oct 12 06:16:58 crc kubenswrapper[4930]: I1012 06:16:58.920751 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8\" (UID: \"31615b46-4290-46db-993e-3e5afa29c3f6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8"
Oct 12 06:16:59 crc kubenswrapper[4930]: I1012 06:16:59.023605 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8\" (UID: \"31615b46-4290-46db-993e-3e5afa29c3f6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8"
Oct 12 06:16:59 crc kubenswrapper[4930]: I1012 06:16:59.024184 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8\" (UID: \"31615b46-4290-46db-993e-3e5afa29c3f6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8"
Oct 12 06:16:59 crc kubenswrapper[4930]: I1012 06:16:59.024229 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jj92\" (UniqueName: \"kubernetes.io/projected/31615b46-4290-46db-993e-3e5afa29c3f6-kube-api-access-4jj92\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8\" (UID: \"31615b46-4290-46db-993e-3e5afa29c3f6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8"
Oct 12 06:16:59 crc kubenswrapper[4930]: I1012 06:16:59.024281 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8\" (UID: \"31615b46-4290-46db-993e-3e5afa29c3f6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8"
Oct 12 06:16:59 crc kubenswrapper[4930]: I1012 06:16:59.024367 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8\" (UID: \"31615b46-4290-46db-993e-3e5afa29c3f6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8"
Oct 12 06:16:59 crc kubenswrapper[4930]: I1012 06:16:59.024452 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8\" (UID: \"31615b46-4290-46db-993e-3e5afa29c3f6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8"
Oct 12 06:16:59 crc kubenswrapper[4930]: I1012 06:16:59.029346 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8\" (UID: \"31615b46-4290-46db-993e-3e5afa29c3f6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8"
Oct 12 06:16:59 crc kubenswrapper[4930]: I1012 06:16:59.029582 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8\" (UID: \"31615b46-4290-46db-993e-3e5afa29c3f6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8"
Oct 12 06:16:59 crc kubenswrapper[4930]: I1012 06:16:59.030672 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8\" (UID: \"31615b46-4290-46db-993e-3e5afa29c3f6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8"
Oct 12 06:16:59 crc kubenswrapper[4930]: I1012 06:16:59.034202 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8\" (UID: \"31615b46-4290-46db-993e-3e5afa29c3f6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8"
Oct 12 06:16:59 crc kubenswrapper[4930]: I1012 06:16:59.035393 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8\" (UID: \"31615b46-4290-46db-993e-3e5afa29c3f6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8"
Oct 12 06:16:59 crc kubenswrapper[4930]: I1012 06:16:59.045998 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jj92\" (UniqueName: \"kubernetes.io/projected/31615b46-4290-46db-993e-3e5afa29c3f6-kube-api-access-4jj92\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8\" (UID: \"31615b46-4290-46db-993e-3e5afa29c3f6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8"
Oct 12 06:16:59 crc kubenswrapper[4930]: I1012 06:16:59.210550 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8"
Oct 12 06:16:59 crc kubenswrapper[4930]: I1012 06:16:59.855870 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8"]
Oct 12 06:16:59 crc kubenswrapper[4930]: W1012 06:16:59.861321 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31615b46_4290_46db_993e_3e5afa29c3f6.slice/crio-ce424db0e1576bf30fadc913fa0cc86657ffd9be52e3d4363206c9b71144cc10 WatchSource:0}: Error finding container ce424db0e1576bf30fadc913fa0cc86657ffd9be52e3d4363206c9b71144cc10: Status 404 returned error can't find the container with id ce424db0e1576bf30fadc913fa0cc86657ffd9be52e3d4363206c9b71144cc10
Oct 12 06:17:00 crc kubenswrapper[4930]: I1012 06:17:00.742362 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8" event={"ID":"31615b46-4290-46db-993e-3e5afa29c3f6","Type":"ContainerStarted","Data":"ce424db0e1576bf30fadc913fa0cc86657ffd9be52e3d4363206c9b71144cc10"}
Oct 12 06:17:01 crc kubenswrapper[4930]: I1012 06:17:01.759682 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8" event={"ID":"31615b46-4290-46db-993e-3e5afa29c3f6","Type":"ContainerStarted","Data":"688b5f53b88129f610ee6f2cdbfc8b6a7159107cc0a97eebf4a651fb641f6e1a"}
Oct 12 06:17:01 crc kubenswrapper[4930]: I1012 06:17:01.796105 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8" podStartSLOduration=3.160753644 podStartE2EDuration="3.796080757s" podCreationTimestamp="2025-10-12 06:16:58 +0000 UTC" firstStartedPulling="2025-10-12 06:16:59.866635457 +0000 UTC m=+2152.408737232" lastFinishedPulling="2025-10-12 06:17:00.50196255 +0000 UTC m=+2153.044064345" observedRunningTime="2025-10-12 06:17:01.784765927 +0000 UTC m=+2154.326867772" watchObservedRunningTime="2025-10-12 06:17:01.796080757 +0000 UTC m=+2154.338182562"
Oct 12 06:17:59 crc kubenswrapper[4930]: I1012 06:17:59.491326 4930 generic.go:334] "Generic (PLEG): container finished" podID="31615b46-4290-46db-993e-3e5afa29c3f6" containerID="688b5f53b88129f610ee6f2cdbfc8b6a7159107cc0a97eebf4a651fb641f6e1a" exitCode=0
Oct 12 06:17:59 crc kubenswrapper[4930]: I1012 06:17:59.491429 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8" event={"ID":"31615b46-4290-46db-993e-3e5afa29c3f6","Type":"ContainerDied","Data":"688b5f53b88129f610ee6f2cdbfc8b6a7159107cc0a97eebf4a651fb641f6e1a"}
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.073196 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8"
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.254686 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jj92\" (UniqueName: \"kubernetes.io/projected/31615b46-4290-46db-993e-3e5afa29c3f6-kube-api-access-4jj92\") pod \"31615b46-4290-46db-993e-3e5afa29c3f6\" (UID: \"31615b46-4290-46db-993e-3e5afa29c3f6\") "
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.254814 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-inventory\") pod \"31615b46-4290-46db-993e-3e5afa29c3f6\" (UID: \"31615b46-4290-46db-993e-3e5afa29c3f6\") "
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.254867 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-neutron-metadata-combined-ca-bundle\") pod \"31615b46-4290-46db-993e-3e5afa29c3f6\" (UID: \"31615b46-4290-46db-993e-3e5afa29c3f6\") "
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.254908 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"31615b46-4290-46db-993e-3e5afa29c3f6\" (UID: \"31615b46-4290-46db-993e-3e5afa29c3f6\") "
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.254977 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-ssh-key\") pod \"31615b46-4290-46db-993e-3e5afa29c3f6\" (UID: \"31615b46-4290-46db-993e-3e5afa29c3f6\") "
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.255146 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-nova-metadata-neutron-config-0\") pod \"31615b46-4290-46db-993e-3e5afa29c3f6\" (UID: \"31615b46-4290-46db-993e-3e5afa29c3f6\") "
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.263243 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31615b46-4290-46db-993e-3e5afa29c3f6-kube-api-access-4jj92" (OuterVolumeSpecName: "kube-api-access-4jj92") pod "31615b46-4290-46db-993e-3e5afa29c3f6" (UID: "31615b46-4290-46db-993e-3e5afa29c3f6"). InnerVolumeSpecName "kube-api-access-4jj92". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.267974 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "31615b46-4290-46db-993e-3e5afa29c3f6" (UID: "31615b46-4290-46db-993e-3e5afa29c3f6"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.297438 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "31615b46-4290-46db-993e-3e5afa29c3f6" (UID: "31615b46-4290-46db-993e-3e5afa29c3f6"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.300934 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-inventory" (OuterVolumeSpecName: "inventory") pod "31615b46-4290-46db-993e-3e5afa29c3f6" (UID: "31615b46-4290-46db-993e-3e5afa29c3f6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.319400 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "31615b46-4290-46db-993e-3e5afa29c3f6" (UID: "31615b46-4290-46db-993e-3e5afa29c3f6"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.321316 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "31615b46-4290-46db-993e-3e5afa29c3f6" (UID: "31615b46-4290-46db-993e-3e5afa29c3f6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.357643 4930 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.357680 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jj92\" (UniqueName: \"kubernetes.io/projected/31615b46-4290-46db-993e-3e5afa29c3f6-kube-api-access-4jj92\") on node \"crc\" DevicePath \"\""
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.357696 4930 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-inventory\") on node \"crc\" DevicePath \"\""
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.357708 4930 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.357721 4930 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.357752 4930 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31615b46-4290-46db-993e-3e5afa29c3f6-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.517107 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8" event={"ID":"31615b46-4290-46db-993e-3e5afa29c3f6","Type":"ContainerDied","Data":"ce424db0e1576bf30fadc913fa0cc86657ffd9be52e3d4363206c9b71144cc10"}
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.517569 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce424db0e1576bf30fadc913fa0cc86657ffd9be52e3d4363206c9b71144cc10"
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.517209 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8"
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.652401 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr"]
Oct 12 06:18:01 crc kubenswrapper[4930]: E1012 06:18:01.653067 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31615b46-4290-46db-993e-3e5afa29c3f6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.653098 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="31615b46-4290-46db-993e-3e5afa29c3f6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.653458 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="31615b46-4290-46db-993e-3e5afa29c3f6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.654686 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr" Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.660465 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vhlxg" Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.661529 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.662191 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.662193 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.662268 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.676921 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr"] Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.769678 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c3c7557-a115-43bb-9147-7faf17337317-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr\" (UID: \"1c3c7557-a115-43bb-9147-7faf17337317\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr" Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.771056 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c3c7557-a115-43bb-9147-7faf17337317-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr\" (UID: \"1c3c7557-a115-43bb-9147-7faf17337317\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr" Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.771107 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/1c3c7557-a115-43bb-9147-7faf17337317-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr\" (UID: \"1c3c7557-a115-43bb-9147-7faf17337317\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr" Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.771291 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrd9r\" (UniqueName: \"kubernetes.io/projected/1c3c7557-a115-43bb-9147-7faf17337317-kube-api-access-wrd9r\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr\" (UID: \"1c3c7557-a115-43bb-9147-7faf17337317\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr" Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.771382 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c3c7557-a115-43bb-9147-7faf17337317-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr\" (UID: \"1c3c7557-a115-43bb-9147-7faf17337317\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr" Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.874092 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/1c3c7557-a115-43bb-9147-7faf17337317-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr\" (UID: \"1c3c7557-a115-43bb-9147-7faf17337317\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr" Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.874322 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c3c7557-a115-43bb-9147-7faf17337317-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr\" (UID: \"1c3c7557-a115-43bb-9147-7faf17337317\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr" Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.874388 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c3c7557-a115-43bb-9147-7faf17337317-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr\" (UID: \"1c3c7557-a115-43bb-9147-7faf17337317\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr" Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.874435 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/1c3c7557-a115-43bb-9147-7faf17337317-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr\" (UID: \"1c3c7557-a115-43bb-9147-7faf17337317\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr" Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.874679 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrd9r\" (UniqueName: \"kubernetes.io/projected/1c3c7557-a115-43bb-9147-7faf17337317-kube-api-access-wrd9r\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr\" (UID: \"1c3c7557-a115-43bb-9147-7faf17337317\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr" Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.880653 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c3c7557-a115-43bb-9147-7faf17337317-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr\" (UID: \"1c3c7557-a115-43bb-9147-7faf17337317\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr" Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.880882 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c3c7557-a115-43bb-9147-7faf17337317-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr\" (UID: \"1c3c7557-a115-43bb-9147-7faf17337317\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr" Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.881558 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c3c7557-a115-43bb-9147-7faf17337317-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr\" (UID: \"1c3c7557-a115-43bb-9147-7faf17337317\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr" Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.882724 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/1c3c7557-a115-43bb-9147-7faf17337317-libvirt-secret-0\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr\" (UID: \"1c3c7557-a115-43bb-9147-7faf17337317\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr" Oct 12 06:18:01 crc kubenswrapper[4930]: I1012 06:18:01.895250 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrd9r\" (UniqueName: \"kubernetes.io/projected/1c3c7557-a115-43bb-9147-7faf17337317-kube-api-access-wrd9r\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr\" (UID: \"1c3c7557-a115-43bb-9147-7faf17337317\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr" Oct 12 06:18:02 crc kubenswrapper[4930]: I1012 06:18:02.018293 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr" Oct 12 06:18:02 crc kubenswrapper[4930]: I1012 06:18:02.457346 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr"] Oct 12 06:18:02 crc kubenswrapper[4930]: I1012 06:18:02.529533 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr" event={"ID":"1c3c7557-a115-43bb-9147-7faf17337317","Type":"ContainerStarted","Data":"39231d7b577c4f597018ce2d31a12f59cafcd97ac82ba9a136653b1356ad4366"} Oct 12 06:18:03 crc kubenswrapper[4930]: I1012 06:18:03.543185 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr" event={"ID":"1c3c7557-a115-43bb-9147-7faf17337317","Type":"ContainerStarted","Data":"873ee3811d8dc585de2626e60293ee3021f2f06bdb9525962b0a00b40621cdbc"} Oct 12 06:18:03 crc kubenswrapper[4930]: I1012 06:18:03.572099 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr" podStartSLOduration=2.075850691 podStartE2EDuration="2.572072005s" podCreationTimestamp="2025-10-12 06:18:01 +0000 UTC" firstStartedPulling="2025-10-12 06:18:02.456236997 +0000 UTC m=+2214.998338752" lastFinishedPulling="2025-10-12 06:18:02.952458261 +0000 UTC m=+2215.494560066" observedRunningTime="2025-10-12 06:18:03.567773288 +0000 UTC m=+2216.109875083" watchObservedRunningTime="2025-10-12 06:18:03.572072005 +0000 UTC m=+2216.114173800" Oct 12 06:18:33 crc kubenswrapper[4930]: I1012 06:18:33.668936 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:18:33 crc kubenswrapper[4930]: I1012 06:18:33.671514 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:19:03 crc kubenswrapper[4930]: I1012 06:19:03.669710 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:19:03 crc kubenswrapper[4930]: I1012 06:19:03.670445 4930 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:19:33 crc kubenswrapper[4930]: I1012 06:19:33.669505 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:19:33 crc kubenswrapper[4930]: I1012 06:19:33.670178 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:19:33 crc kubenswrapper[4930]: I1012 06:19:33.670238 4930 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 06:19:33 crc kubenswrapper[4930]: I1012 06:19:33.671266 4930 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9"} pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 06:19:33 crc kubenswrapper[4930]: I1012 06:19:33.671337 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" containerID="cri-o://c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" gracePeriod=600 Oct 12 06:19:33 crc kubenswrapper[4930]: E1012 06:19:33.799414 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:19:34 crc kubenswrapper[4930]: I1012 06:19:34.628919 4930 generic.go:334] "Generic (PLEG): container finished" podID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" exitCode=0 Oct 12 06:19:34 crc kubenswrapper[4930]: I1012 06:19:34.628966 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerDied","Data":"c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9"} Oct 12 06:19:34 crc kubenswrapper[4930]: I1012 06:19:34.629194 4930 scope.go:117] "RemoveContainer" containerID="1bd8424d852b3aed65544bab3c60df461f0881b0c3e1f0ea5ec7fed246751ec8" Oct 12 06:19:34 crc kubenswrapper[4930]: I1012 06:19:34.630279 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" Oct 12 06:19:34 crc kubenswrapper[4930]: E1012 
06:19:34.630687 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:19:47 crc kubenswrapper[4930]: I1012 06:19:47.134830 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" Oct 12 06:19:47 crc kubenswrapper[4930]: E1012 06:19:47.135605 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:19:59 crc kubenswrapper[4930]: I1012 06:19:59.135643 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" Oct 12 06:19:59 crc kubenswrapper[4930]: E1012 06:19:59.137028 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:20:14 crc kubenswrapper[4930]: I1012 06:20:14.136198 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" Oct 12 06:20:14 crc kubenswrapper[4930]: E1012 06:20:14.137478 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:20:27 crc kubenswrapper[4930]: I1012 06:20:27.136025 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" Oct 12 06:20:27 crc kubenswrapper[4930]: E1012 06:20:27.137126 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:20:38 crc kubenswrapper[4930]: I1012 06:20:38.151272 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" Oct 12 06:20:38 crc kubenswrapper[4930]: E1012 06:20:38.152706 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:20:50 crc kubenswrapper[4930]: I1012 06:20:50.135806 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" Oct 12 06:20:50 crc kubenswrapper[4930]: E1012 06:20:50.137407 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:21:03 crc kubenswrapper[4930]: I1012 06:21:03.136458 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" Oct 12 06:21:03 crc kubenswrapper[4930]: E1012 06:21:03.137485 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:21:15 crc kubenswrapper[4930]: I1012 06:21:15.135315 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" Oct 12 06:21:15 crc kubenswrapper[4930]: E1012 06:21:15.136271 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:21:27 crc kubenswrapper[4930]: I1012 06:21:27.135527 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" Oct 12 06:21:27 crc kubenswrapper[4930]: E1012 06:21:27.136806 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:21:41 crc kubenswrapper[4930]: I1012 06:21:41.135466 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" Oct 12 06:21:41 crc kubenswrapper[4930]: E1012 06:21:41.136830 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:21:54 crc kubenswrapper[4930]: I1012 06:21:54.135480 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" Oct 12 06:21:54 crc kubenswrapper[4930]: E1012 06:21:54.136547 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:22:07 crc kubenswrapper[4930]: I1012 06:22:07.135947 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" Oct 12 06:22:07 crc kubenswrapper[4930]: E1012 06:22:07.136997 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:22:20 crc kubenswrapper[4930]: I1012 06:22:20.137246 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" Oct 12 06:22:20 crc kubenswrapper[4930]: E1012 06:22:20.138780 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:22:32 crc kubenswrapper[4930]: I1012 06:22:32.136724 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" Oct 12 06:22:32 crc kubenswrapper[4930]: E1012 06:22:32.137881 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:22:44 crc kubenswrapper[4930]: I1012 06:22:44.136250 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" Oct 12 06:22:44 crc kubenswrapper[4930]: E1012 06:22:44.136980 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:22:46 crc kubenswrapper[4930]: I1012 06:22:46.000093 4930 
generic.go:334] "Generic (PLEG): container finished" podID="1c3c7557-a115-43bb-9147-7faf17337317" containerID="873ee3811d8dc585de2626e60293ee3021f2f06bdb9525962b0a00b40621cdbc" exitCode=0 Oct 12 06:22:46 crc kubenswrapper[4930]: I1012 06:22:46.000241 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr" event={"ID":"1c3c7557-a115-43bb-9147-7faf17337317","Type":"ContainerDied","Data":"873ee3811d8dc585de2626e60293ee3021f2f06bdb9525962b0a00b40621cdbc"} Oct 12 06:22:47 crc kubenswrapper[4930]: I1012 06:22:47.548212 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr" Oct 12 06:22:47 crc kubenswrapper[4930]: I1012 06:22:47.750596 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/1c3c7557-a115-43bb-9147-7faf17337317-libvirt-secret-0\") pod \"1c3c7557-a115-43bb-9147-7faf17337317\" (UID: \"1c3c7557-a115-43bb-9147-7faf17337317\") " Oct 12 06:22:47 crc kubenswrapper[4930]: I1012 06:22:47.750667 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c3c7557-a115-43bb-9147-7faf17337317-inventory\") pod \"1c3c7557-a115-43bb-9147-7faf17337317\" (UID: \"1c3c7557-a115-43bb-9147-7faf17337317\") " Oct 12 06:22:47 crc kubenswrapper[4930]: I1012 06:22:47.750717 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c3c7557-a115-43bb-9147-7faf17337317-ssh-key\") pod \"1c3c7557-a115-43bb-9147-7faf17337317\" (UID: \"1c3c7557-a115-43bb-9147-7faf17337317\") " Oct 12 06:22:47 crc kubenswrapper[4930]: I1012 06:22:47.750963 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c3c7557-a115-43bb-9147-7faf17337317-libvirt-combined-ca-bundle\") pod \"1c3c7557-a115-43bb-9147-7faf17337317\" (UID: \"1c3c7557-a115-43bb-9147-7faf17337317\") " Oct 12 06:22:47 crc kubenswrapper[4930]: I1012 06:22:47.751046 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrd9r\" (UniqueName: \"kubernetes.io/projected/1c3c7557-a115-43bb-9147-7faf17337317-kube-api-access-wrd9r\") pod \"1c3c7557-a115-43bb-9147-7faf17337317\" (UID: \"1c3c7557-a115-43bb-9147-7faf17337317\") " Oct 12 06:22:47 crc kubenswrapper[4930]: I1012 06:22:47.760992 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c3c7557-a115-43bb-9147-7faf17337317-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "1c3c7557-a115-43bb-9147-7faf17337317" (UID: "1c3c7557-a115-43bb-9147-7faf17337317"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:22:47 crc kubenswrapper[4930]: I1012 06:22:47.762175 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c3c7557-a115-43bb-9147-7faf17337317-kube-api-access-wrd9r" (OuterVolumeSpecName: "kube-api-access-wrd9r") pod "1c3c7557-a115-43bb-9147-7faf17337317" (UID: "1c3c7557-a115-43bb-9147-7faf17337317"). InnerVolumeSpecName "kube-api-access-wrd9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:22:47 crc kubenswrapper[4930]: I1012 06:22:47.803419 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c3c7557-a115-43bb-9147-7faf17337317-inventory" (OuterVolumeSpecName: "inventory") pod "1c3c7557-a115-43bb-9147-7faf17337317" (UID: "1c3c7557-a115-43bb-9147-7faf17337317"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:22:47 crc kubenswrapper[4930]: I1012 06:22:47.803474 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c3c7557-a115-43bb-9147-7faf17337317-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1c3c7557-a115-43bb-9147-7faf17337317" (UID: "1c3c7557-a115-43bb-9147-7faf17337317"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:22:47 crc kubenswrapper[4930]: I1012 06:22:47.804030 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c3c7557-a115-43bb-9147-7faf17337317-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "1c3c7557-a115-43bb-9147-7faf17337317" (UID: "1c3c7557-a115-43bb-9147-7faf17337317"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:22:47 crc kubenswrapper[4930]: I1012 06:22:47.854091 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrd9r\" (UniqueName: \"kubernetes.io/projected/1c3c7557-a115-43bb-9147-7faf17337317-kube-api-access-wrd9r\") on node \"crc\" DevicePath \"\"" Oct 12 06:22:47 crc kubenswrapper[4930]: I1012 06:22:47.854127 4930 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/1c3c7557-a115-43bb-9147-7faf17337317-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 12 06:22:47 crc kubenswrapper[4930]: I1012 06:22:47.854140 4930 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c3c7557-a115-43bb-9147-7faf17337317-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 06:22:47 crc kubenswrapper[4930]: I1012 06:22:47.854151 4930 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c3c7557-a115-43bb-9147-7faf17337317-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 06:22:47 crc kubenswrapper[4930]: I1012 06:22:47.854163 4930 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c3c7557-a115-43bb-9147-7faf17337317-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.038464 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr" event={"ID":"1c3c7557-a115-43bb-9147-7faf17337317","Type":"ContainerDied","Data":"39231d7b577c4f597018ce2d31a12f59cafcd97ac82ba9a136653b1356ad4366"} Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.038503 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39231d7b577c4f597018ce2d31a12f59cafcd97ac82ba9a136653b1356ad4366" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.038569 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.214758 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg"] Oct 12 06:22:48 crc kubenswrapper[4930]: E1012 06:22:48.215130 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c3c7557-a115-43bb-9147-7faf17337317" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.215149 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c3c7557-a115-43bb-9147-7faf17337317" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.215350 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c3c7557-a115-43bb-9147-7faf17337317" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.216158 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.221450 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg"] Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.221763 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.222126 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vhlxg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.222348 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.222364 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.222572 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.222591 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.223970 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.308409 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.308480 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.308625 4930 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.308844 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.308887 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.309034 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.309107 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.309163 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.309185 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7zmn\" (UniqueName: \"kubernetes.io/projected/e6afdef5-2b76-470f-9fb1-a98ae115072a-kube-api-access-x7zmn\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.410949 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 
06:22:48.411042 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.411119 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.411157 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7zmn\" (UniqueName: \"kubernetes.io/projected/e6afdef5-2b76-470f-9fb1-a98ae115072a-kube-api-access-x7zmn\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.411272 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.411332 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.411392 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.411492 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.411533 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.412805 4930 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.416724 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.416843 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.417231 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.417492 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.419040 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.423692 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.430632 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.434352 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7zmn\" (UniqueName: 
\"kubernetes.io/projected/e6afdef5-2b76-470f-9fb1-a98ae115072a-kube-api-access-x7zmn\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t8xlg\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:48 crc kubenswrapper[4930]: I1012 06:22:48.534987 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:22:49 crc kubenswrapper[4930]: I1012 06:22:49.131951 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg"] Oct 12 06:22:49 crc kubenswrapper[4930]: W1012 06:22:49.134057 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6afdef5_2b76_470f_9fb1_a98ae115072a.slice/crio-9554c6367c3768d2ce6b7d271ff987eaefb10676c7640739c933a930fa248792 WatchSource:0}: Error finding container 9554c6367c3768d2ce6b7d271ff987eaefb10676c7640739c933a930fa248792: Status 404 returned error can't find the container with id 9554c6367c3768d2ce6b7d271ff987eaefb10676c7640739c933a930fa248792 Oct 12 06:22:49 crc kubenswrapper[4930]: I1012 06:22:49.137519 4930 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 06:22:50 crc kubenswrapper[4930]: I1012 06:22:50.068180 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" event={"ID":"e6afdef5-2b76-470f-9fb1-a98ae115072a","Type":"ContainerStarted","Data":"abe436d8f820160580cc023e973bc7372908d2c9ff509d410abd852c0299bd01"} Oct 12 06:22:50 crc kubenswrapper[4930]: I1012 06:22:50.068472 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" event={"ID":"e6afdef5-2b76-470f-9fb1-a98ae115072a","Type":"ContainerStarted","Data":"9554c6367c3768d2ce6b7d271ff987eaefb10676c7640739c933a930fa248792"} Oct 12 06:22:50 crc kubenswrapper[4930]: I1012 06:22:50.094711 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" podStartSLOduration=1.58004469 podStartE2EDuration="2.09469237s" podCreationTimestamp="2025-10-12 06:22:48 +0000 UTC" firstStartedPulling="2025-10-12 06:22:49.137261118 +0000 UTC m=+2501.679362893" lastFinishedPulling="2025-10-12 06:22:49.651908258 +0000 UTC m=+2502.194010573" observedRunningTime="2025-10-12 06:22:50.087509201 +0000 UTC m=+2502.629610976" watchObservedRunningTime="2025-10-12 06:22:50.09469237 +0000 UTC m=+2502.636794125" Oct 12 06:22:55 crc kubenswrapper[4930]: I1012 06:22:55.136372 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" Oct 12 06:22:55 crc kubenswrapper[4930]: E1012 06:22:55.137491 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:23:09 crc kubenswrapper[4930]: I1012 06:23:09.136325 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" Oct 12 06:23:09 crc kubenswrapper[4930]: E1012 
06:23:09.137450 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:23:23 crc kubenswrapper[4930]: I1012 06:23:23.135806 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" Oct 12 06:23:23 crc kubenswrapper[4930]: E1012 06:23:23.136993 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:23:35 crc kubenswrapper[4930]: I1012 06:23:35.135072 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" Oct 12 06:23:35 crc kubenswrapper[4930]: E1012 06:23:35.135943 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:23:47 crc kubenswrapper[4930]: I1012 06:23:47.794943 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lqxh7"] Oct 12 06:23:47 crc kubenswrapper[4930]: I1012 06:23:47.799174 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lqxh7" Oct 12 06:23:47 crc kubenswrapper[4930]: I1012 06:23:47.853338 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lqxh7"] Oct 12 06:23:47 crc kubenswrapper[4930]: I1012 06:23:47.896190 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d33bda4c-fb8e-4a92-afa3-a9999deba4c7-utilities\") pod \"community-operators-lqxh7\" (UID: \"d33bda4c-fb8e-4a92-afa3-a9999deba4c7\") " pod="openshift-marketplace/community-operators-lqxh7" Oct 12 06:23:47 crc kubenswrapper[4930]: I1012 06:23:47.896303 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wns57\" (UniqueName: \"kubernetes.io/projected/d33bda4c-fb8e-4a92-afa3-a9999deba4c7-kube-api-access-wns57\") pod \"community-operators-lqxh7\" (UID: \"d33bda4c-fb8e-4a92-afa3-a9999deba4c7\") " pod="openshift-marketplace/community-operators-lqxh7" Oct 12 06:23:47 crc kubenswrapper[4930]: I1012 06:23:47.896374 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d33bda4c-fb8e-4a92-afa3-a9999deba4c7-catalog-content\") pod \"community-operators-lqxh7\" (UID: \"d33bda4c-fb8e-4a92-afa3-a9999deba4c7\") " pod="openshift-marketplace/community-operators-lqxh7" Oct 12 06:23:47 crc kubenswrapper[4930]: I1012 06:23:47.998833 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d33bda4c-fb8e-4a92-afa3-a9999deba4c7-utilities\") pod \"community-operators-lqxh7\" (UID: \"d33bda4c-fb8e-4a92-afa3-a9999deba4c7\") " pod="openshift-marketplace/community-operators-lqxh7" Oct 12 06:23:47 crc kubenswrapper[4930]: I1012 06:23:47.998910 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wns57\" (UniqueName: \"kubernetes.io/projected/d33bda4c-fb8e-4a92-afa3-a9999deba4c7-kube-api-access-wns57\") pod \"community-operators-lqxh7\" (UID: \"d33bda4c-fb8e-4a92-afa3-a9999deba4c7\") " pod="openshift-marketplace/community-operators-lqxh7" Oct 12 06:23:47 crc kubenswrapper[4930]: I1012 06:23:47.998955 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d33bda4c-fb8e-4a92-afa3-a9999deba4c7-catalog-content\") pod \"community-operators-lqxh7\" (UID: \"d33bda4c-fb8e-4a92-afa3-a9999deba4c7\") " pod="openshift-marketplace/community-operators-lqxh7" Oct 12 06:23:47 crc kubenswrapper[4930]: I1012 06:23:47.999347 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d33bda4c-fb8e-4a92-afa3-a9999deba4c7-utilities\") pod \"community-operators-lqxh7\" (UID: \"d33bda4c-fb8e-4a92-afa3-a9999deba4c7\") " pod="openshift-marketplace/community-operators-lqxh7" Oct 12 06:23:47 crc kubenswrapper[4930]: I1012 06:23:47.999492 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d33bda4c-fb8e-4a92-afa3-a9999deba4c7-catalog-content\") pod \"community-operators-lqxh7\" (UID: \"d33bda4c-fb8e-4a92-afa3-a9999deba4c7\") " pod="openshift-marketplace/community-operators-lqxh7" Oct 12 06:23:48 crc kubenswrapper[4930]: I1012 06:23:48.031218 4930 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wns57\" (UniqueName: \"kubernetes.io/projected/d33bda4c-fb8e-4a92-afa3-a9999deba4c7-kube-api-access-wns57\") pod \"community-operators-lqxh7\" (UID: \"d33bda4c-fb8e-4a92-afa3-a9999deba4c7\") " pod="openshift-marketplace/community-operators-lqxh7" Oct 12 06:23:48 crc kubenswrapper[4930]: I1012 06:23:48.180672 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lqxh7" Oct 12 06:23:48 crc kubenswrapper[4930]: I1012 06:23:48.701847 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lqxh7"] Oct 12 06:23:48 crc kubenswrapper[4930]: I1012 06:23:48.735541 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqxh7" event={"ID":"d33bda4c-fb8e-4a92-afa3-a9999deba4c7","Type":"ContainerStarted","Data":"e908ada7a455610681a5ef2a0023f4c895425daa1fb0619062b1c52a17b7073f"} Oct 12 06:23:49 crc kubenswrapper[4930]: I1012 06:23:49.135397 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" Oct 12 06:23:49 crc kubenswrapper[4930]: E1012 06:23:49.136255 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:23:49 crc kubenswrapper[4930]: I1012 06:23:49.753972 4930 generic.go:334] "Generic (PLEG): container finished" podID="d33bda4c-fb8e-4a92-afa3-a9999deba4c7" containerID="e82ed498ac04d223ca24171a945bde45791b769122382f8ac0aa58d493d81396" exitCode=0 Oct 12 06:23:49 crc kubenswrapper[4930]: I1012 06:23:49.754050 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqxh7" event={"ID":"d33bda4c-fb8e-4a92-afa3-a9999deba4c7","Type":"ContainerDied","Data":"e82ed498ac04d223ca24171a945bde45791b769122382f8ac0aa58d493d81396"} Oct 12 06:23:50 crc kubenswrapper[4930]: I1012 06:23:50.780146 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqxh7" event={"ID":"d33bda4c-fb8e-4a92-afa3-a9999deba4c7","Type":"ContainerStarted","Data":"4ede882eb12af3de9570770ba247e18545b299369851557f35a2523be60bf6a0"} Oct 12 06:23:51 crc kubenswrapper[4930]: I1012 06:23:51.795277 4930 generic.go:334] "Generic (PLEG): container finished" podID="d33bda4c-fb8e-4a92-afa3-a9999deba4c7" containerID="4ede882eb12af3de9570770ba247e18545b299369851557f35a2523be60bf6a0" exitCode=0 Oct 12 06:23:51 crc kubenswrapper[4930]: I1012 06:23:51.795335 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqxh7" event={"ID":"d33bda4c-fb8e-4a92-afa3-a9999deba4c7","Type":"ContainerDied","Data":"4ede882eb12af3de9570770ba247e18545b299369851557f35a2523be60bf6a0"} Oct 12 06:23:52 crc kubenswrapper[4930]: I1012 06:23:52.814510 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqxh7" event={"ID":"d33bda4c-fb8e-4a92-afa3-a9999deba4c7","Type":"ContainerStarted","Data":"dc4993b8ed0cd8304adee61dcf4d4e79f6ed9c3f72d420ed6d0e60e38ba99758"} Oct 12 06:23:52 crc kubenswrapper[4930]: I1012 06:23:52.846369 4930 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lqxh7" podStartSLOduration=3.353405083 podStartE2EDuration="5.846342711s" podCreationTimestamp="2025-10-12 06:23:47 +0000 UTC" firstStartedPulling="2025-10-12 06:23:49.757160781 +0000 UTC m=+2562.299262586" lastFinishedPulling="2025-10-12 06:23:52.250098439 +0000 UTC m=+2564.792200214" observedRunningTime="2025-10-12 06:23:52.832911897 +0000 UTC m=+2565.375013682" watchObservedRunningTime="2025-10-12 06:23:52.846342711 +0000 UTC m=+2565.388444516" Oct 12 06:23:58 crc kubenswrapper[4930]: I1012 06:23:58.181047 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lqxh7" Oct 12 06:23:58 crc kubenswrapper[4930]: I1012 06:23:58.181701 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lqxh7" Oct 12 06:23:58 crc kubenswrapper[4930]: I1012 06:23:58.261523 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lqxh7" Oct 12 06:23:58 crc kubenswrapper[4930]: I1012 06:23:58.958935 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lqxh7" Oct 12 06:23:59 crc kubenswrapper[4930]: I1012 06:23:59.039059 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lqxh7"] Oct 12 06:24:00 crc kubenswrapper[4930]: I1012 06:24:00.923210 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lqxh7" podUID="d33bda4c-fb8e-4a92-afa3-a9999deba4c7" containerName="registry-server" containerID="cri-o://dc4993b8ed0cd8304adee61dcf4d4e79f6ed9c3f72d420ed6d0e60e38ba99758" gracePeriod=2 Oct 12 06:24:01 crc kubenswrapper[4930]: I1012 06:24:01.435597 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lqxh7" Oct 12 06:24:01 crc kubenswrapper[4930]: I1012 06:24:01.611812 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d33bda4c-fb8e-4a92-afa3-a9999deba4c7-utilities\") pod \"d33bda4c-fb8e-4a92-afa3-a9999deba4c7\" (UID: \"d33bda4c-fb8e-4a92-afa3-a9999deba4c7\") " Oct 12 06:24:01 crc kubenswrapper[4930]: I1012 06:24:01.612009 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d33bda4c-fb8e-4a92-afa3-a9999deba4c7-catalog-content\") pod \"d33bda4c-fb8e-4a92-afa3-a9999deba4c7\" (UID: \"d33bda4c-fb8e-4a92-afa3-a9999deba4c7\") " Oct 12 06:24:01 crc kubenswrapper[4930]: I1012 06:24:01.612107 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wns57\" (UniqueName: \"kubernetes.io/projected/d33bda4c-fb8e-4a92-afa3-a9999deba4c7-kube-api-access-wns57\") pod \"d33bda4c-fb8e-4a92-afa3-a9999deba4c7\" (UID: \"d33bda4c-fb8e-4a92-afa3-a9999deba4c7\") " Oct 12 06:24:01 crc kubenswrapper[4930]: I1012 06:24:01.613706 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d33bda4c-fb8e-4a92-afa3-a9999deba4c7-utilities" (OuterVolumeSpecName: "utilities") pod "d33bda4c-fb8e-4a92-afa3-a9999deba4c7" (UID: "d33bda4c-fb8e-4a92-afa3-a9999deba4c7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:24:01 crc kubenswrapper[4930]: I1012 06:24:01.621112 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d33bda4c-fb8e-4a92-afa3-a9999deba4c7-kube-api-access-wns57" (OuterVolumeSpecName: "kube-api-access-wns57") pod "d33bda4c-fb8e-4a92-afa3-a9999deba4c7" (UID: "d33bda4c-fb8e-4a92-afa3-a9999deba4c7"). InnerVolumeSpecName "kube-api-access-wns57". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:24:01 crc kubenswrapper[4930]: I1012 06:24:01.663632 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d33bda4c-fb8e-4a92-afa3-a9999deba4c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d33bda4c-fb8e-4a92-afa3-a9999deba4c7" (UID: "d33bda4c-fb8e-4a92-afa3-a9999deba4c7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:24:01 crc kubenswrapper[4930]: I1012 06:24:01.714079 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d33bda4c-fb8e-4a92-afa3-a9999deba4c7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 06:24:01 crc kubenswrapper[4930]: I1012 06:24:01.714112 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wns57\" (UniqueName: \"kubernetes.io/projected/d33bda4c-fb8e-4a92-afa3-a9999deba4c7-kube-api-access-wns57\") on node \"crc\" DevicePath \"\"" Oct 12 06:24:01 crc kubenswrapper[4930]: I1012 06:24:01.714124 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d33bda4c-fb8e-4a92-afa3-a9999deba4c7-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 06:24:01 crc kubenswrapper[4930]: I1012 06:24:01.932523 4930 generic.go:334] "Generic (PLEG): container finished" podID="d33bda4c-fb8e-4a92-afa3-a9999deba4c7" containerID="dc4993b8ed0cd8304adee61dcf4d4e79f6ed9c3f72d420ed6d0e60e38ba99758" exitCode=0 Oct 12 06:24:01 crc kubenswrapper[4930]: I1012 06:24:01.932573 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqxh7" event={"ID":"d33bda4c-fb8e-4a92-afa3-a9999deba4c7","Type":"ContainerDied","Data":"dc4993b8ed0cd8304adee61dcf4d4e79f6ed9c3f72d420ed6d0e60e38ba99758"} Oct 12 06:24:01 crc kubenswrapper[4930]: I1012 06:24:01.932600 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lqxh7" Oct 12 06:24:01 crc kubenswrapper[4930]: I1012 06:24:01.932610 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lqxh7" event={"ID":"d33bda4c-fb8e-4a92-afa3-a9999deba4c7","Type":"ContainerDied","Data":"e908ada7a455610681a5ef2a0023f4c895425daa1fb0619062b1c52a17b7073f"} Oct 12 06:24:01 crc kubenswrapper[4930]: I1012 06:24:01.932643 4930 scope.go:117] "RemoveContainer" containerID="dc4993b8ed0cd8304adee61dcf4d4e79f6ed9c3f72d420ed6d0e60e38ba99758" Oct 12 06:24:01 crc kubenswrapper[4930]: I1012 06:24:01.957310 4930 scope.go:117] "RemoveContainer" containerID="4ede882eb12af3de9570770ba247e18545b299369851557f35a2523be60bf6a0" Oct 12 06:24:01 crc kubenswrapper[4930]: I1012 06:24:01.980888 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lqxh7"] Oct 12 06:24:01 crc kubenswrapper[4930]: I1012 06:24:01.993349 4930 scope.go:117] "RemoveContainer" containerID="e82ed498ac04d223ca24171a945bde45791b769122382f8ac0aa58d493d81396" Oct 12 06:24:01 crc kubenswrapper[4930]: I1012 06:24:01.995749 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lqxh7"] Oct 12 06:24:02 crc kubenswrapper[4930]: I1012 06:24:02.058128 4930 scope.go:117] "RemoveContainer" containerID="dc4993b8ed0cd8304adee61dcf4d4e79f6ed9c3f72d420ed6d0e60e38ba99758" Oct 12 06:24:02 crc kubenswrapper[4930]: E1012 06:24:02.058721 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc4993b8ed0cd8304adee61dcf4d4e79f6ed9c3f72d420ed6d0e60e38ba99758\": container with ID starting with dc4993b8ed0cd8304adee61dcf4d4e79f6ed9c3f72d420ed6d0e60e38ba99758 not found: ID does not exist" containerID="dc4993b8ed0cd8304adee61dcf4d4e79f6ed9c3f72d420ed6d0e60e38ba99758" Oct 12 06:24:02 crc kubenswrapper[4930]: I1012 06:24:02.058777 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc4993b8ed0cd8304adee61dcf4d4e79f6ed9c3f72d420ed6d0e60e38ba99758"} err="failed to get container status \"dc4993b8ed0cd8304adee61dcf4d4e79f6ed9c3f72d420ed6d0e60e38ba99758\": rpc error: code = NotFound desc = could not find container \"dc4993b8ed0cd8304adee61dcf4d4e79f6ed9c3f72d420ed6d0e60e38ba99758\": container with ID starting with dc4993b8ed0cd8304adee61dcf4d4e79f6ed9c3f72d420ed6d0e60e38ba99758 not found: ID does not exist" Oct 12 06:24:02 crc kubenswrapper[4930]: I1012 06:24:02.058808 4930 scope.go:117] "RemoveContainer" containerID="4ede882eb12af3de9570770ba247e18545b299369851557f35a2523be60bf6a0" Oct 12 06:24:02 crc kubenswrapper[4930]: E1012 06:24:02.059199 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ede882eb12af3de9570770ba247e18545b299369851557f35a2523be60bf6a0\": container with ID starting with 4ede882eb12af3de9570770ba247e18545b299369851557f35a2523be60bf6a0 not found: ID does not exist" containerID="4ede882eb12af3de9570770ba247e18545b299369851557f35a2523be60bf6a0" Oct 12 06:24:02 crc kubenswrapper[4930]: I1012 06:24:02.059241 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ede882eb12af3de9570770ba247e18545b299369851557f35a2523be60bf6a0"} err="failed to get container status \"4ede882eb12af3de9570770ba247e18545b299369851557f35a2523be60bf6a0\": rpc error: code = NotFound desc = could not find 
container \"4ede882eb12af3de9570770ba247e18545b299369851557f35a2523be60bf6a0\": container with ID starting with 4ede882eb12af3de9570770ba247e18545b299369851557f35a2523be60bf6a0 not found: ID does not exist" Oct 12 06:24:02 crc kubenswrapper[4930]: I1012 06:24:02.059266 4930 scope.go:117] "RemoveContainer" containerID="e82ed498ac04d223ca24171a945bde45791b769122382f8ac0aa58d493d81396" Oct 12 06:24:02 crc kubenswrapper[4930]: E1012 06:24:02.059544 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e82ed498ac04d223ca24171a945bde45791b769122382f8ac0aa58d493d81396\": container with ID starting with e82ed498ac04d223ca24171a945bde45791b769122382f8ac0aa58d493d81396 not found: ID does not exist" containerID="e82ed498ac04d223ca24171a945bde45791b769122382f8ac0aa58d493d81396" Oct 12 06:24:02 crc kubenswrapper[4930]: I1012 06:24:02.059568 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e82ed498ac04d223ca24171a945bde45791b769122382f8ac0aa58d493d81396"} err="failed to get container status \"e82ed498ac04d223ca24171a945bde45791b769122382f8ac0aa58d493d81396\": rpc error: code = NotFound desc = could not find container \"e82ed498ac04d223ca24171a945bde45791b769122382f8ac0aa58d493d81396\": container with ID starting with e82ed498ac04d223ca24171a945bde45791b769122382f8ac0aa58d493d81396 not found: ID does not exist" Oct 12 06:24:02 crc kubenswrapper[4930]: I1012 06:24:02.150412 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d33bda4c-fb8e-4a92-afa3-a9999deba4c7" path="/var/lib/kubelet/pods/d33bda4c-fb8e-4a92-afa3-a9999deba4c7/volumes" Oct 12 06:24:03 crc kubenswrapper[4930]: I1012 06:24:03.135932 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" Oct 12 06:24:03 crc kubenswrapper[4930]: E1012 06:24:03.136569 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:24:18 crc kubenswrapper[4930]: I1012 06:24:18.141700 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" Oct 12 06:24:18 crc kubenswrapper[4930]: E1012 06:24:18.142781 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:24:30 crc kubenswrapper[4930]: I1012 06:24:30.136416 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" Oct 12 06:24:30 crc kubenswrapper[4930]: E1012 06:24:30.137303 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:24:43 crc kubenswrapper[4930]: I1012 06:24:43.135682 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9" Oct 12 06:24:43 crc kubenswrapper[4930]: I1012 06:24:43.459118 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerStarted","Data":"9ab3f87434347d32cda2544985e1ab8e81d9813b575b68284cd0ffe0db076ad9"} Oct 12 06:26:45 crc kubenswrapper[4930]: I1012 06:26:45.006796 4930 generic.go:334] "Generic (PLEG): container finished" podID="e6afdef5-2b76-470f-9fb1-a98ae115072a" containerID="abe436d8f820160580cc023e973bc7372908d2c9ff509d410abd852c0299bd01" exitCode=0 Oct 12 06:26:45 crc kubenswrapper[4930]: I1012 06:26:45.006882 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" event={"ID":"e6afdef5-2b76-470f-9fb1-a98ae115072a","Type":"ContainerDied","Data":"abe436d8f820160580cc023e973bc7372908d2c9ff509d410abd852c0299bd01"} Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.564028 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.747276 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-migration-ssh-key-0\") pod \"e6afdef5-2b76-470f-9fb1-a98ae115072a\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.747356 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-ssh-key\") pod \"e6afdef5-2b76-470f-9fb1-a98ae115072a\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.747390 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-cell1-compute-config-1\") pod \"e6afdef5-2b76-470f-9fb1-a98ae115072a\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.747486 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-combined-ca-bundle\") pod \"e6afdef5-2b76-470f-9fb1-a98ae115072a\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.747541 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-inventory\") pod \"e6afdef5-2b76-470f-9fb1-a98ae115072a\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.747615 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zmn\" (UniqueName: 
\"kubernetes.io/projected/e6afdef5-2b76-470f-9fb1-a98ae115072a-kube-api-access-x7zmn\") pod \"e6afdef5-2b76-470f-9fb1-a98ae115072a\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.747704 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-extra-config-0\") pod \"e6afdef5-2b76-470f-9fb1-a98ae115072a\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.747817 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-cell1-compute-config-0\") pod \"e6afdef5-2b76-470f-9fb1-a98ae115072a\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.747846 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-migration-ssh-key-1\") pod \"e6afdef5-2b76-470f-9fb1-a98ae115072a\" (UID: \"e6afdef5-2b76-470f-9fb1-a98ae115072a\") " Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.757037 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "e6afdef5-2b76-470f-9fb1-a98ae115072a" (UID: "e6afdef5-2b76-470f-9fb1-a98ae115072a"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.763186 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6afdef5-2b76-470f-9fb1-a98ae115072a-kube-api-access-x7zmn" (OuterVolumeSpecName: "kube-api-access-x7zmn") pod "e6afdef5-2b76-470f-9fb1-a98ae115072a" (UID: "e6afdef5-2b76-470f-9fb1-a98ae115072a"). InnerVolumeSpecName "kube-api-access-x7zmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.789097 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "e6afdef5-2b76-470f-9fb1-a98ae115072a" (UID: "e6afdef5-2b76-470f-9fb1-a98ae115072a"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.791720 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "e6afdef5-2b76-470f-9fb1-a98ae115072a" (UID: "e6afdef5-2b76-470f-9fb1-a98ae115072a"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.829991 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e6afdef5-2b76-470f-9fb1-a98ae115072a" (UID: "e6afdef5-2b76-470f-9fb1-a98ae115072a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.830294 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-inventory" (OuterVolumeSpecName: "inventory") pod "e6afdef5-2b76-470f-9fb1-a98ae115072a" (UID: "e6afdef5-2b76-470f-9fb1-a98ae115072a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.837095 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "e6afdef5-2b76-470f-9fb1-a98ae115072a" (UID: "e6afdef5-2b76-470f-9fb1-a98ae115072a"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.841964 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "e6afdef5-2b76-470f-9fb1-a98ae115072a" (UID: "e6afdef5-2b76-470f-9fb1-a98ae115072a"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.859692 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "e6afdef5-2b76-470f-9fb1-a98ae115072a" (UID: "e6afdef5-2b76-470f-9fb1-a98ae115072a"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.870019 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zmn\" (UniqueName: \"kubernetes.io/projected/e6afdef5-2b76-470f-9fb1-a98ae115072a-kube-api-access-x7zmn\") on node \"crc\" DevicePath \"\"" Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.870077 4930 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.870091 4930 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.870106 4930 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.870120 4930 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.870163 4930 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.870177 4930 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.870190 4930 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:26:46 crc kubenswrapper[4930]: I1012 06:26:46.870202 4930 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6afdef5-2b76-470f-9fb1-a98ae115072a-inventory\") on node \"crc\" DevicePath \"\"" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.026979 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" event={"ID":"e6afdef5-2b76-470f-9fb1-a98ae115072a","Type":"ContainerDied","Data":"9554c6367c3768d2ce6b7d271ff987eaefb10676c7640739c933a930fa248792"} Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.027251 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9554c6367c3768d2ce6b7d271ff987eaefb10676c7640739c933a930fa248792" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.027033 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t8xlg" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.127705 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj"] Oct 12 06:26:47 crc kubenswrapper[4930]: E1012 06:26:47.128094 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d33bda4c-fb8e-4a92-afa3-a9999deba4c7" containerName="extract-content" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.128112 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="d33bda4c-fb8e-4a92-afa3-a9999deba4c7" containerName="extract-content" Oct 12 06:26:47 crc kubenswrapper[4930]: E1012 06:26:47.128147 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d33bda4c-fb8e-4a92-afa3-a9999deba4c7" containerName="extract-utilities" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.128153 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="d33bda4c-fb8e-4a92-afa3-a9999deba4c7" containerName="extract-utilities" Oct 12 06:26:47 crc kubenswrapper[4930]: E1012 06:26:47.128168 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d33bda4c-fb8e-4a92-afa3-a9999deba4c7" containerName="registry-server" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.128174 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="d33bda4c-fb8e-4a92-afa3-a9999deba4c7" containerName="registry-server" Oct 12 06:26:47 crc kubenswrapper[4930]: E1012 06:26:47.128184 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6afdef5-2b76-470f-9fb1-a98ae115072a" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.128191 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6afdef5-2b76-470f-9fb1-a98ae115072a" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.128376 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="d33bda4c-fb8e-4a92-afa3-a9999deba4c7" containerName="registry-server" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.128394 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6afdef5-2b76-470f-9fb1-a98ae115072a" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.129051 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.136100 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.136159 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vhlxg" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.136169 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.136320 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.136624 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.137897 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj"] Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.175144 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.175280 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.175298 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.175317 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.175332 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6czlc\" (UniqueName: \"kubernetes.io/projected/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-kube-api-access-6czlc\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" Oct 12 
06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.175377 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.175397 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.277021 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.277102 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.277299 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.277432 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.277486 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.277525 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.277560 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6czlc\" (UniqueName: \"kubernetes.io/projected/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-kube-api-access-6czlc\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.283938 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.284153 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.285679 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.288287 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.289473 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.294016 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.305247 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6czlc\" (UniqueName: \"kubernetes.io/projected/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-kube-api-access-6czlc\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.446733 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" Oct 12 06:26:47 crc kubenswrapper[4930]: I1012 06:26:47.804964 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj"] Oct 12 06:26:48 crc kubenswrapper[4930]: I1012 06:26:48.052078 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" event={"ID":"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc","Type":"ContainerStarted","Data":"4ed42fcb2253292070242408b4fbc8bbd63417ea9053c92563149e93d0243182"} Oct 12 06:26:49 crc kubenswrapper[4930]: I1012 06:26:49.066804 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" event={"ID":"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc","Type":"ContainerStarted","Data":"e3ea49969d68a92368cab21b9b71fd42d139cff85a005f0139ea93ff6b1371d3"} Oct 12 06:26:49 crc kubenswrapper[4930]: I1012 06:26:49.094782 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" podStartSLOduration=1.56324366 podStartE2EDuration="2.094716326s" podCreationTimestamp="2025-10-12 06:26:47 +0000 UTC" firstStartedPulling="2025-10-12 06:26:47.816861753 +0000 UTC m=+2740.358963528" lastFinishedPulling="2025-10-12 06:26:48.348334419 +0000 UTC m=+2740.890436194" observedRunningTime="2025-10-12 06:26:49.0872106 +0000 UTC m=+2741.629312405" watchObservedRunningTime="2025-10-12 06:26:49.094716326 +0000 UTC m=+2741.636818131" Oct 12 06:27:00 crc kubenswrapper[4930]: I1012 06:27:00.827432 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pmtl8"] Oct 12 06:27:00 crc kubenswrapper[4930]: I1012 06:27:00.832231 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pmtl8" Oct 12 06:27:00 crc kubenswrapper[4930]: I1012 06:27:00.857434 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pmtl8"] Oct 12 06:27:00 crc kubenswrapper[4930]: I1012 06:27:00.929458 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fea83cc0-b805-4765-82bd-4625b01aefcc-utilities\") pod \"redhat-operators-pmtl8\" (UID: \"fea83cc0-b805-4765-82bd-4625b01aefcc\") " pod="openshift-marketplace/redhat-operators-pmtl8" Oct 12 06:27:00 crc kubenswrapper[4930]: I1012 06:27:00.929865 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fea83cc0-b805-4765-82bd-4625b01aefcc-catalog-content\") pod \"redhat-operators-pmtl8\" (UID: \"fea83cc0-b805-4765-82bd-4625b01aefcc\") " pod="openshift-marketplace/redhat-operators-pmtl8" Oct 12 06:27:00 crc kubenswrapper[4930]: I1012 06:27:00.930060 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdk78\" (UniqueName: \"kubernetes.io/projected/fea83cc0-b805-4765-82bd-4625b01aefcc-kube-api-access-cdk78\") pod \"redhat-operators-pmtl8\" (UID: \"fea83cc0-b805-4765-82bd-4625b01aefcc\") " pod="openshift-marketplace/redhat-operators-pmtl8" Oct 12 06:27:01 crc kubenswrapper[4930]: I1012 06:27:01.032967 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fea83cc0-b805-4765-82bd-4625b01aefcc-utilities\") pod \"redhat-operators-pmtl8\" (UID: \"fea83cc0-b805-4765-82bd-4625b01aefcc\") " pod="openshift-marketplace/redhat-operators-pmtl8" Oct 12 06:27:01 crc kubenswrapper[4930]: I1012 06:27:01.033427 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fea83cc0-b805-4765-82bd-4625b01aefcc-catalog-content\") pod \"redhat-operators-pmtl8\" (UID: \"fea83cc0-b805-4765-82bd-4625b01aefcc\") " pod="openshift-marketplace/redhat-operators-pmtl8" Oct 12 06:27:01 crc kubenswrapper[4930]: I1012 06:27:01.033505 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdk78\" (UniqueName: \"kubernetes.io/projected/fea83cc0-b805-4765-82bd-4625b01aefcc-kube-api-access-cdk78\") pod \"redhat-operators-pmtl8\" (UID: \"fea83cc0-b805-4765-82bd-4625b01aefcc\") " pod="openshift-marketplace/redhat-operators-pmtl8" Oct 12 06:27:01 crc kubenswrapper[4930]: I1012 06:27:01.033653 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fea83cc0-b805-4765-82bd-4625b01aefcc-utilities\") pod \"redhat-operators-pmtl8\" (UID: \"fea83cc0-b805-4765-82bd-4625b01aefcc\") " pod="openshift-marketplace/redhat-operators-pmtl8" Oct 12 06:27:01 crc kubenswrapper[4930]: I1012 06:27:01.034134 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fea83cc0-b805-4765-82bd-4625b01aefcc-catalog-content\") pod \"redhat-operators-pmtl8\" (UID: \"fea83cc0-b805-4765-82bd-4625b01aefcc\") " pod="openshift-marketplace/redhat-operators-pmtl8" Oct 12 06:27:01 crc kubenswrapper[4930]: I1012 06:27:01.056830 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cdk78\" (UniqueName: \"kubernetes.io/projected/fea83cc0-b805-4765-82bd-4625b01aefcc-kube-api-access-cdk78\") pod \"redhat-operators-pmtl8\" (UID: \"fea83cc0-b805-4765-82bd-4625b01aefcc\") " pod="openshift-marketplace/redhat-operators-pmtl8" Oct 12 06:27:01 crc kubenswrapper[4930]: I1012 06:27:01.182857 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pmtl8" Oct 12 06:27:01 crc kubenswrapper[4930]: I1012 06:27:01.669118 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pmtl8"] Oct 12 06:27:02 crc kubenswrapper[4930]: I1012 06:27:02.233337 4930 generic.go:334] "Generic (PLEG): container finished" podID="fea83cc0-b805-4765-82bd-4625b01aefcc" containerID="7c3538d596b8360e935ebf4c16451644dc2b18c53efa9b12ed94605f788d3aea" exitCode=0 Oct 12 06:27:02 crc kubenswrapper[4930]: I1012 06:27:02.233427 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pmtl8" event={"ID":"fea83cc0-b805-4765-82bd-4625b01aefcc","Type":"ContainerDied","Data":"7c3538d596b8360e935ebf4c16451644dc2b18c53efa9b12ed94605f788d3aea"} Oct 12 06:27:02 crc kubenswrapper[4930]: I1012 06:27:02.233703 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pmtl8" event={"ID":"fea83cc0-b805-4765-82bd-4625b01aefcc","Type":"ContainerStarted","Data":"bd5f55f278f8e73a1ed9520376e96660c23a853c455a134608d96db8a80c94f2"} Oct 12 06:27:03 crc kubenswrapper[4930]: I1012 06:27:03.250871 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pmtl8" event={"ID":"fea83cc0-b805-4765-82bd-4625b01aefcc","Type":"ContainerStarted","Data":"15081a8f1e48ee33d432d841609a04671f3990bb050f957a3917891971744e25"} Oct 12 06:27:03 crc kubenswrapper[4930]: I1012 06:27:03.669316 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:27:03 crc kubenswrapper[4930]: I1012 06:27:03.669728 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:27:06 crc kubenswrapper[4930]: I1012 06:27:06.292367 4930 generic.go:334] "Generic (PLEG): container finished" podID="fea83cc0-b805-4765-82bd-4625b01aefcc" containerID="15081a8f1e48ee33d432d841609a04671f3990bb050f957a3917891971744e25" exitCode=0 Oct 12 06:27:06 crc kubenswrapper[4930]: I1012 06:27:06.292427 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pmtl8" event={"ID":"fea83cc0-b805-4765-82bd-4625b01aefcc","Type":"ContainerDied","Data":"15081a8f1e48ee33d432d841609a04671f3990bb050f957a3917891971744e25"} Oct 12 06:27:07 crc kubenswrapper[4930]: I1012 06:27:07.308189 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pmtl8" event={"ID":"fea83cc0-b805-4765-82bd-4625b01aefcc","Type":"ContainerStarted","Data":"cc460d21c247d2aeb0940a3ae2f5913789ebe845cf647efed77aaa20e61f152f"} Oct 12 06:27:07 crc kubenswrapper[4930]: I1012 06:27:07.338285 4930 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pmtl8" podStartSLOduration=2.673634185 podStartE2EDuration="7.338256889s" podCreationTimestamp="2025-10-12 06:27:00 +0000 UTC" firstStartedPulling="2025-10-12 06:27:02.236610502 +0000 UTC m=+2754.778712308" lastFinishedPulling="2025-10-12 06:27:06.901233247 +0000 UTC m=+2759.443335012" observedRunningTime="2025-10-12 06:27:07.327332248 +0000 UTC m=+2759.869434053" watchObservedRunningTime="2025-10-12 06:27:07.338256889 +0000 UTC m=+2759.880358684" Oct 12 06:27:11 crc kubenswrapper[4930]: I1012 06:27:11.183396 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pmtl8" Oct 12 06:27:11 crc kubenswrapper[4930]: I1012 06:27:11.184862 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pmtl8" Oct 12 06:27:12 crc kubenswrapper[4930]: I1012 06:27:12.266443 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pmtl8" podUID="fea83cc0-b805-4765-82bd-4625b01aefcc" containerName="registry-server" probeResult="failure" output=< Oct 12 06:27:12 crc kubenswrapper[4930]: timeout: failed to connect service ":50051" within 1s Oct 12 06:27:12 crc kubenswrapper[4930]: > Oct 12 06:27:21 crc kubenswrapper[4930]: I1012 06:27:21.276320 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pmtl8" Oct 12 06:27:21 crc kubenswrapper[4930]: I1012 06:27:21.350724 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pmtl8" Oct 12 06:27:21 crc kubenswrapper[4930]: I1012 06:27:21.527671 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pmtl8"] Oct 12 06:27:22 crc kubenswrapper[4930]: I1012 06:27:22.500707 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pmtl8" podUID="fea83cc0-b805-4765-82bd-4625b01aefcc" containerName="registry-server" containerID="cri-o://cc460d21c247d2aeb0940a3ae2f5913789ebe845cf647efed77aaa20e61f152f" gracePeriod=2 Oct 12 06:27:23 crc kubenswrapper[4930]: I1012 06:27:23.116119 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pmtl8" Oct 12 06:27:23 crc kubenswrapper[4930]: I1012 06:27:23.217459 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fea83cc0-b805-4765-82bd-4625b01aefcc-utilities\") pod \"fea83cc0-b805-4765-82bd-4625b01aefcc\" (UID: \"fea83cc0-b805-4765-82bd-4625b01aefcc\") " Oct 12 06:27:23 crc kubenswrapper[4930]: I1012 06:27:23.218198 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdk78\" (UniqueName: \"kubernetes.io/projected/fea83cc0-b805-4765-82bd-4625b01aefcc-kube-api-access-cdk78\") pod \"fea83cc0-b805-4765-82bd-4625b01aefcc\" (UID: \"fea83cc0-b805-4765-82bd-4625b01aefcc\") " Oct 12 06:27:23 crc kubenswrapper[4930]: I1012 06:27:23.218343 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fea83cc0-b805-4765-82bd-4625b01aefcc-catalog-content\") pod \"fea83cc0-b805-4765-82bd-4625b01aefcc\" (UID: \"fea83cc0-b805-4765-82bd-4625b01aefcc\") " Oct 12 06:27:23 crc kubenswrapper[4930]: I1012 06:27:23.218923 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fea83cc0-b805-4765-82bd-4625b01aefcc-utilities" (OuterVolumeSpecName: "utilities") pod "fea83cc0-b805-4765-82bd-4625b01aefcc" (UID: "fea83cc0-b805-4765-82bd-4625b01aefcc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:27:23 crc kubenswrapper[4930]: I1012 06:27:23.219412 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fea83cc0-b805-4765-82bd-4625b01aefcc-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 06:27:23 crc kubenswrapper[4930]: I1012 06:27:23.230953 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fea83cc0-b805-4765-82bd-4625b01aefcc-kube-api-access-cdk78" (OuterVolumeSpecName: "kube-api-access-cdk78") pod "fea83cc0-b805-4765-82bd-4625b01aefcc" (UID: "fea83cc0-b805-4765-82bd-4625b01aefcc"). InnerVolumeSpecName "kube-api-access-cdk78". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:27:23 crc kubenswrapper[4930]: I1012 06:27:23.321094 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdk78\" (UniqueName: \"kubernetes.io/projected/fea83cc0-b805-4765-82bd-4625b01aefcc-kube-api-access-cdk78\") on node \"crc\" DevicePath \"\"" Oct 12 06:27:23 crc kubenswrapper[4930]: I1012 06:27:23.357965 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fea83cc0-b805-4765-82bd-4625b01aefcc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fea83cc0-b805-4765-82bd-4625b01aefcc" (UID: "fea83cc0-b805-4765-82bd-4625b01aefcc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:27:23 crc kubenswrapper[4930]: I1012 06:27:23.423244 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fea83cc0-b805-4765-82bd-4625b01aefcc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 06:27:23 crc kubenswrapper[4930]: I1012 06:27:23.514001 4930 generic.go:334] "Generic (PLEG): container finished" podID="fea83cc0-b805-4765-82bd-4625b01aefcc" containerID="cc460d21c247d2aeb0940a3ae2f5913789ebe845cf647efed77aaa20e61f152f" exitCode=0 Oct 12 06:27:23 crc kubenswrapper[4930]: I1012 06:27:23.514039 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pmtl8" Oct 12 06:27:23 crc kubenswrapper[4930]: I1012 06:27:23.514108 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pmtl8" event={"ID":"fea83cc0-b805-4765-82bd-4625b01aefcc","Type":"ContainerDied","Data":"cc460d21c247d2aeb0940a3ae2f5913789ebe845cf647efed77aaa20e61f152f"} Oct 12 06:27:23 crc kubenswrapper[4930]: I1012 06:27:23.514193 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pmtl8" event={"ID":"fea83cc0-b805-4765-82bd-4625b01aefcc","Type":"ContainerDied","Data":"bd5f55f278f8e73a1ed9520376e96660c23a853c455a134608d96db8a80c94f2"} Oct 12 06:27:23 crc kubenswrapper[4930]: I1012 06:27:23.514242 4930 scope.go:117] "RemoveContainer" containerID="cc460d21c247d2aeb0940a3ae2f5913789ebe845cf647efed77aaa20e61f152f" Oct 12 06:27:23 crc kubenswrapper[4930]: I1012 06:27:23.544644 4930 scope.go:117] "RemoveContainer" containerID="15081a8f1e48ee33d432d841609a04671f3990bb050f957a3917891971744e25" Oct 12 06:27:23 crc kubenswrapper[4930]: I1012 06:27:23.573639 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pmtl8"] Oct 12 06:27:23 crc kubenswrapper[4930]: I1012 06:27:23.578015 4930 scope.go:117] "RemoveContainer" containerID="7c3538d596b8360e935ebf4c16451644dc2b18c53efa9b12ed94605f788d3aea" Oct 12 06:27:23 crc kubenswrapper[4930]: I1012 06:27:23.584815 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pmtl8"] Oct 12 06:27:23 crc kubenswrapper[4930]: I1012 06:27:23.628404 4930 scope.go:117] "RemoveContainer" containerID="cc460d21c247d2aeb0940a3ae2f5913789ebe845cf647efed77aaa20e61f152f" Oct 12 06:27:23 crc kubenswrapper[4930]: E1012 06:27:23.629913 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc460d21c247d2aeb0940a3ae2f5913789ebe845cf647efed77aaa20e61f152f\": container with ID starting with cc460d21c247d2aeb0940a3ae2f5913789ebe845cf647efed77aaa20e61f152f not found: ID does not exist" containerID="cc460d21c247d2aeb0940a3ae2f5913789ebe845cf647efed77aaa20e61f152f" Oct 12 06:27:23 crc kubenswrapper[4930]: I1012 06:27:23.630092 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc460d21c247d2aeb0940a3ae2f5913789ebe845cf647efed77aaa20e61f152f"} err="failed to get container status \"cc460d21c247d2aeb0940a3ae2f5913789ebe845cf647efed77aaa20e61f152f\": rpc error: code = NotFound desc = could not find container \"cc460d21c247d2aeb0940a3ae2f5913789ebe845cf647efed77aaa20e61f152f\": container with ID starting with cc460d21c247d2aeb0940a3ae2f5913789ebe845cf647efed77aaa20e61f152f not found: ID does not exist" Oct 12 06:27:23 crc 
kubenswrapper[4930]: I1012 06:27:23.630246 4930 scope.go:117] "RemoveContainer" containerID="15081a8f1e48ee33d432d841609a04671f3990bb050f957a3917891971744e25" Oct 12 06:27:23 crc kubenswrapper[4930]: E1012 06:27:23.630992 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15081a8f1e48ee33d432d841609a04671f3990bb050f957a3917891971744e25\": container with ID starting with 15081a8f1e48ee33d432d841609a04671f3990bb050f957a3917891971744e25 not found: ID does not exist" containerID="15081a8f1e48ee33d432d841609a04671f3990bb050f957a3917891971744e25" Oct 12 06:27:23 crc kubenswrapper[4930]: I1012 06:27:23.631059 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15081a8f1e48ee33d432d841609a04671f3990bb050f957a3917891971744e25"} err="failed to get container status \"15081a8f1e48ee33d432d841609a04671f3990bb050f957a3917891971744e25\": rpc error: code = NotFound desc = could not find container \"15081a8f1e48ee33d432d841609a04671f3990bb050f957a3917891971744e25\": container with ID starting with 15081a8f1e48ee33d432d841609a04671f3990bb050f957a3917891971744e25 not found: ID does not exist" Oct 12 06:27:23 crc kubenswrapper[4930]: I1012 06:27:23.631102 4930 scope.go:117] "RemoveContainer" containerID="7c3538d596b8360e935ebf4c16451644dc2b18c53efa9b12ed94605f788d3aea" Oct 12 06:27:23 crc kubenswrapper[4930]: E1012 06:27:23.631781 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c3538d596b8360e935ebf4c16451644dc2b18c53efa9b12ed94605f788d3aea\": container with ID starting with 7c3538d596b8360e935ebf4c16451644dc2b18c53efa9b12ed94605f788d3aea not found: ID does not exist" containerID="7c3538d596b8360e935ebf4c16451644dc2b18c53efa9b12ed94605f788d3aea" Oct 12 06:27:23 crc kubenswrapper[4930]: I1012 06:27:23.631841 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c3538d596b8360e935ebf4c16451644dc2b18c53efa9b12ed94605f788d3aea"} err="failed to get container status \"7c3538d596b8360e935ebf4c16451644dc2b18c53efa9b12ed94605f788d3aea\": rpc error: code = NotFound desc = could not find container \"7c3538d596b8360e935ebf4c16451644dc2b18c53efa9b12ed94605f788d3aea\": container with ID starting with 7c3538d596b8360e935ebf4c16451644dc2b18c53efa9b12ed94605f788d3aea not found: ID does not exist" Oct 12 06:27:24 crc kubenswrapper[4930]: I1012 06:27:24.149437 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fea83cc0-b805-4765-82bd-4625b01aefcc" path="/var/lib/kubelet/pods/fea83cc0-b805-4765-82bd-4625b01aefcc/volumes" Oct 12 06:27:33 crc kubenswrapper[4930]: I1012 06:27:33.669699 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:27:33 crc kubenswrapper[4930]: I1012 06:27:33.670364 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:28:03 crc kubenswrapper[4930]: I1012 06:28:03.669160 4930 patch_prober.go:28] interesting 
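The E/I pairs above are the second, idempotent pass of container cleanup: the kubelet asks the runtime for the status of containers it already removed, CRI-O answers NotFound, and pod_container_deletor logs the error and moves on, because the desired state (container gone) already holds. A sketch of that tolerate-NotFound pattern, with a hypothetical client interface standing in for the CRI runtime service:

package main

import (
	"context"
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// remover is a stand-in for whatever client deletes containers; in the
// kubelet the CRI runtime service plays this role.
type remover interface {
	RemoveContainer(ctx context.Context, id string) error
}

// gone always reports NotFound, like the second delete attempt above.
type gone struct{}

func (gone) RemoveContainer(context.Context, string) error {
	return status.Error(codes.NotFound, "ID does not exist")
}

// removeIdempotent treats NotFound as success, mirroring how
// pod_container_deletor.go logs the error and then moves on.
func removeIdempotent(ctx context.Context, r remover, id string) error {
	err := r.RemoveContainer(ctx, id)
	if status.Code(err) == codes.NotFound {
		return nil // already deleted: the desired state holds
	}
	return err
}

func main() {
	if err := removeIdempotent(context.Background(), gone{}, "cc460d21c247"); err != nil {
		fmt.Println("unexpected:", err)
	}
	fmt.Println("cleanup converged")
}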
Oct 12 06:27:33 crc kubenswrapper[4930]: I1012 06:27:33.669699 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 12 06:27:33 crc kubenswrapper[4930]: I1012 06:27:33.670364 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 12 06:28:03 crc kubenswrapper[4930]: I1012 06:28:03.669160 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 12 06:28:03 crc kubenswrapper[4930]: I1012 06:28:03.669593 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 12 06:28:03 crc kubenswrapper[4930]: I1012 06:28:03.669636 4930 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf"
Oct 12 06:28:03 crc kubenswrapper[4930]: I1012 06:28:03.670300 4930 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9ab3f87434347d32cda2544985e1ab8e81d9813b575b68284cd0ffe0db076ad9"} pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 12 06:28:03 crc kubenswrapper[4930]: I1012 06:28:03.670348 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" containerID="cri-o://9ab3f87434347d32cda2544985e1ab8e81d9813b575b68284cd0ffe0db076ad9" gracePeriod=600
Oct 12 06:28:04 crc kubenswrapper[4930]: I1012 06:28:04.025759 4930 generic.go:334] "Generic (PLEG): container finished" podID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerID="9ab3f87434347d32cda2544985e1ab8e81d9813b575b68284cd0ffe0db076ad9" exitCode=0
Oct 12 06:28:04 crc kubenswrapper[4930]: I1012 06:28:04.026013 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerDied","Data":"9ab3f87434347d32cda2544985e1ab8e81d9813b575b68284cd0ffe0db076ad9"}
Oct 12 06:28:04 crc kubenswrapper[4930]: I1012 06:28:04.026044 4930 scope.go:117] "RemoveContainer" containerID="c5afaefc8cc429cd5319bf9dedb80677b518177bff290876a471659f3fc22ce9"
Oct 12 06:28:05 crc kubenswrapper[4930]: I1012 06:28:05.043144 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerStarted","Data":"8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0"}
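This is a complete liveness cycle for machine-config-daemon: repeated probe failures against http://127.0.0.1:8798/health, SyncLoop marks the container unhealthy, kuberuntime kills it with the pod's 600-second grace period, and PLEG reports the replacement started a second later. A toy version of the probe loop; the period and failure threshold are assumptions for illustration (the kubelet reads them from the container spec):

package main

import (
	"fmt"
	"net/http"
	"time"
)

// probe performs one HTTP liveness check: any connection error or a
// status code outside 2xx/3xx counts as a failure.
func probe(url string, timeout time.Duration) error {
	c := http.Client{Timeout: timeout}
	resp, err := c.Get(url)
	if err != nil {
		return err // e.g. "connect: connection refused", as in the log
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("HTTP probe failed with statuscode: %d", resp.StatusCode)
	}
	return nil
}

func main() {
	const (
		url       = "http://127.0.0.1:8798/health" // endpoint from the log
		period    = 30 * time.Second               // assumed probe period
		threshold = 3                              // assumed failureThreshold
	)
	failures := 0
	for range time.Tick(period) {
		if err := probe(url, 5*time.Second); err != nil {
			failures++
			fmt.Printf("Probe failed (%d/%d): %v\n", failures, threshold, err)
			if failures >= threshold {
				fmt.Println("container unhealthy, will be restarted")
				return // the kubelet would now kill with the grace period
			}
			continue
		}
		failures = 0 // only consecutive failures count toward the threshold
	}
}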
containerName="extract-content" Oct 12 06:28:07 crc kubenswrapper[4930]: I1012 06:28:07.227033 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea83cc0-b805-4765-82bd-4625b01aefcc" containerName="extract-content" Oct 12 06:28:07 crc kubenswrapper[4930]: E1012 06:28:07.227075 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea83cc0-b805-4765-82bd-4625b01aefcc" containerName="registry-server" Oct 12 06:28:07 crc kubenswrapper[4930]: I1012 06:28:07.227182 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea83cc0-b805-4765-82bd-4625b01aefcc" containerName="registry-server" Oct 12 06:28:07 crc kubenswrapper[4930]: I1012 06:28:07.227597 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="fea83cc0-b805-4765-82bd-4625b01aefcc" containerName="registry-server" Oct 12 06:28:07 crc kubenswrapper[4930]: I1012 06:28:07.233589 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qv6v4" Oct 12 06:28:07 crc kubenswrapper[4930]: I1012 06:28:07.257618 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qv6v4"] Oct 12 06:28:07 crc kubenswrapper[4930]: I1012 06:28:07.392588 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b5d1385-71a7-405b-97b4-93a2aaff93ba-catalog-content\") pod \"certified-operators-qv6v4\" (UID: \"9b5d1385-71a7-405b-97b4-93a2aaff93ba\") " pod="openshift-marketplace/certified-operators-qv6v4" Oct 12 06:28:07 crc kubenswrapper[4930]: I1012 06:28:07.392962 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwq6m\" (UniqueName: \"kubernetes.io/projected/9b5d1385-71a7-405b-97b4-93a2aaff93ba-kube-api-access-fwq6m\") pod \"certified-operators-qv6v4\" (UID: \"9b5d1385-71a7-405b-97b4-93a2aaff93ba\") " pod="openshift-marketplace/certified-operators-qv6v4" Oct 12 06:28:07 crc kubenswrapper[4930]: I1012 06:28:07.393052 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b5d1385-71a7-405b-97b4-93a2aaff93ba-utilities\") pod \"certified-operators-qv6v4\" (UID: \"9b5d1385-71a7-405b-97b4-93a2aaff93ba\") " pod="openshift-marketplace/certified-operators-qv6v4" Oct 12 06:28:07 crc kubenswrapper[4930]: I1012 06:28:07.494752 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwq6m\" (UniqueName: \"kubernetes.io/projected/9b5d1385-71a7-405b-97b4-93a2aaff93ba-kube-api-access-fwq6m\") pod \"certified-operators-qv6v4\" (UID: \"9b5d1385-71a7-405b-97b4-93a2aaff93ba\") " pod="openshift-marketplace/certified-operators-qv6v4" Oct 12 06:28:07 crc kubenswrapper[4930]: I1012 06:28:07.494810 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b5d1385-71a7-405b-97b4-93a2aaff93ba-utilities\") pod \"certified-operators-qv6v4\" (UID: \"9b5d1385-71a7-405b-97b4-93a2aaff93ba\") " pod="openshift-marketplace/certified-operators-qv6v4" Oct 12 06:28:07 crc kubenswrapper[4930]: I1012 06:28:07.494841 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b5d1385-71a7-405b-97b4-93a2aaff93ba-catalog-content\") pod \"certified-operators-qv6v4\" (UID: 
\"9b5d1385-71a7-405b-97b4-93a2aaff93ba\") " pod="openshift-marketplace/certified-operators-qv6v4" Oct 12 06:28:07 crc kubenswrapper[4930]: I1012 06:28:07.495462 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b5d1385-71a7-405b-97b4-93a2aaff93ba-catalog-content\") pod \"certified-operators-qv6v4\" (UID: \"9b5d1385-71a7-405b-97b4-93a2aaff93ba\") " pod="openshift-marketplace/certified-operators-qv6v4" Oct 12 06:28:07 crc kubenswrapper[4930]: I1012 06:28:07.495841 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b5d1385-71a7-405b-97b4-93a2aaff93ba-utilities\") pod \"certified-operators-qv6v4\" (UID: \"9b5d1385-71a7-405b-97b4-93a2aaff93ba\") " pod="openshift-marketplace/certified-operators-qv6v4" Oct 12 06:28:07 crc kubenswrapper[4930]: I1012 06:28:07.514945 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwq6m\" (UniqueName: \"kubernetes.io/projected/9b5d1385-71a7-405b-97b4-93a2aaff93ba-kube-api-access-fwq6m\") pod \"certified-operators-qv6v4\" (UID: \"9b5d1385-71a7-405b-97b4-93a2aaff93ba\") " pod="openshift-marketplace/certified-operators-qv6v4" Oct 12 06:28:07 crc kubenswrapper[4930]: I1012 06:28:07.576356 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qv6v4" Oct 12 06:28:08 crc kubenswrapper[4930]: I1012 06:28:08.127952 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qv6v4"] Oct 12 06:28:09 crc kubenswrapper[4930]: I1012 06:28:09.098293 4930 generic.go:334] "Generic (PLEG): container finished" podID="9b5d1385-71a7-405b-97b4-93a2aaff93ba" containerID="64d2f097d622d6649a678c6ea345849e6f77bc9fde0bbed021789e6913254b68" exitCode=0 Oct 12 06:28:09 crc kubenswrapper[4930]: I1012 06:28:09.098351 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qv6v4" event={"ID":"9b5d1385-71a7-405b-97b4-93a2aaff93ba","Type":"ContainerDied","Data":"64d2f097d622d6649a678c6ea345849e6f77bc9fde0bbed021789e6913254b68"} Oct 12 06:28:09 crc kubenswrapper[4930]: I1012 06:28:09.098543 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qv6v4" event={"ID":"9b5d1385-71a7-405b-97b4-93a2aaff93ba","Type":"ContainerStarted","Data":"d6acc2e7a683e1e9dbaa492a2c0175d42ffd7fca61e3b572a7d90a03e0b14dfb"} Oct 12 06:28:09 crc kubenswrapper[4930]: I1012 06:28:09.101275 4930 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 06:28:10 crc kubenswrapper[4930]: I1012 06:28:10.107447 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qv6v4" event={"ID":"9b5d1385-71a7-405b-97b4-93a2aaff93ba","Type":"ContainerStarted","Data":"755d12f3de8c59cf858457ca82a1d637607005a41e63d86bf90d29b82fc7ca8d"} Oct 12 06:28:11 crc kubenswrapper[4930]: I1012 06:28:11.122213 4930 generic.go:334] "Generic (PLEG): container finished" podID="9b5d1385-71a7-405b-97b4-93a2aaff93ba" containerID="755d12f3de8c59cf858457ca82a1d637607005a41e63d86bf90d29b82fc7ca8d" exitCode=0 Oct 12 06:28:11 crc kubenswrapper[4930]: I1012 06:28:11.122327 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qv6v4" 
event={"ID":"9b5d1385-71a7-405b-97b4-93a2aaff93ba","Type":"ContainerDied","Data":"755d12f3de8c59cf858457ca82a1d637607005a41e63d86bf90d29b82fc7ca8d"} Oct 12 06:28:11 crc kubenswrapper[4930]: I1012 06:28:11.993019 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9jk7x"] Oct 12 06:28:11 crc kubenswrapper[4930]: I1012 06:28:11.995111 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9jk7x" Oct 12 06:28:12 crc kubenswrapper[4930]: I1012 06:28:12.036842 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9jk7x"] Oct 12 06:28:12 crc kubenswrapper[4930]: I1012 06:28:12.133478 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qv6v4" event={"ID":"9b5d1385-71a7-405b-97b4-93a2aaff93ba","Type":"ContainerStarted","Data":"f391a995415d7611bf42f03a1aa03fa95e648d3010ea914c3baefad1c6d810d3"} Oct 12 06:28:12 crc kubenswrapper[4930]: I1012 06:28:12.158649 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qv6v4" podStartSLOduration=2.717665058 podStartE2EDuration="5.158628445s" podCreationTimestamp="2025-10-12 06:28:07 +0000 UTC" firstStartedPulling="2025-10-12 06:28:09.101002979 +0000 UTC m=+2821.643104744" lastFinishedPulling="2025-10-12 06:28:11.541966366 +0000 UTC m=+2824.084068131" observedRunningTime="2025-10-12 06:28:12.151128589 +0000 UTC m=+2824.693230364" watchObservedRunningTime="2025-10-12 06:28:12.158628445 +0000 UTC m=+2824.700730220" Oct 12 06:28:12 crc kubenswrapper[4930]: I1012 06:28:12.192263 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/059c8bb5-57ed-448e-a318-add0be354986-utilities\") pod \"redhat-marketplace-9jk7x\" (UID: \"059c8bb5-57ed-448e-a318-add0be354986\") " pod="openshift-marketplace/redhat-marketplace-9jk7x" Oct 12 06:28:12 crc kubenswrapper[4930]: I1012 06:28:12.192334 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/059c8bb5-57ed-448e-a318-add0be354986-catalog-content\") pod \"redhat-marketplace-9jk7x\" (UID: \"059c8bb5-57ed-448e-a318-add0be354986\") " pod="openshift-marketplace/redhat-marketplace-9jk7x" Oct 12 06:28:12 crc kubenswrapper[4930]: I1012 06:28:12.192518 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbzc2\" (UniqueName: \"kubernetes.io/projected/059c8bb5-57ed-448e-a318-add0be354986-kube-api-access-jbzc2\") pod \"redhat-marketplace-9jk7x\" (UID: \"059c8bb5-57ed-448e-a318-add0be354986\") " pod="openshift-marketplace/redhat-marketplace-9jk7x" Oct 12 06:28:12 crc kubenswrapper[4930]: I1012 06:28:12.295361 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbzc2\" (UniqueName: \"kubernetes.io/projected/059c8bb5-57ed-448e-a318-add0be354986-kube-api-access-jbzc2\") pod \"redhat-marketplace-9jk7x\" (UID: \"059c8bb5-57ed-448e-a318-add0be354986\") " pod="openshift-marketplace/redhat-marketplace-9jk7x" Oct 12 06:28:12 crc kubenswrapper[4930]: I1012 06:28:12.296040 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/059c8bb5-57ed-448e-a318-add0be354986-utilities\") pod 
\"redhat-marketplace-9jk7x\" (UID: \"059c8bb5-57ed-448e-a318-add0be354986\") " pod="openshift-marketplace/redhat-marketplace-9jk7x" Oct 12 06:28:12 crc kubenswrapper[4930]: I1012 06:28:12.296083 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/059c8bb5-57ed-448e-a318-add0be354986-catalog-content\") pod \"redhat-marketplace-9jk7x\" (UID: \"059c8bb5-57ed-448e-a318-add0be354986\") " pod="openshift-marketplace/redhat-marketplace-9jk7x" Oct 12 06:28:12 crc kubenswrapper[4930]: I1012 06:28:12.296472 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/059c8bb5-57ed-448e-a318-add0be354986-utilities\") pod \"redhat-marketplace-9jk7x\" (UID: \"059c8bb5-57ed-448e-a318-add0be354986\") " pod="openshift-marketplace/redhat-marketplace-9jk7x" Oct 12 06:28:12 crc kubenswrapper[4930]: I1012 06:28:12.296663 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/059c8bb5-57ed-448e-a318-add0be354986-catalog-content\") pod \"redhat-marketplace-9jk7x\" (UID: \"059c8bb5-57ed-448e-a318-add0be354986\") " pod="openshift-marketplace/redhat-marketplace-9jk7x" Oct 12 06:28:12 crc kubenswrapper[4930]: I1012 06:28:12.315968 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbzc2\" (UniqueName: \"kubernetes.io/projected/059c8bb5-57ed-448e-a318-add0be354986-kube-api-access-jbzc2\") pod \"redhat-marketplace-9jk7x\" (UID: \"059c8bb5-57ed-448e-a318-add0be354986\") " pod="openshift-marketplace/redhat-marketplace-9jk7x" Oct 12 06:28:12 crc kubenswrapper[4930]: I1012 06:28:12.610314 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9jk7x" Oct 12 06:28:13 crc kubenswrapper[4930]: I1012 06:28:13.647238 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9jk7x"] Oct 12 06:28:13 crc kubenswrapper[4930]: W1012 06:28:13.655006 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod059c8bb5_57ed_448e_a318_add0be354986.slice/crio-d3d9d0d379f993e838da0abf548bb905d3bdcefb567f6c8ded8750d9cf41bdd4 WatchSource:0}: Error finding container d3d9d0d379f993e838da0abf548bb905d3bdcefb567f6c8ded8750d9cf41bdd4: Status 404 returned error can't find the container with id d3d9d0d379f993e838da0abf548bb905d3bdcefb567f6c8ded8750d9cf41bdd4 Oct 12 06:28:14 crc kubenswrapper[4930]: I1012 06:28:14.158309 4930 generic.go:334] "Generic (PLEG): container finished" podID="059c8bb5-57ed-448e-a318-add0be354986" containerID="5fae32ece74e8e687f6f539e79fc1145e933d3f00ca4ec80b53b0b52ca700823" exitCode=0 Oct 12 06:28:14 crc kubenswrapper[4930]: I1012 06:28:14.158384 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9jk7x" event={"ID":"059c8bb5-57ed-448e-a318-add0be354986","Type":"ContainerDied","Data":"5fae32ece74e8e687f6f539e79fc1145e933d3f00ca4ec80b53b0b52ca700823"} Oct 12 06:28:14 crc kubenswrapper[4930]: I1012 06:28:14.158717 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9jk7x" event={"ID":"059c8bb5-57ed-448e-a318-add0be354986","Type":"ContainerStarted","Data":"d3d9d0d379f993e838da0abf548bb905d3bdcefb567f6c8ded8750d9cf41bdd4"} Oct 12 06:28:15 crc kubenswrapper[4930]: I1012 06:28:15.176933 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9jk7x" event={"ID":"059c8bb5-57ed-448e-a318-add0be354986","Type":"ContainerStarted","Data":"7cb2572df355b8863217a8091ff4d8ddf1bdcb5b3d492327897eb99e71eab4ca"} Oct 12 06:28:16 crc kubenswrapper[4930]: I1012 06:28:16.188413 4930 generic.go:334] "Generic (PLEG): container finished" podID="059c8bb5-57ed-448e-a318-add0be354986" containerID="7cb2572df355b8863217a8091ff4d8ddf1bdcb5b3d492327897eb99e71eab4ca" exitCode=0 Oct 12 06:28:16 crc kubenswrapper[4930]: I1012 06:28:16.188718 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9jk7x" event={"ID":"059c8bb5-57ed-448e-a318-add0be354986","Type":"ContainerDied","Data":"7cb2572df355b8863217a8091ff4d8ddf1bdcb5b3d492327897eb99e71eab4ca"} Oct 12 06:28:17 crc kubenswrapper[4930]: I1012 06:28:17.199940 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9jk7x" event={"ID":"059c8bb5-57ed-448e-a318-add0be354986","Type":"ContainerStarted","Data":"f9203c4d388410b9e6555f6db485d9894a4b761ed2e5fcf3dcf2ae6eedfa0c0d"} Oct 12 06:28:17 crc kubenswrapper[4930]: I1012 06:28:17.223807 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9jk7x" podStartSLOduration=3.696749781 podStartE2EDuration="6.223781505s" podCreationTimestamp="2025-10-12 06:28:11 +0000 UTC" firstStartedPulling="2025-10-12 06:28:14.160514809 +0000 UTC m=+2826.702616584" lastFinishedPulling="2025-10-12 06:28:16.687546533 +0000 UTC m=+2829.229648308" observedRunningTime="2025-10-12 06:28:17.221830127 +0000 UTC m=+2829.763931892" watchObservedRunningTime="2025-10-12 06:28:17.223781505 +0000 UTC m=+2829.765883300" Oct 12 
Oct 12 06:28:17 crc kubenswrapper[4930]: I1012 06:28:17.576571 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qv6v4"
Oct 12 06:28:17 crc kubenswrapper[4930]: I1012 06:28:17.577045 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qv6v4"
Oct 12 06:28:17 crc kubenswrapper[4930]: I1012 06:28:17.662816 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qv6v4"
Oct 12 06:28:18 crc kubenswrapper[4930]: I1012 06:28:18.271689 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qv6v4"
Oct 12 06:28:21 crc kubenswrapper[4930]: I1012 06:28:21.993231 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qv6v4"]
Oct 12 06:28:21 crc kubenswrapper[4930]: I1012 06:28:21.994461 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qv6v4" podUID="9b5d1385-71a7-405b-97b4-93a2aaff93ba" containerName="registry-server" containerID="cri-o://f391a995415d7611bf42f03a1aa03fa95e648d3010ea914c3baefad1c6d810d3" gracePeriod=2
Oct 12 06:28:22 crc kubenswrapper[4930]: I1012 06:28:22.258242 4930 generic.go:334] "Generic (PLEG): container finished" podID="9b5d1385-71a7-405b-97b4-93a2aaff93ba" containerID="f391a995415d7611bf42f03a1aa03fa95e648d3010ea914c3baefad1c6d810d3" exitCode=0
Oct 12 06:28:22 crc kubenswrapper[4930]: I1012 06:28:22.258309 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qv6v4" event={"ID":"9b5d1385-71a7-405b-97b4-93a2aaff93ba","Type":"ContainerDied","Data":"f391a995415d7611bf42f03a1aa03fa95e648d3010ea914c3baefad1c6d810d3"}
Oct 12 06:28:22 crc kubenswrapper[4930]: I1012 06:28:22.552410 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qv6v4"
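Deleting the pod turns into "Killing container with a grace period" with gracePeriod=2: the runtime delivers SIGTERM and escalates to SIGKILL only if the process outlives the window, which is why ContainerDied arrives well inside the two seconds here. A process-level sketch of that escalation, using a local child process as a stand-in for the CRI-O container:

package main

import (
	"fmt"
	"os"
	"os/exec"
	"syscall"
	"time"
)

// killWithGrace mirrors the kubelet's stop sequence: SIGTERM first, then
// SIGKILL once the grace period runs out. Here the target is a local child
// process; for real containers this escalation is the runtime's job.
func killWithGrace(cmd *exec.Cmd, grace time.Duration) error {
	_ = cmd.Process.Signal(syscall.SIGTERM) // polite request to exit

	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()

	select {
	case err := <-done:
		return err // exited within the grace period
	case <-time.After(grace):
		fmt.Fprintln(os.Stderr, "grace period expired, sending SIGKILL")
		_ = cmd.Process.Kill()
		return <-done
	}
}

func main() {
	cmd := exec.Command("sleep", "60") // stand-in for registry-server
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	// gracePeriod=2, the value logged for the marketplace pods above.
	err := killWithGrace(cmd, 2*time.Second)
	fmt.Println("exited:", err)
}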
Oct 12 06:28:22 crc kubenswrapper[4930]: I1012 06:28:22.611006 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9jk7x"
Oct 12 06:28:22 crc kubenswrapper[4930]: I1012 06:28:22.611151 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9jk7x"
Oct 12 06:28:22 crc kubenswrapper[4930]: I1012 06:28:22.672886 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9jk7x"
Oct 12 06:28:22 crc kubenswrapper[4930]: I1012 06:28:22.717677 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b5d1385-71a7-405b-97b4-93a2aaff93ba-utilities\") pod \"9b5d1385-71a7-405b-97b4-93a2aaff93ba\" (UID: \"9b5d1385-71a7-405b-97b4-93a2aaff93ba\") "
Oct 12 06:28:22 crc kubenswrapper[4930]: I1012 06:28:22.717944 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b5d1385-71a7-405b-97b4-93a2aaff93ba-catalog-content\") pod \"9b5d1385-71a7-405b-97b4-93a2aaff93ba\" (UID: \"9b5d1385-71a7-405b-97b4-93a2aaff93ba\") "
Oct 12 06:28:22 crc kubenswrapper[4930]: I1012 06:28:22.718116 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwq6m\" (UniqueName: \"kubernetes.io/projected/9b5d1385-71a7-405b-97b4-93a2aaff93ba-kube-api-access-fwq6m\") pod \"9b5d1385-71a7-405b-97b4-93a2aaff93ba\" (UID: \"9b5d1385-71a7-405b-97b4-93a2aaff93ba\") "
Oct 12 06:28:22 crc kubenswrapper[4930]: I1012 06:28:22.718626 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b5d1385-71a7-405b-97b4-93a2aaff93ba-utilities" (OuterVolumeSpecName: "utilities") pod "9b5d1385-71a7-405b-97b4-93a2aaff93ba" (UID: "9b5d1385-71a7-405b-97b4-93a2aaff93ba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 06:28:22 crc kubenswrapper[4930]: I1012 06:28:22.718787 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b5d1385-71a7-405b-97b4-93a2aaff93ba-utilities\") on node \"crc\" DevicePath \"\""
Oct 12 06:28:22 crc kubenswrapper[4930]: I1012 06:28:22.723302 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b5d1385-71a7-405b-97b4-93a2aaff93ba-kube-api-access-fwq6m" (OuterVolumeSpecName: "kube-api-access-fwq6m") pod "9b5d1385-71a7-405b-97b4-93a2aaff93ba" (UID: "9b5d1385-71a7-405b-97b4-93a2aaff93ba"). InnerVolumeSpecName "kube-api-access-fwq6m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 06:28:22 crc kubenswrapper[4930]: I1012 06:28:22.770219 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b5d1385-71a7-405b-97b4-93a2aaff93ba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b5d1385-71a7-405b-97b4-93a2aaff93ba" (UID: "9b5d1385-71a7-405b-97b4-93a2aaff93ba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 06:28:22 crc kubenswrapper[4930]: I1012 06:28:22.820391 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b5d1385-71a7-405b-97b4-93a2aaff93ba-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 12 06:28:22 crc kubenswrapper[4930]: I1012 06:28:22.820433 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwq6m\" (UniqueName: \"kubernetes.io/projected/9b5d1385-71a7-405b-97b4-93a2aaff93ba-kube-api-access-fwq6m\") on node \"crc\" DevicePath \"\""
Oct 12 06:28:23 crc kubenswrapper[4930]: I1012 06:28:23.274310 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qv6v4" event={"ID":"9b5d1385-71a7-405b-97b4-93a2aaff93ba","Type":"ContainerDied","Data":"d6acc2e7a683e1e9dbaa492a2c0175d42ffd7fca61e3b572a7d90a03e0b14dfb"}
Oct 12 06:28:23 crc kubenswrapper[4930]: I1012 06:28:23.274353 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qv6v4"
Oct 12 06:28:23 crc kubenswrapper[4930]: I1012 06:28:23.274382 4930 scope.go:117] "RemoveContainer" containerID="f391a995415d7611bf42f03a1aa03fa95e648d3010ea914c3baefad1c6d810d3"
Oct 12 06:28:23 crc kubenswrapper[4930]: I1012 06:28:23.314297 4930 scope.go:117] "RemoveContainer" containerID="755d12f3de8c59cf858457ca82a1d637607005a41e63d86bf90d29b82fc7ca8d"
Oct 12 06:28:23 crc kubenswrapper[4930]: I1012 06:28:23.323463 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qv6v4"]
Oct 12 06:28:23 crc kubenswrapper[4930]: I1012 06:28:23.356024 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qv6v4"]
Oct 12 06:28:23 crc kubenswrapper[4930]: I1012 06:28:23.361905 4930 scope.go:117] "RemoveContainer" containerID="64d2f097d622d6649a678c6ea345849e6f77bc9fde0bbed021789e6913254b68"
Oct 12 06:28:23 crc kubenswrapper[4930]: I1012 06:28:23.382420 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9jk7x"
Oct 12 06:28:24 crc kubenswrapper[4930]: I1012 06:28:24.153519 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b5d1385-71a7-405b-97b4-93a2aaff93ba" path="/var/lib/kubelet/pods/9b5d1385-71a7-405b-97b4-93a2aaff93ba/volumes"
Oct 12 06:28:25 crc kubenswrapper[4930]: I1012 06:28:25.189913 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9jk7x"]
Oct 12 06:28:26 crc kubenswrapper[4930]: I1012 06:28:26.314572 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9jk7x" podUID="059c8bb5-57ed-448e-a318-add0be354986" containerName="registry-server" containerID="cri-o://f9203c4d388410b9e6555f6db485d9894a4b761ed2e5fcf3dcf2ae6eedfa0c0d" gracePeriod=2
Oct 12 06:28:26 crc kubenswrapper[4930]: I1012 06:28:26.745467 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9jk7x"
Oct 12 06:28:26 crc kubenswrapper[4930]: I1012 06:28:26.912949 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbzc2\" (UniqueName: \"kubernetes.io/projected/059c8bb5-57ed-448e-a318-add0be354986-kube-api-access-jbzc2\") pod \"059c8bb5-57ed-448e-a318-add0be354986\" (UID: \"059c8bb5-57ed-448e-a318-add0be354986\") "
Oct 12 06:28:26 crc kubenswrapper[4930]: I1012 06:28:26.913040 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/059c8bb5-57ed-448e-a318-add0be354986-catalog-content\") pod \"059c8bb5-57ed-448e-a318-add0be354986\" (UID: \"059c8bb5-57ed-448e-a318-add0be354986\") "
Oct 12 06:28:26 crc kubenswrapper[4930]: I1012 06:28:26.913193 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/059c8bb5-57ed-448e-a318-add0be354986-utilities\") pod \"059c8bb5-57ed-448e-a318-add0be354986\" (UID: \"059c8bb5-57ed-448e-a318-add0be354986\") "
Oct 12 06:28:26 crc kubenswrapper[4930]: I1012 06:28:26.914106 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/059c8bb5-57ed-448e-a318-add0be354986-utilities" (OuterVolumeSpecName: "utilities") pod "059c8bb5-57ed-448e-a318-add0be354986" (UID: "059c8bb5-57ed-448e-a318-add0be354986"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 06:28:26 crc kubenswrapper[4930]: I1012 06:28:26.924342 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/059c8bb5-57ed-448e-a318-add0be354986-kube-api-access-jbzc2" (OuterVolumeSpecName: "kube-api-access-jbzc2") pod "059c8bb5-57ed-448e-a318-add0be354986" (UID: "059c8bb5-57ed-448e-a318-add0be354986"). InnerVolumeSpecName "kube-api-access-jbzc2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 06:28:26 crc kubenswrapper[4930]: I1012 06:28:26.925896 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/059c8bb5-57ed-448e-a318-add0be354986-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "059c8bb5-57ed-448e-a318-add0be354986" (UID: "059c8bb5-57ed-448e-a318-add0be354986"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 06:28:27 crc kubenswrapper[4930]: I1012 06:28:27.015276 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/059c8bb5-57ed-448e-a318-add0be354986-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 12 06:28:27 crc kubenswrapper[4930]: I1012 06:28:27.015315 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/059c8bb5-57ed-448e-a318-add0be354986-utilities\") on node \"crc\" DevicePath \"\""
Oct 12 06:28:27 crc kubenswrapper[4930]: I1012 06:28:27.015329 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbzc2\" (UniqueName: \"kubernetes.io/projected/059c8bb5-57ed-448e-a318-add0be354986-kube-api-access-jbzc2\") on node \"crc\" DevicePath \"\""
Oct 12 06:28:27 crc kubenswrapper[4930]: I1012 06:28:27.330063 4930 generic.go:334] "Generic (PLEG): container finished" podID="059c8bb5-57ed-448e-a318-add0be354986" containerID="f9203c4d388410b9e6555f6db485d9894a4b761ed2e5fcf3dcf2ae6eedfa0c0d" exitCode=0
Oct 12 06:28:27 crc kubenswrapper[4930]: I1012 06:28:27.330120 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9jk7x" event={"ID":"059c8bb5-57ed-448e-a318-add0be354986","Type":"ContainerDied","Data":"f9203c4d388410b9e6555f6db485d9894a4b761ed2e5fcf3dcf2ae6eedfa0c0d"}
Oct 12 06:28:27 crc kubenswrapper[4930]: I1012 06:28:27.330156 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9jk7x" event={"ID":"059c8bb5-57ed-448e-a318-add0be354986","Type":"ContainerDied","Data":"d3d9d0d379f993e838da0abf548bb905d3bdcefb567f6c8ded8750d9cf41bdd4"}
Oct 12 06:28:27 crc kubenswrapper[4930]: I1012 06:28:27.330177 4930 scope.go:117] "RemoveContainer" containerID="f9203c4d388410b9e6555f6db485d9894a4b761ed2e5fcf3dcf2ae6eedfa0c0d"
Oct 12 06:28:27 crc kubenswrapper[4930]: I1012 06:28:27.330189 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9jk7x"
Oct 12 06:28:27 crc kubenswrapper[4930]: I1012 06:28:27.385972 4930 scope.go:117] "RemoveContainer" containerID="7cb2572df355b8863217a8091ff4d8ddf1bdcb5b3d492327897eb99e71eab4ca"
Oct 12 06:28:27 crc kubenswrapper[4930]: I1012 06:28:27.393596 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9jk7x"]
Oct 12 06:28:27 crc kubenswrapper[4930]: I1012 06:28:27.404701 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9jk7x"]
Oct 12 06:28:27 crc kubenswrapper[4930]: I1012 06:28:27.426273 4930 scope.go:117] "RemoveContainer" containerID="5fae32ece74e8e687f6f539e79fc1145e933d3f00ca4ec80b53b0b52ca700823"
Oct 12 06:28:27 crc kubenswrapper[4930]: I1012 06:28:27.482935 4930 scope.go:117] "RemoveContainer" containerID="f9203c4d388410b9e6555f6db485d9894a4b761ed2e5fcf3dcf2ae6eedfa0c0d"
Oct 12 06:28:27 crc kubenswrapper[4930]: E1012 06:28:27.483390 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9203c4d388410b9e6555f6db485d9894a4b761ed2e5fcf3dcf2ae6eedfa0c0d\": container with ID starting with f9203c4d388410b9e6555f6db485d9894a4b761ed2e5fcf3dcf2ae6eedfa0c0d not found: ID does not exist" containerID="f9203c4d388410b9e6555f6db485d9894a4b761ed2e5fcf3dcf2ae6eedfa0c0d"
Oct 12 06:28:27 crc kubenswrapper[4930]: I1012 06:28:27.483437 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9203c4d388410b9e6555f6db485d9894a4b761ed2e5fcf3dcf2ae6eedfa0c0d"} err="failed to get container status \"f9203c4d388410b9e6555f6db485d9894a4b761ed2e5fcf3dcf2ae6eedfa0c0d\": rpc error: code = NotFound desc = could not find container \"f9203c4d388410b9e6555f6db485d9894a4b761ed2e5fcf3dcf2ae6eedfa0c0d\": container with ID starting with f9203c4d388410b9e6555f6db485d9894a4b761ed2e5fcf3dcf2ae6eedfa0c0d not found: ID does not exist"
Oct 12 06:28:27 crc kubenswrapper[4930]: I1012 06:28:27.483471 4930 scope.go:117] "RemoveContainer" containerID="7cb2572df355b8863217a8091ff4d8ddf1bdcb5b3d492327897eb99e71eab4ca"
Oct 12 06:28:27 crc kubenswrapper[4930]: E1012 06:28:27.483784 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cb2572df355b8863217a8091ff4d8ddf1bdcb5b3d492327897eb99e71eab4ca\": container with ID starting with 7cb2572df355b8863217a8091ff4d8ddf1bdcb5b3d492327897eb99e71eab4ca not found: ID does not exist" containerID="7cb2572df355b8863217a8091ff4d8ddf1bdcb5b3d492327897eb99e71eab4ca"
Oct 12 06:28:27 crc kubenswrapper[4930]: I1012 06:28:27.483829 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cb2572df355b8863217a8091ff4d8ddf1bdcb5b3d492327897eb99e71eab4ca"} err="failed to get container status \"7cb2572df355b8863217a8091ff4d8ddf1bdcb5b3d492327897eb99e71eab4ca\": rpc error: code = NotFound desc = could not find container \"7cb2572df355b8863217a8091ff4d8ddf1bdcb5b3d492327897eb99e71eab4ca\": container with ID starting with 7cb2572df355b8863217a8091ff4d8ddf1bdcb5b3d492327897eb99e71eab4ca not found: ID does not exist"
Oct 12 06:28:27 crc kubenswrapper[4930]: I1012 06:28:27.483855 4930 scope.go:117] "RemoveContainer" containerID="5fae32ece74e8e687f6f539e79fc1145e933d3f00ca4ec80b53b0b52ca700823"
Oct 12 06:28:27 crc kubenswrapper[4930]: E1012 06:28:27.484151 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fae32ece74e8e687f6f539e79fc1145e933d3f00ca4ec80b53b0b52ca700823\": container with ID starting with 5fae32ece74e8e687f6f539e79fc1145e933d3f00ca4ec80b53b0b52ca700823 not found: ID does not exist" containerID="5fae32ece74e8e687f6f539e79fc1145e933d3f00ca4ec80b53b0b52ca700823"
Oct 12 06:28:27 crc kubenswrapper[4930]: I1012 06:28:27.484194 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fae32ece74e8e687f6f539e79fc1145e933d3f00ca4ec80b53b0b52ca700823"} err="failed to get container status \"5fae32ece74e8e687f6f539e79fc1145e933d3f00ca4ec80b53b0b52ca700823\": rpc error: code = NotFound desc = could not find container \"5fae32ece74e8e687f6f539e79fc1145e933d3f00ca4ec80b53b0b52ca700823\": container with ID starting with 5fae32ece74e8e687f6f539e79fc1145e933d3f00ca4ec80b53b0b52ca700823 not found: ID does not exist"
Oct 12 06:28:27 crc kubenswrapper[4930]: E1012 06:28:27.497295 4930 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod059c8bb5_57ed_448e_a318_add0be354986.slice\": RecentStats: unable to find data in memory cache]"
Oct 12 06:28:28 crc kubenswrapper[4930]: I1012 06:28:28.159673 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="059c8bb5-57ed-448e-a318-add0be354986" path="/var/lib/kubelet/pods/059c8bb5-57ed-448e-a318-add0be354986/volumes"
Oct 12 06:29:26 crc kubenswrapper[4930]: I1012 06:29:26.052809 4930 generic.go:334] "Generic (PLEG): container finished" podID="fa0905ab-f3dc-41c6-b517-9f9ac23d7adc" containerID="e3ea49969d68a92368cab21b9b71fd42d139cff85a005f0139ea93ff6b1371d3" exitCode=0
Oct 12 06:29:26 crc kubenswrapper[4930]: I1012 06:29:26.052896 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" event={"ID":"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc","Type":"ContainerDied","Data":"e3ea49969d68a92368cab21b9b71fd42d139cff85a005f0139ea93ff6b1371d3"}
Oct 12 06:29:27 crc kubenswrapper[4930]: I1012 06:29:27.616628 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj"
Oct 12 06:29:27 crc kubenswrapper[4930]: I1012 06:29:27.765744 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-ceilometer-compute-config-data-2\") pod \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") "
Oct 12 06:29:27 crc kubenswrapper[4930]: I1012 06:29:27.765840 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-inventory\") pod \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") "
Oct 12 06:29:27 crc kubenswrapper[4930]: I1012 06:29:27.766507 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-ceilometer-compute-config-data-0\") pod \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") "
Oct 12 06:29:27 crc kubenswrapper[4930]: I1012 06:29:27.766581 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-telemetry-combined-ca-bundle\") pod \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") "
Oct 12 06:29:27 crc kubenswrapper[4930]: I1012 06:29:27.766702 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-ceilometer-compute-config-data-1\") pod \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") "
Oct 12 06:29:27 crc kubenswrapper[4930]: I1012 06:29:27.766914 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6czlc\" (UniqueName: \"kubernetes.io/projected/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-kube-api-access-6czlc\") pod \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") "
Oct 12 06:29:27 crc kubenswrapper[4930]: I1012 06:29:27.766967 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-ssh-key\") pod \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\" (UID: \"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc\") "
Oct 12 06:29:27 crc kubenswrapper[4930]: I1012 06:29:27.774271 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "fa0905ab-f3dc-41c6-b517-9f9ac23d7adc" (UID: "fa0905ab-f3dc-41c6-b517-9f9ac23d7adc"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:29:27 crc kubenswrapper[4930]: I1012 06:29:27.776198 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-kube-api-access-6czlc" (OuterVolumeSpecName: "kube-api-access-6czlc") pod "fa0905ab-f3dc-41c6-b517-9f9ac23d7adc" (UID: "fa0905ab-f3dc-41c6-b517-9f9ac23d7adc"). InnerVolumeSpecName "kube-api-access-6czlc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 06:29:27 crc kubenswrapper[4930]: I1012 06:29:27.812560 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fa0905ab-f3dc-41c6-b517-9f9ac23d7adc" (UID: "fa0905ab-f3dc-41c6-b517-9f9ac23d7adc"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:29:27 crc kubenswrapper[4930]: I1012 06:29:27.813292 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-inventory" (OuterVolumeSpecName: "inventory") pod "fa0905ab-f3dc-41c6-b517-9f9ac23d7adc" (UID: "fa0905ab-f3dc-41c6-b517-9f9ac23d7adc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:29:27 crc kubenswrapper[4930]: I1012 06:29:27.822547 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "fa0905ab-f3dc-41c6-b517-9f9ac23d7adc" (UID: "fa0905ab-f3dc-41c6-b517-9f9ac23d7adc"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:29:27 crc kubenswrapper[4930]: I1012 06:29:27.823302 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "fa0905ab-f3dc-41c6-b517-9f9ac23d7adc" (UID: "fa0905ab-f3dc-41c6-b517-9f9ac23d7adc"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:29:27 crc kubenswrapper[4930]: I1012 06:29:27.831364 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "fa0905ab-f3dc-41c6-b517-9f9ac23d7adc" (UID: "fa0905ab-f3dc-41c6-b517-9f9ac23d7adc"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:29:27 crc kubenswrapper[4930]: I1012 06:29:27.869313 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6czlc\" (UniqueName: \"kubernetes.io/projected/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-kube-api-access-6czlc\") on node \"crc\" DevicePath \"\""
Oct 12 06:29:27 crc kubenswrapper[4930]: I1012 06:29:27.869357 4930 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 12 06:29:27 crc kubenswrapper[4930]: I1012 06:29:27.869372 4930 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Oct 12 06:29:27 crc kubenswrapper[4930]: I1012 06:29:27.869386 4930 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-inventory\") on node \"crc\" DevicePath \"\""
Oct 12 06:29:27 crc kubenswrapper[4930]: I1012 06:29:27.869401 4930 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Oct 12 06:29:27 crc kubenswrapper[4930]: I1012 06:29:27.869466 4930 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 12 06:29:27 crc kubenswrapper[4930]: I1012 06:29:27.869480 4930 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/fa0905ab-f3dc-41c6-b517-9f9ac23d7adc-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Oct 12 06:29:28 crc kubenswrapper[4930]: I1012 06:29:28.086335 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj" event={"ID":"fa0905ab-f3dc-41c6-b517-9f9ac23d7adc","Type":"ContainerDied","Data":"4ed42fcb2253292070242408b4fbc8bbd63417ea9053c92563149e93d0243182"}
Oct 12 06:29:28 crc kubenswrapper[4930]: I1012 06:29:28.086398 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ed42fcb2253292070242408b4fbc8bbd63417ea9053c92563149e93d0243182"
Oct 12 06:29:28 crc kubenswrapper[4930]: I1012 06:29:28.086477 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj"
Oct 12 06:29:45 crc kubenswrapper[4930]: I1012 06:29:45.083056 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-7d8c9db847-bqfrb" podUID="4ed14594-beb5-4ce3-bf04-4a9299a932be" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502"
Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.204700 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337510-4lkdj"]
Oct 12 06:30:00 crc kubenswrapper[4930]: E1012 06:30:00.205528 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b5d1385-71a7-405b-97b4-93a2aaff93ba" containerName="registry-server"
Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.205540 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b5d1385-71a7-405b-97b4-93a2aaff93ba" containerName="registry-server"
Oct 12 06:30:00 crc kubenswrapper[4930]: E1012 06:30:00.205571 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="059c8bb5-57ed-448e-a318-add0be354986" containerName="registry-server"
Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.205577 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="059c8bb5-57ed-448e-a318-add0be354986" containerName="registry-server"
Oct 12 06:30:00 crc kubenswrapper[4930]: E1012 06:30:00.205589 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa0905ab-f3dc-41c6-b517-9f9ac23d7adc" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.205596 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa0905ab-f3dc-41c6-b517-9f9ac23d7adc" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Oct 12 06:30:00 crc kubenswrapper[4930]: E1012 06:30:00.205608 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="059c8bb5-57ed-448e-a318-add0be354986" containerName="extract-content"
Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.205614 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="059c8bb5-57ed-448e-a318-add0be354986" containerName="extract-content"
Oct 12 06:30:00 crc kubenswrapper[4930]: E1012 06:30:00.205621 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b5d1385-71a7-405b-97b4-93a2aaff93ba" containerName="extract-content"
Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.205628 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b5d1385-71a7-405b-97b4-93a2aaff93ba" containerName="extract-content"
Oct 12 06:30:00 crc kubenswrapper[4930]: E1012 06:30:00.205646 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="059c8bb5-57ed-448e-a318-add0be354986" containerName="extract-utilities"
Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.205652 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="059c8bb5-57ed-448e-a318-add0be354986" containerName="extract-utilities"
Oct 12 06:30:00 crc kubenswrapper[4930]: E1012 06:30:00.205661 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b5d1385-71a7-405b-97b4-93a2aaff93ba" containerName="extract-utilities"
Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.205667 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b5d1385-71a7-405b-97b4-93a2aaff93ba" containerName="extract-utilities"
Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.205857 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="059c8bb5-57ed-448e-a318-add0be354986" containerName="registry-server"
Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.205872 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b5d1385-71a7-405b-97b4-93a2aaff93ba" containerName="registry-server"
Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.205886 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa0905ab-f3dc-41c6-b517-9f9ac23d7adc" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
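The cpu_manager/state_mem/memory_manager burst above is admission-time housekeeping: before admitting collect-profiles-29337510-4lkdj, RemoveStaleState sweeps CPU and memory assignments still checkpointed for containers of pods that no longer exist (the two marketplace pods and the telemetry job torn down earlier in this log); the error-level lines are part of that routine sweep. A sketch of the pattern, assuming a simple in-memory assignment map keyed by pod UID and container name (the values are illustrative, not real checkpoints):

package main

import "fmt"

type key struct{ podUID, container string }

// removeStaleState drops assignments for any pod the kubelet no longer
// tracks, the way cpu_manager and memory_manager sweep their state before
// admitting a new pod. assignments stands in for the checkpointed state.
func removeStaleState(assignments map[key]string, active map[string]bool) {
	for k := range assignments {
		if !active[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container %q of pod %s\n", k.container, k.podUID)
			delete(assignments, k) // deleting while ranging is safe in Go
		}
	}
}

func main() {
	assignments := map[key]string{
		{"9b5d1385-71a7-405b-97b4-93a2aaff93ba", "registry-server"}: "cpus 2-3", // hypothetical
		{"059c8bb5-57ed-448e-a318-add0be354986", "registry-server"}: "cpus 0-1", // hypothetical
	}
	removeStaleState(assignments, map[string]bool{ /* neither pod is active */ })
	fmt.Println("remaining assignments:", len(assignments))
}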
removing state" podUID="059c8bb5-57ed-448e-a318-add0be354986" containerName="registry-server" Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.205872 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b5d1385-71a7-405b-97b4-93a2aaff93ba" containerName="registry-server" Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.205886 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa0905ab-f3dc-41c6-b517-9f9ac23d7adc" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.206536 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337510-4lkdj" Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.210494 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.210727 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.225946 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337510-4lkdj"] Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.356919 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4867q\" (UniqueName: \"kubernetes.io/projected/5e257f2a-ca74-43c1-bc58-ed297d75567c-kube-api-access-4867q\") pod \"collect-profiles-29337510-4lkdj\" (UID: \"5e257f2a-ca74-43c1-bc58-ed297d75567c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337510-4lkdj" Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.356964 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e257f2a-ca74-43c1-bc58-ed297d75567c-config-volume\") pod \"collect-profiles-29337510-4lkdj\" (UID: \"5e257f2a-ca74-43c1-bc58-ed297d75567c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337510-4lkdj" Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.357357 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e257f2a-ca74-43c1-bc58-ed297d75567c-secret-volume\") pod \"collect-profiles-29337510-4lkdj\" (UID: \"5e257f2a-ca74-43c1-bc58-ed297d75567c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337510-4lkdj" Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.460099 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4867q\" (UniqueName: \"kubernetes.io/projected/5e257f2a-ca74-43c1-bc58-ed297d75567c-kube-api-access-4867q\") pod \"collect-profiles-29337510-4lkdj\" (UID: \"5e257f2a-ca74-43c1-bc58-ed297d75567c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337510-4lkdj" Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.460518 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e257f2a-ca74-43c1-bc58-ed297d75567c-config-volume\") pod \"collect-profiles-29337510-4lkdj\" (UID: \"5e257f2a-ca74-43c1-bc58-ed297d75567c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337510-4lkdj" Oct 12 06:30:00 crc 
kubenswrapper[4930]: I1012 06:30:00.460854 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e257f2a-ca74-43c1-bc58-ed297d75567c-secret-volume\") pod \"collect-profiles-29337510-4lkdj\" (UID: \"5e257f2a-ca74-43c1-bc58-ed297d75567c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337510-4lkdj" Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.462281 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e257f2a-ca74-43c1-bc58-ed297d75567c-config-volume\") pod \"collect-profiles-29337510-4lkdj\" (UID: \"5e257f2a-ca74-43c1-bc58-ed297d75567c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337510-4lkdj" Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.482613 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e257f2a-ca74-43c1-bc58-ed297d75567c-secret-volume\") pod \"collect-profiles-29337510-4lkdj\" (UID: \"5e257f2a-ca74-43c1-bc58-ed297d75567c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337510-4lkdj" Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.493620 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4867q\" (UniqueName: \"kubernetes.io/projected/5e257f2a-ca74-43c1-bc58-ed297d75567c-kube-api-access-4867q\") pod \"collect-profiles-29337510-4lkdj\" (UID: \"5e257f2a-ca74-43c1-bc58-ed297d75567c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337510-4lkdj" Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.530955 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337510-4lkdj" Oct 12 06:30:00 crc kubenswrapper[4930]: I1012 06:30:00.873023 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337510-4lkdj"] Oct 12 06:30:01 crc kubenswrapper[4930]: I1012 06:30:01.532780 4930 generic.go:334] "Generic (PLEG): container finished" podID="5e257f2a-ca74-43c1-bc58-ed297d75567c" containerID="771ca82c2d62c80b3674215b00d0b217fc5b66ad7f479526714ced3458198596" exitCode=0 Oct 12 06:30:01 crc kubenswrapper[4930]: I1012 06:30:01.532828 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337510-4lkdj" event={"ID":"5e257f2a-ca74-43c1-bc58-ed297d75567c","Type":"ContainerDied","Data":"771ca82c2d62c80b3674215b00d0b217fc5b66ad7f479526714ced3458198596"} Oct 12 06:30:01 crc kubenswrapper[4930]: I1012 06:30:01.532858 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337510-4lkdj" event={"ID":"5e257f2a-ca74-43c1-bc58-ed297d75567c","Type":"ContainerStarted","Data":"3e03a3201eed9925d2af31e2faac06f79c8d06916cdcfa41191a966479cea917"} Oct 12 06:30:02 crc kubenswrapper[4930]: I1012 06:30:02.954323 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337510-4lkdj" Oct 12 06:30:03 crc kubenswrapper[4930]: I1012 06:30:03.121969 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4867q\" (UniqueName: \"kubernetes.io/projected/5e257f2a-ca74-43c1-bc58-ed297d75567c-kube-api-access-4867q\") pod \"5e257f2a-ca74-43c1-bc58-ed297d75567c\" (UID: \"5e257f2a-ca74-43c1-bc58-ed297d75567c\") " Oct 12 06:30:03 crc kubenswrapper[4930]: I1012 06:30:03.122116 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e257f2a-ca74-43c1-bc58-ed297d75567c-secret-volume\") pod \"5e257f2a-ca74-43c1-bc58-ed297d75567c\" (UID: \"5e257f2a-ca74-43c1-bc58-ed297d75567c\") " Oct 12 06:30:03 crc kubenswrapper[4930]: I1012 06:30:03.122371 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e257f2a-ca74-43c1-bc58-ed297d75567c-config-volume\") pod \"5e257f2a-ca74-43c1-bc58-ed297d75567c\" (UID: \"5e257f2a-ca74-43c1-bc58-ed297d75567c\") " Oct 12 06:30:03 crc kubenswrapper[4930]: I1012 06:30:03.123923 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e257f2a-ca74-43c1-bc58-ed297d75567c-config-volume" (OuterVolumeSpecName: "config-volume") pod "5e257f2a-ca74-43c1-bc58-ed297d75567c" (UID: "5e257f2a-ca74-43c1-bc58-ed297d75567c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:30:03 crc kubenswrapper[4930]: I1012 06:30:03.129727 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e257f2a-ca74-43c1-bc58-ed297d75567c-kube-api-access-4867q" (OuterVolumeSpecName: "kube-api-access-4867q") pod "5e257f2a-ca74-43c1-bc58-ed297d75567c" (UID: "5e257f2a-ca74-43c1-bc58-ed297d75567c"). InnerVolumeSpecName "kube-api-access-4867q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:30:03 crc kubenswrapper[4930]: I1012 06:30:03.132828 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e257f2a-ca74-43c1-bc58-ed297d75567c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5e257f2a-ca74-43c1-bc58-ed297d75567c" (UID: "5e257f2a-ca74-43c1-bc58-ed297d75567c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:30:03 crc kubenswrapper[4930]: I1012 06:30:03.228970 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4867q\" (UniqueName: \"kubernetes.io/projected/5e257f2a-ca74-43c1-bc58-ed297d75567c-kube-api-access-4867q\") on node \"crc\" DevicePath \"\"" Oct 12 06:30:03 crc kubenswrapper[4930]: I1012 06:30:03.229050 4930 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e257f2a-ca74-43c1-bc58-ed297d75567c-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 12 06:30:03 crc kubenswrapper[4930]: I1012 06:30:03.229078 4930 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e257f2a-ca74-43c1-bc58-ed297d75567c-config-volume\") on node \"crc\" DevicePath \"\"" Oct 12 06:30:03 crc kubenswrapper[4930]: I1012 06:30:03.560581 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337510-4lkdj" event={"ID":"5e257f2a-ca74-43c1-bc58-ed297d75567c","Type":"ContainerDied","Data":"3e03a3201eed9925d2af31e2faac06f79c8d06916cdcfa41191a966479cea917"} Oct 12 06:30:03 crc kubenswrapper[4930]: I1012 06:30:03.560906 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e03a3201eed9925d2af31e2faac06f79c8d06916cdcfa41191a966479cea917" Oct 12 06:30:03 crc kubenswrapper[4930]: I1012 06:30:03.560648 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337510-4lkdj" Oct 12 06:30:03 crc kubenswrapper[4930]: I1012 06:30:03.989542 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 12 06:30:03 crc kubenswrapper[4930]: I1012 06:30:03.989935 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e61b566b-95e6-4b55-801e-5db824bd5814" containerName="prometheus" containerID="cri-o://f82954f05556e368bc6165bff1860b8a1559662c84bf6880efb75d0c9b9212ff" gracePeriod=600 Oct 12 06:30:03 crc kubenswrapper[4930]: I1012 06:30:03.990098 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e61b566b-95e6-4b55-801e-5db824bd5814" containerName="config-reloader" containerID="cri-o://c167b25d47d309eb148d4becbc53110fc7657fa84771075ed535e97196965593" gracePeriod=600 Oct 12 06:30:03 crc kubenswrapper[4930]: I1012 06:30:03.990347 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e61b566b-95e6-4b55-801e-5db824bd5814" containerName="thanos-sidecar" containerID="cri-o://674b9b89d3de6274121c17ead64e7b7d2689ff475296e079738918f34652dc44" gracePeriod=600 Oct 12 06:30:04 crc kubenswrapper[4930]: I1012 06:30:04.082804 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337465-t9m4c"] Oct 12 06:30:04 crc kubenswrapper[4930]: I1012 06:30:04.092347 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337465-t9m4c"] Oct 12 06:30:04 crc kubenswrapper[4930]: I1012 06:30:04.105552 4930 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="e61b566b-95e6-4b55-801e-5db824bd5814" containerName="prometheus" probeResult="failure" output="Get 
\"https://10.217.0.133:9090/-/ready\": dial tcp 10.217.0.133:9090: connect: connection refused" Oct 12 06:30:04 crc kubenswrapper[4930]: I1012 06:30:04.154773 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78dfc24e-649b-4183-b547-83ffda66a897" path="/var/lib/kubelet/pods/78dfc24e-649b-4183-b547-83ffda66a897/volumes" Oct 12 06:30:04 crc kubenswrapper[4930]: I1012 06:30:04.577233 4930 generic.go:334] "Generic (PLEG): container finished" podID="e61b566b-95e6-4b55-801e-5db824bd5814" containerID="674b9b89d3de6274121c17ead64e7b7d2689ff475296e079738918f34652dc44" exitCode=0 Oct 12 06:30:04 crc kubenswrapper[4930]: I1012 06:30:04.577283 4930 generic.go:334] "Generic (PLEG): container finished" podID="e61b566b-95e6-4b55-801e-5db824bd5814" containerID="c167b25d47d309eb148d4becbc53110fc7657fa84771075ed535e97196965593" exitCode=0 Oct 12 06:30:04 crc kubenswrapper[4930]: I1012 06:30:04.577300 4930 generic.go:334] "Generic (PLEG): container finished" podID="e61b566b-95e6-4b55-801e-5db824bd5814" containerID="f82954f05556e368bc6165bff1860b8a1559662c84bf6880efb75d0c9b9212ff" exitCode=0 Oct 12 06:30:04 crc kubenswrapper[4930]: I1012 06:30:04.577329 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e61b566b-95e6-4b55-801e-5db824bd5814","Type":"ContainerDied","Data":"674b9b89d3de6274121c17ead64e7b7d2689ff475296e079738918f34652dc44"} Oct 12 06:30:04 crc kubenswrapper[4930]: I1012 06:30:04.577369 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e61b566b-95e6-4b55-801e-5db824bd5814","Type":"ContainerDied","Data":"c167b25d47d309eb148d4becbc53110fc7657fa84771075ed535e97196965593"} Oct 12 06:30:04 crc kubenswrapper[4930]: I1012 06:30:04.577389 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e61b566b-95e6-4b55-801e-5db824bd5814","Type":"ContainerDied","Data":"f82954f05556e368bc6165bff1860b8a1559662c84bf6880efb75d0c9b9212ff"} Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.104371 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.269928 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-config\") pod \"e61b566b-95e6-4b55-801e-5db824bd5814\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.269969 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95tbx\" (UniqueName: \"kubernetes.io/projected/e61b566b-95e6-4b55-801e-5db824bd5814-kube-api-access-95tbx\") pod \"e61b566b-95e6-4b55-801e-5db824bd5814\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.270020 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"e61b566b-95e6-4b55-801e-5db824bd5814\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.270120 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-thanos-prometheus-http-client-file\") pod \"e61b566b-95e6-4b55-801e-5db824bd5814\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.270162 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e61b566b-95e6-4b55-801e-5db824bd5814-prometheus-metric-storage-rulefiles-0\") pod \"e61b566b-95e6-4b55-801e-5db824bd5814\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.270228 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"e61b566b-95e6-4b55-801e-5db824bd5814\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.270252 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e61b566b-95e6-4b55-801e-5db824bd5814-tls-assets\") pod \"e61b566b-95e6-4b55-801e-5db824bd5814\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.270319 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-secret-combined-ca-bundle\") pod \"e61b566b-95e6-4b55-801e-5db824bd5814\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.270407 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e61b566b-95e6-4b55-801e-5db824bd5814-config-out\") pod \"e61b566b-95e6-4b55-801e-5db824bd5814\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " Oct 12 06:30:05 crc 
kubenswrapper[4930]: I1012 06:30:05.270431 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-web-config\") pod \"e61b566b-95e6-4b55-801e-5db824bd5814\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.270681 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22a7639f-0965-4497-b4ff-8d976229c443\") pod \"e61b566b-95e6-4b55-801e-5db824bd5814\" (UID: \"e61b566b-95e6-4b55-801e-5db824bd5814\") " Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.271692 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e61b566b-95e6-4b55-801e-5db824bd5814-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "e61b566b-95e6-4b55-801e-5db824bd5814" (UID: "e61b566b-95e6-4b55-801e-5db824bd5814"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.277565 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e61b566b-95e6-4b55-801e-5db824bd5814-config-out" (OuterVolumeSpecName: "config-out") pod "e61b566b-95e6-4b55-801e-5db824bd5814" (UID: "e61b566b-95e6-4b55-801e-5db824bd5814"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.277649 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e61b566b-95e6-4b55-801e-5db824bd5814-kube-api-access-95tbx" (OuterVolumeSpecName: "kube-api-access-95tbx") pod "e61b566b-95e6-4b55-801e-5db824bd5814" (UID: "e61b566b-95e6-4b55-801e-5db824bd5814"). InnerVolumeSpecName "kube-api-access-95tbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.280175 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-config" (OuterVolumeSpecName: "config") pod "e61b566b-95e6-4b55-801e-5db824bd5814" (UID: "e61b566b-95e6-4b55-801e-5db824bd5814"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.281048 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "e61b566b-95e6-4b55-801e-5db824bd5814" (UID: "e61b566b-95e6-4b55-801e-5db824bd5814"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.292022 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "e61b566b-95e6-4b55-801e-5db824bd5814" (UID: "e61b566b-95e6-4b55-801e-5db824bd5814"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.292082 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "e61b566b-95e6-4b55-801e-5db824bd5814" (UID: "e61b566b-95e6-4b55-801e-5db824bd5814"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.293132 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "e61b566b-95e6-4b55-801e-5db824bd5814" (UID: "e61b566b-95e6-4b55-801e-5db824bd5814"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.293372 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e61b566b-95e6-4b55-801e-5db824bd5814-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "e61b566b-95e6-4b55-801e-5db824bd5814" (UID: "e61b566b-95e6-4b55-801e-5db824bd5814"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.307044 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22a7639f-0965-4497-b4ff-8d976229c443" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "e61b566b-95e6-4b55-801e-5db824bd5814" (UID: "e61b566b-95e6-4b55-801e-5db824bd5814"). InnerVolumeSpecName "pvc-22a7639f-0965-4497-b4ff-8d976229c443". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.373316 4930 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.373369 4930 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.373389 4930 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e61b566b-95e6-4b55-801e-5db824bd5814-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.373414 4930 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.373432 4930 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e61b566b-95e6-4b55-801e-5db824bd5814-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.373449 4930 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.373467 4930 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e61b566b-95e6-4b55-801e-5db824bd5814-config-out\") on node \"crc\" DevicePath \"\"" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.373512 4930 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-22a7639f-0965-4497-b4ff-8d976229c443\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22a7639f-0965-4497-b4ff-8d976229c443\") on node \"crc\" " Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.373526 4930 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-config\") on node \"crc\" DevicePath \"\"" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.373545 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95tbx\" (UniqueName: \"kubernetes.io/projected/e61b566b-95e6-4b55-801e-5db824bd5814-kube-api-access-95tbx\") on node \"crc\" DevicePath \"\"" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.419983 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-web-config" (OuterVolumeSpecName: "web-config") pod "e61b566b-95e6-4b55-801e-5db824bd5814" (UID: "e61b566b-95e6-4b55-801e-5db824bd5814"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.435468 4930 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.435655 4930 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-22a7639f-0965-4497-b4ff-8d976229c443" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22a7639f-0965-4497-b4ff-8d976229c443") on node "crc" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.476073 4930 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e61b566b-95e6-4b55-801e-5db824bd5814-web-config\") on node \"crc\" DevicePath \"\"" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.476112 4930 reconciler_common.go:293] "Volume detached for volume \"pvc-22a7639f-0965-4497-b4ff-8d976229c443\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22a7639f-0965-4497-b4ff-8d976229c443\") on node \"crc\" DevicePath \"\"" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.589696 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e61b566b-95e6-4b55-801e-5db824bd5814","Type":"ContainerDied","Data":"df157038ed553b6eebc5062e237ab86c75f87a280181a46fadc4869d751f3e7e"} Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.589755 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.589793 4930 scope.go:117] "RemoveContainer" containerID="674b9b89d3de6274121c17ead64e7b7d2689ff475296e079738918f34652dc44" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.616946 4930 scope.go:117] "RemoveContainer" containerID="c167b25d47d309eb148d4becbc53110fc7657fa84771075ed535e97196965593" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.628011 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.635365 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.643436 4930 scope.go:117] "RemoveContainer" containerID="f82954f05556e368bc6165bff1860b8a1559662c84bf6880efb75d0c9b9212ff" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.661334 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 12 06:30:05 crc kubenswrapper[4930]: E1012 06:30:05.661957 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e61b566b-95e6-4b55-801e-5db824bd5814" containerName="config-reloader" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.662119 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="e61b566b-95e6-4b55-801e-5db824bd5814" containerName="config-reloader" Oct 12 06:30:05 crc kubenswrapper[4930]: E1012 06:30:05.662186 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e61b566b-95e6-4b55-801e-5db824bd5814" containerName="thanos-sidecar" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.662266 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="e61b566b-95e6-4b55-801e-5db824bd5814" containerName="thanos-sidecar" Oct 12 06:30:05 crc kubenswrapper[4930]: E1012 06:30:05.662330 4930 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="e61b566b-95e6-4b55-801e-5db824bd5814" containerName="init-config-reloader" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.662377 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="e61b566b-95e6-4b55-801e-5db824bd5814" containerName="init-config-reloader" Oct 12 06:30:05 crc kubenswrapper[4930]: E1012 06:30:05.662434 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e61b566b-95e6-4b55-801e-5db824bd5814" containerName="prometheus" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.662484 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="e61b566b-95e6-4b55-801e-5db824bd5814" containerName="prometheus" Oct 12 06:30:05 crc kubenswrapper[4930]: E1012 06:30:05.662553 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e257f2a-ca74-43c1-bc58-ed297d75567c" containerName="collect-profiles" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.662602 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e257f2a-ca74-43c1-bc58-ed297d75567c" containerName="collect-profiles" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.662881 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="e61b566b-95e6-4b55-801e-5db824bd5814" containerName="thanos-sidecar" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.662958 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="e61b566b-95e6-4b55-801e-5db824bd5814" containerName="prometheus" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.663008 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="e61b566b-95e6-4b55-801e-5db824bd5814" containerName="config-reloader" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.663061 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e257f2a-ca74-43c1-bc58-ed297d75567c" containerName="collect-profiles" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.668519 4930 scope.go:117] "RemoveContainer" containerID="e5f53a09911ea7eb99e996b3e8fa5996ff29ee3e390ac9dd75dc650aed71ad43" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.669991 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.673100 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.673617 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.674487 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-jt8z4" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.676910 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.679485 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.698462 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.704628 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.782126 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vjxr\" (UniqueName: \"kubernetes.io/projected/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-kube-api-access-6vjxr\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.782220 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.782281 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.782320 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.782373 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " 
pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.782397 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.782489 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.782625 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-config\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.782683 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-22a7639f-0965-4497-b4ff-8d976229c443\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22a7639f-0965-4497-b4ff-8d976229c443\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.782905 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.782962 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.885424 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vjxr\" (UniqueName: \"kubernetes.io/projected/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-kube-api-access-6vjxr\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.885514 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.885562 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.885610 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.885675 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.885707 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.885782 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.885872 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-config\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.886723 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-22a7639f-0965-4497-b4ff-8d976229c443\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22a7639f-0965-4497-b4ff-8d976229c443\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.886842 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.886933 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: 
\"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.886975 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.891231 4930 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.891277 4930 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-22a7639f-0965-4497-b4ff-8d976229c443\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22a7639f-0965-4497-b4ff-8d976229c443\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6e15b61105eae1f7086da32ef53b808da7b93145612971cfb218d121d2d8a399/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.896367 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-config\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.897903 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.901191 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.902256 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.902615 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.903093 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" 
(UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.903489 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.904292 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.910406 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vjxr\" (UniqueName: \"kubernetes.io/projected/4a86a13a-7ebf-4fba-b066-f4c1ff705ffd-kube-api-access-6vjxr\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:05 crc kubenswrapper[4930]: I1012 06:30:05.976223 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-22a7639f-0965-4497-b4ff-8d976229c443\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22a7639f-0965-4497-b4ff-8d976229c443\") pod \"prometheus-metric-storage-0\" (UID: \"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd\") " pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:06 crc kubenswrapper[4930]: I1012 06:30:06.066978 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:06 crc kubenswrapper[4930]: I1012 06:30:06.150029 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e61b566b-95e6-4b55-801e-5db824bd5814" path="/var/lib/kubelet/pods/e61b566b-95e6-4b55-801e-5db824bd5814/volumes" Oct 12 06:30:06 crc kubenswrapper[4930]: I1012 06:30:06.568093 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 12 06:30:06 crc kubenswrapper[4930]: I1012 06:30:06.603052 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd","Type":"ContainerStarted","Data":"f8df4c0e37933ac4041c8af1c45e777ec4d4681423b34839cbb8c3ab709edb65"} Oct 12 06:30:11 crc kubenswrapper[4930]: I1012 06:30:11.672969 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd","Type":"ContainerStarted","Data":"5649fc5bc2c8567d6f102ac2e0e1b1f4fa0589cab157b0bf9e17ec9e14425825"} Oct 12 06:30:21 crc kubenswrapper[4930]: I1012 06:30:21.812733 4930 generic.go:334] "Generic (PLEG): container finished" podID="4a86a13a-7ebf-4fba-b066-f4c1ff705ffd" containerID="5649fc5bc2c8567d6f102ac2e0e1b1f4fa0589cab157b0bf9e17ec9e14425825" exitCode=0 Oct 12 06:30:21 crc kubenswrapper[4930]: I1012 06:30:21.812768 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd","Type":"ContainerDied","Data":"5649fc5bc2c8567d6f102ac2e0e1b1f4fa0589cab157b0bf9e17ec9e14425825"} Oct 12 06:30:22 crc kubenswrapper[4930]: I1012 06:30:22.831929 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd","Type":"ContainerStarted","Data":"4afb48d838e569978977dfe584f53e56c1c2c2b8584ba57a51946ac26b41f52f"} Oct 12 06:30:26 crc kubenswrapper[4930]: I1012 06:30:26.885642 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd","Type":"ContainerStarted","Data":"9de54c915ae809089bd09012167d8631ed6bdc7a04bc129e240cf3044a0ca520"} Oct 12 06:30:26 crc kubenswrapper[4930]: I1012 06:30:26.886246 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4a86a13a-7ebf-4fba-b066-f4c1ff705ffd","Type":"ContainerStarted","Data":"655d9c29663ec155394ccbced96642071e426530c479f16070a92cf9cc8bda41"} Oct 12 06:30:26 crc kubenswrapper[4930]: I1012 06:30:26.924548 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=21.924526535 podStartE2EDuration="21.924526535s" podCreationTimestamp="2025-10-12 06:30:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 06:30:26.924246938 +0000 UTC m=+2959.466348713" watchObservedRunningTime="2025-10-12 06:30:26.924526535 +0000 UTC m=+2959.466628320" Oct 12 06:30:27 crc kubenswrapper[4930]: I1012 06:30:27.558345 4930 scope.go:117] "RemoveContainer" containerID="50cd0190e3ea9244e0bce44eaecb6bd9ceefed270d2ceb61875f3cc8cccb804a" Oct 12 06:30:31 crc kubenswrapper[4930]: I1012 06:30:31.067638 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 12 
06:30:33 crc kubenswrapper[4930]: I1012 06:30:33.669700 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:30:33 crc kubenswrapper[4930]: I1012 06:30:33.670139 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:30:36 crc kubenswrapper[4930]: I1012 06:30:36.067659 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:36 crc kubenswrapper[4930]: I1012 06:30:36.079532 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:37 crc kubenswrapper[4930]: I1012 06:30:37.062214 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.684302 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.685801 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.688573 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.688704 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.688809 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-mlg5l" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.694097 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.694287 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.785144 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/acf9e824-abb4-4b1c-9925-c7794fafaad4-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.785195 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/acf9e824-abb4-4b1c-9925-c7794fafaad4-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.785345 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.785433 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/acf9e824-abb4-4b1c-9925-c7794fafaad4-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.785542 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm8hk\" (UniqueName: \"kubernetes.io/projected/acf9e824-abb4-4b1c-9925-c7794fafaad4-kube-api-access-pm8hk\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.785590 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/acf9e824-abb4-4b1c-9925-c7794fafaad4-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.785629 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/acf9e824-abb4-4b1c-9925-c7794fafaad4-config-data\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.785727 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acf9e824-abb4-4b1c-9925-c7794fafaad4-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.785854 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/acf9e824-abb4-4b1c-9925-c7794fafaad4-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.887767 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/acf9e824-abb4-4b1c-9925-c7794fafaad4-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.887874 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/acf9e824-abb4-4b1c-9925-c7794fafaad4-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.887899 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/acf9e824-abb4-4b1c-9925-c7794fafaad4-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.887930 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.887958 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/acf9e824-abb4-4b1c-9925-c7794fafaad4-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.887994 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm8hk\" (UniqueName: \"kubernetes.io/projected/acf9e824-abb4-4b1c-9925-c7794fafaad4-kube-api-access-pm8hk\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.888015 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/acf9e824-abb4-4b1c-9925-c7794fafaad4-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.888036 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/acf9e824-abb4-4b1c-9925-c7794fafaad4-config-data\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.888072 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acf9e824-abb4-4b1c-9925-c7794fafaad4-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.888608 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/acf9e824-abb4-4b1c-9925-c7794fafaad4-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.888766 4930 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.888769 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/acf9e824-abb4-4b1c-9925-c7794fafaad4-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: 
\"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.889791 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/acf9e824-abb4-4b1c-9925-c7794fafaad4-config-data\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.890094 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/acf9e824-abb4-4b1c-9925-c7794fafaad4-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.895477 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/acf9e824-abb4-4b1c-9925-c7794fafaad4-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.895577 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/acf9e824-abb4-4b1c-9925-c7794fafaad4-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.898254 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acf9e824-abb4-4b1c-9925-c7794fafaad4-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.915613 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm8hk\" (UniqueName: \"kubernetes.io/projected/acf9e824-abb4-4b1c-9925-c7794fafaad4-kube-api-access-pm8hk\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:52 crc kubenswrapper[4930]: I1012 06:30:52.953207 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " pod="openstack/tempest-tests-tempest" Oct 12 06:30:53 crc kubenswrapper[4930]: I1012 06:30:53.007194 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 12 06:30:53 crc kubenswrapper[4930]: I1012 06:30:53.491824 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 12 06:30:54 crc kubenswrapper[4930]: I1012 06:30:54.247493 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"acf9e824-abb4-4b1c-9925-c7794fafaad4","Type":"ContainerStarted","Data":"173b73af5735ed9a4f5b24ea6fe93e888f5a17249fcff85fb84dec639310a3da"} Oct 12 06:31:03 crc kubenswrapper[4930]: I1012 06:31:03.669829 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:31:03 crc kubenswrapper[4930]: I1012 06:31:03.670420 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:31:08 crc kubenswrapper[4930]: I1012 06:31:08.427005 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"acf9e824-abb4-4b1c-9925-c7794fafaad4","Type":"ContainerStarted","Data":"8a7b273e05d7dc912375aecb7500a13d3f402987b770443fe657374bc03c24b5"} Oct 12 06:31:08 crc kubenswrapper[4930]: I1012 06:31:08.459337 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.510272977 podStartE2EDuration="17.459321949s" podCreationTimestamp="2025-10-12 06:30:51 +0000 UTC" firstStartedPulling="2025-10-12 06:30:53.505273705 +0000 UTC m=+2986.047375480" lastFinishedPulling="2025-10-12 06:31:06.454322647 +0000 UTC m=+2998.996424452" observedRunningTime="2025-10-12 06:31:08.457292429 +0000 UTC m=+3000.999394224" watchObservedRunningTime="2025-10-12 06:31:08.459321949 +0000 UTC m=+3001.001423704" Oct 12 06:31:33 crc kubenswrapper[4930]: I1012 06:31:33.669978 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:31:33 crc kubenswrapper[4930]: I1012 06:31:33.672040 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:31:33 crc kubenswrapper[4930]: I1012 06:31:33.672252 4930 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 06:31:33 crc kubenswrapper[4930]: I1012 06:31:33.673469 4930 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0"} pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" containerMessage="Container machine-config-daemon 
failed liveness probe, will be restarted"
Oct 12 06:31:33 crc kubenswrapper[4930]: I1012 06:31:33.673779 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" containerID="cri-o://8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0" gracePeriod=600
Oct 12 06:31:33 crc kubenswrapper[4930]: E1012 06:31:33.806280 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:31:34 crc kubenswrapper[4930]: I1012 06:31:34.758073 4930 generic.go:334] "Generic (PLEG): container finished" podID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0" exitCode=0
Oct 12 06:31:34 crc kubenswrapper[4930]: I1012 06:31:34.759089 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerDied","Data":"8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0"}
Oct 12 06:31:34 crc kubenswrapper[4930]: I1012 06:31:34.759175 4930 scope.go:117] "RemoveContainer" containerID="9ab3f87434347d32cda2544985e1ab8e81d9813b575b68284cd0ffe0db076ad9"
Oct 12 06:31:34 crc kubenswrapper[4930]: I1012 06:31:34.759901 4930 scope.go:117] "RemoveContainer" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0"
Oct 12 06:31:34 crc kubenswrapper[4930]: E1012 06:31:34.760251 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:31:50 crc kubenswrapper[4930]: I1012 06:31:50.135188 4930 scope.go:117] "RemoveContainer" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0"
Oct 12 06:31:50 crc kubenswrapper[4930]: E1012 06:31:50.135988 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:32:01 crc kubenswrapper[4930]: I1012 06:32:01.136171 4930 scope.go:117] "RemoveContainer" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0"
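The exchange above is the kubelet's standard handling of a failed liveness probe: the container is killed with its termination grace period (gracePeriod=600), the PLEG reports ContainerDied, and every subsequent sync attempt is rejected with CrashLoopBackOff until the back-off window expires. Kubernetes restarts crash-looping containers with an exponential delay that starts at 10s, doubles per failed restart, and is capped at five minutes, which is the "back-off 5m0s" quoted in these entries; the daemon is finally restarted at 06:36:43, roughly five minutes after the kill. A minimal sketch of that capped doubling, with the initial delay and cap taken from upstream kubelet defaults rather than from this log:

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	// Upstream kubelet defaults: back-off starts at 10s, doubles per failed
    	// restart, and is capped at 5m ("back-off 5m0s" in the entries above).
    	const maxDelay = 5 * time.Minute
    	delay := 10 * time.Second
    	for attempt := 1; attempt <= 8; attempt++ {
    		fmt.Printf("restart attempt %d delayed by %v\n", attempt, delay)
    		delay *= 2
    		if delay > maxDelay {
    			delay = maxDelay
    		}
    	}
    }

Oct 12 06:32:01 crc kubenswrapper[4930]: E1012 06:32:01.137392 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 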
pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:32:12 crc kubenswrapper[4930]: I1012 06:32:12.136429 4930 scope.go:117] "RemoveContainer" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0" Oct 12 06:32:12 crc kubenswrapper[4930]: E1012 06:32:12.137269 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:32:27 crc kubenswrapper[4930]: I1012 06:32:27.135889 4930 scope.go:117] "RemoveContainer" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0" Oct 12 06:32:27 crc kubenswrapper[4930]: E1012 06:32:27.136594 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:32:38 crc kubenswrapper[4930]: I1012 06:32:38.142000 4930 scope.go:117] "RemoveContainer" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0" Oct 12 06:32:38 crc kubenswrapper[4930]: E1012 06:32:38.145209 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:32:51 crc kubenswrapper[4930]: I1012 06:32:51.135545 4930 scope.go:117] "RemoveContainer" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0" Oct 12 06:32:51 crc kubenswrapper[4930]: E1012 06:32:51.136518 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:33:02 crc kubenswrapper[4930]: I1012 06:33:02.135237 4930 scope.go:117] "RemoveContainer" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0" Oct 12 06:33:02 crc kubenswrapper[4930]: E1012 06:33:02.135972 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" 
podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:33:13 crc kubenswrapper[4930]: I1012 06:33:13.135304 4930 scope.go:117] "RemoveContainer" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0" Oct 12 06:33:13 crc kubenswrapper[4930]: E1012 06:33:13.137109 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:33:28 crc kubenswrapper[4930]: I1012 06:33:28.161874 4930 scope.go:117] "RemoveContainer" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0" Oct 12 06:33:28 crc kubenswrapper[4930]: E1012 06:33:28.163055 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:33:42 crc kubenswrapper[4930]: I1012 06:33:42.136641 4930 scope.go:117] "RemoveContainer" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0" Oct 12 06:33:42 crc kubenswrapper[4930]: E1012 06:33:42.137632 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:33:54 crc kubenswrapper[4930]: I1012 06:33:54.911763 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vzm4n"] Oct 12 06:33:54 crc kubenswrapper[4930]: I1012 06:33:54.917553 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vzm4n" Oct 12 06:33:54 crc kubenswrapper[4930]: I1012 06:33:54.936693 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vzm4n"] Oct 12 06:33:55 crc kubenswrapper[4930]: I1012 06:33:55.028975 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf840d74-04a1-45c3-b820-0556fe336c41-utilities\") pod \"community-operators-vzm4n\" (UID: \"cf840d74-04a1-45c3-b820-0556fe336c41\") " pod="openshift-marketplace/community-operators-vzm4n" Oct 12 06:33:55 crc kubenswrapper[4930]: I1012 06:33:55.029257 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf840d74-04a1-45c3-b820-0556fe336c41-catalog-content\") pod \"community-operators-vzm4n\" (UID: \"cf840d74-04a1-45c3-b820-0556fe336c41\") " pod="openshift-marketplace/community-operators-vzm4n" Oct 12 06:33:55 crc kubenswrapper[4930]: I1012 06:33:55.029303 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9t2w\" (UniqueName: \"kubernetes.io/projected/cf840d74-04a1-45c3-b820-0556fe336c41-kube-api-access-s9t2w\") pod \"community-operators-vzm4n\" (UID: \"cf840d74-04a1-45c3-b820-0556fe336c41\") " pod="openshift-marketplace/community-operators-vzm4n" Oct 12 06:33:55 crc kubenswrapper[4930]: I1012 06:33:55.130541 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf840d74-04a1-45c3-b820-0556fe336c41-utilities\") pod \"community-operators-vzm4n\" (UID: \"cf840d74-04a1-45c3-b820-0556fe336c41\") " pod="openshift-marketplace/community-operators-vzm4n" Oct 12 06:33:55 crc kubenswrapper[4930]: I1012 06:33:55.130782 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf840d74-04a1-45c3-b820-0556fe336c41-catalog-content\") pod \"community-operators-vzm4n\" (UID: \"cf840d74-04a1-45c3-b820-0556fe336c41\") " pod="openshift-marketplace/community-operators-vzm4n" Oct 12 06:33:55 crc kubenswrapper[4930]: I1012 06:33:55.130830 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9t2w\" (UniqueName: \"kubernetes.io/projected/cf840d74-04a1-45c3-b820-0556fe336c41-kube-api-access-s9t2w\") pod \"community-operators-vzm4n\" (UID: \"cf840d74-04a1-45c3-b820-0556fe336c41\") " pod="openshift-marketplace/community-operators-vzm4n" Oct 12 06:33:55 crc kubenswrapper[4930]: I1012 06:33:55.131084 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf840d74-04a1-45c3-b820-0556fe336c41-utilities\") pod \"community-operators-vzm4n\" (UID: \"cf840d74-04a1-45c3-b820-0556fe336c41\") " pod="openshift-marketplace/community-operators-vzm4n" Oct 12 06:33:55 crc kubenswrapper[4930]: I1012 06:33:55.131444 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf840d74-04a1-45c3-b820-0556fe336c41-catalog-content\") pod \"community-operators-vzm4n\" (UID: \"cf840d74-04a1-45c3-b820-0556fe336c41\") " pod="openshift-marketplace/community-operators-vzm4n" Oct 12 06:33:55 crc kubenswrapper[4930]: I1012 06:33:55.169311 4930 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-s9t2w\" (UniqueName: \"kubernetes.io/projected/cf840d74-04a1-45c3-b820-0556fe336c41-kube-api-access-s9t2w\") pod \"community-operators-vzm4n\" (UID: \"cf840d74-04a1-45c3-b820-0556fe336c41\") " pod="openshift-marketplace/community-operators-vzm4n" Oct 12 06:33:55 crc kubenswrapper[4930]: I1012 06:33:55.247475 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vzm4n" Oct 12 06:33:55 crc kubenswrapper[4930]: I1012 06:33:55.823538 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vzm4n"] Oct 12 06:33:56 crc kubenswrapper[4930]: I1012 06:33:56.507316 4930 generic.go:334] "Generic (PLEG): container finished" podID="cf840d74-04a1-45c3-b820-0556fe336c41" containerID="f4babd6a5b584066f54ed82c6a425cb6ac680876b49513e71a100f93105466a9" exitCode=0 Oct 12 06:33:56 crc kubenswrapper[4930]: I1012 06:33:56.507454 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzm4n" event={"ID":"cf840d74-04a1-45c3-b820-0556fe336c41","Type":"ContainerDied","Data":"f4babd6a5b584066f54ed82c6a425cb6ac680876b49513e71a100f93105466a9"} Oct 12 06:33:56 crc kubenswrapper[4930]: I1012 06:33:56.507845 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzm4n" event={"ID":"cf840d74-04a1-45c3-b820-0556fe336c41","Type":"ContainerStarted","Data":"f10e2973d04ebd1042bf7edcd18de3c7e64a14ea7476e4a50a3398933b2a2fc3"} Oct 12 06:33:56 crc kubenswrapper[4930]: I1012 06:33:56.510392 4930 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 06:33:57 crc kubenswrapper[4930]: I1012 06:33:57.136019 4930 scope.go:117] "RemoveContainer" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0" Oct 12 06:33:57 crc kubenswrapper[4930]: E1012 06:33:57.136549 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:33:57 crc kubenswrapper[4930]: I1012 06:33:57.522354 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzm4n" event={"ID":"cf840d74-04a1-45c3-b820-0556fe336c41","Type":"ContainerStarted","Data":"4d98bdc8fd583201ab1584aac7917519706ec78f03690d51a889e04766922a12"} Oct 12 06:33:59 crc kubenswrapper[4930]: I1012 06:33:59.551552 4930 generic.go:334] "Generic (PLEG): container finished" podID="cf840d74-04a1-45c3-b820-0556fe336c41" containerID="4d98bdc8fd583201ab1584aac7917519706ec78f03690d51a889e04766922a12" exitCode=0 Oct 12 06:33:59 crc kubenswrapper[4930]: I1012 06:33:59.551891 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzm4n" event={"ID":"cf840d74-04a1-45c3-b820-0556fe336c41","Type":"ContainerDied","Data":"4d98bdc8fd583201ab1584aac7917519706ec78f03690d51a889e04766922a12"} Oct 12 06:34:00 crc kubenswrapper[4930]: I1012 06:34:00.564602 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzm4n" 
event={"ID":"cf840d74-04a1-45c3-b820-0556fe336c41","Type":"ContainerStarted","Data":"6d790801c711347685728fcda0eae5976e147d0dbfd0f25723ff64a25104dc95"} Oct 12 06:34:00 crc kubenswrapper[4930]: I1012 06:34:00.613030 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vzm4n" podStartSLOduration=3.177635713 podStartE2EDuration="6.613013143s" podCreationTimestamp="2025-10-12 06:33:54 +0000 UTC" firstStartedPulling="2025-10-12 06:33:56.510026937 +0000 UTC m=+3169.052128722" lastFinishedPulling="2025-10-12 06:33:59.945404387 +0000 UTC m=+3172.487506152" observedRunningTime="2025-10-12 06:34:00.610253844 +0000 UTC m=+3173.152355619" watchObservedRunningTime="2025-10-12 06:34:00.613013143 +0000 UTC m=+3173.155114928" Oct 12 06:34:05 crc kubenswrapper[4930]: I1012 06:34:05.249129 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vzm4n" Oct 12 06:34:05 crc kubenswrapper[4930]: I1012 06:34:05.249970 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vzm4n" Oct 12 06:34:05 crc kubenswrapper[4930]: I1012 06:34:05.313541 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vzm4n" Oct 12 06:34:05 crc kubenswrapper[4930]: I1012 06:34:05.675174 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vzm4n" Oct 12 06:34:05 crc kubenswrapper[4930]: I1012 06:34:05.742334 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vzm4n"] Oct 12 06:34:07 crc kubenswrapper[4930]: I1012 06:34:07.642927 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vzm4n" podUID="cf840d74-04a1-45c3-b820-0556fe336c41" containerName="registry-server" containerID="cri-o://6d790801c711347685728fcda0eae5976e147d0dbfd0f25723ff64a25104dc95" gracePeriod=2 Oct 12 06:34:08 crc kubenswrapper[4930]: I1012 06:34:08.236105 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vzm4n" Oct 12 06:34:08 crc kubenswrapper[4930]: I1012 06:34:08.419080 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf840d74-04a1-45c3-b820-0556fe336c41-utilities\") pod \"cf840d74-04a1-45c3-b820-0556fe336c41\" (UID: \"cf840d74-04a1-45c3-b820-0556fe336c41\") " Oct 12 06:34:08 crc kubenswrapper[4930]: I1012 06:34:08.419128 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9t2w\" (UniqueName: \"kubernetes.io/projected/cf840d74-04a1-45c3-b820-0556fe336c41-kube-api-access-s9t2w\") pod \"cf840d74-04a1-45c3-b820-0556fe336c41\" (UID: \"cf840d74-04a1-45c3-b820-0556fe336c41\") " Oct 12 06:34:08 crc kubenswrapper[4930]: I1012 06:34:08.419221 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf840d74-04a1-45c3-b820-0556fe336c41-catalog-content\") pod \"cf840d74-04a1-45c3-b820-0556fe336c41\" (UID: \"cf840d74-04a1-45c3-b820-0556fe336c41\") " Oct 12 06:34:08 crc kubenswrapper[4930]: I1012 06:34:08.420550 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf840d74-04a1-45c3-b820-0556fe336c41-utilities" (OuterVolumeSpecName: "utilities") pod "cf840d74-04a1-45c3-b820-0556fe336c41" (UID: "cf840d74-04a1-45c3-b820-0556fe336c41"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:34:08 crc kubenswrapper[4930]: I1012 06:34:08.430197 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf840d74-04a1-45c3-b820-0556fe336c41-kube-api-access-s9t2w" (OuterVolumeSpecName: "kube-api-access-s9t2w") pod "cf840d74-04a1-45c3-b820-0556fe336c41" (UID: "cf840d74-04a1-45c3-b820-0556fe336c41"). InnerVolumeSpecName "kube-api-access-s9t2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:34:08 crc kubenswrapper[4930]: I1012 06:34:08.467408 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf840d74-04a1-45c3-b820-0556fe336c41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf840d74-04a1-45c3-b820-0556fe336c41" (UID: "cf840d74-04a1-45c3-b820-0556fe336c41"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:34:08 crc kubenswrapper[4930]: I1012 06:34:08.523035 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf840d74-04a1-45c3-b820-0556fe336c41-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 06:34:08 crc kubenswrapper[4930]: I1012 06:34:08.523371 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9t2w\" (UniqueName: \"kubernetes.io/projected/cf840d74-04a1-45c3-b820-0556fe336c41-kube-api-access-s9t2w\") on node \"crc\" DevicePath \"\"" Oct 12 06:34:08 crc kubenswrapper[4930]: I1012 06:34:08.523517 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf840d74-04a1-45c3-b820-0556fe336c41-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 06:34:08 crc kubenswrapper[4930]: I1012 06:34:08.653583 4930 generic.go:334] "Generic (PLEG): container finished" podID="cf840d74-04a1-45c3-b820-0556fe336c41" containerID="6d790801c711347685728fcda0eae5976e147d0dbfd0f25723ff64a25104dc95" exitCode=0 Oct 12 06:34:08 crc kubenswrapper[4930]: I1012 06:34:08.653655 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzm4n" event={"ID":"cf840d74-04a1-45c3-b820-0556fe336c41","Type":"ContainerDied","Data":"6d790801c711347685728fcda0eae5976e147d0dbfd0f25723ff64a25104dc95"} Oct 12 06:34:08 crc kubenswrapper[4930]: I1012 06:34:08.653698 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzm4n" event={"ID":"cf840d74-04a1-45c3-b820-0556fe336c41","Type":"ContainerDied","Data":"f10e2973d04ebd1042bf7edcd18de3c7e64a14ea7476e4a50a3398933b2a2fc3"} Oct 12 06:34:08 crc kubenswrapper[4930]: I1012 06:34:08.653773 4930 scope.go:117] "RemoveContainer" containerID="6d790801c711347685728fcda0eae5976e147d0dbfd0f25723ff64a25104dc95" Oct 12 06:34:08 crc kubenswrapper[4930]: I1012 06:34:08.653808 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vzm4n" Oct 12 06:34:08 crc kubenswrapper[4930]: I1012 06:34:08.677789 4930 scope.go:117] "RemoveContainer" containerID="4d98bdc8fd583201ab1584aac7917519706ec78f03690d51a889e04766922a12" Oct 12 06:34:08 crc kubenswrapper[4930]: I1012 06:34:08.716326 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vzm4n"] Oct 12 06:34:08 crc kubenswrapper[4930]: I1012 06:34:08.732172 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vzm4n"] Oct 12 06:34:08 crc kubenswrapper[4930]: I1012 06:34:08.733999 4930 scope.go:117] "RemoveContainer" containerID="f4babd6a5b584066f54ed82c6a425cb6ac680876b49513e71a100f93105466a9" Oct 12 06:34:08 crc kubenswrapper[4930]: I1012 06:34:08.776210 4930 scope.go:117] "RemoveContainer" containerID="6d790801c711347685728fcda0eae5976e147d0dbfd0f25723ff64a25104dc95" Oct 12 06:34:08 crc kubenswrapper[4930]: E1012 06:34:08.776687 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d790801c711347685728fcda0eae5976e147d0dbfd0f25723ff64a25104dc95\": container with ID starting with 6d790801c711347685728fcda0eae5976e147d0dbfd0f25723ff64a25104dc95 not found: ID does not exist" containerID="6d790801c711347685728fcda0eae5976e147d0dbfd0f25723ff64a25104dc95" Oct 12 06:34:08 crc kubenswrapper[4930]: I1012 06:34:08.776800 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d790801c711347685728fcda0eae5976e147d0dbfd0f25723ff64a25104dc95"} err="failed to get container status \"6d790801c711347685728fcda0eae5976e147d0dbfd0f25723ff64a25104dc95\": rpc error: code = NotFound desc = could not find container \"6d790801c711347685728fcda0eae5976e147d0dbfd0f25723ff64a25104dc95\": container with ID starting with 6d790801c711347685728fcda0eae5976e147d0dbfd0f25723ff64a25104dc95 not found: ID does not exist" Oct 12 06:34:08 crc kubenswrapper[4930]: I1012 06:34:08.776862 4930 scope.go:117] "RemoveContainer" containerID="4d98bdc8fd583201ab1584aac7917519706ec78f03690d51a889e04766922a12" Oct 12 06:34:08 crc kubenswrapper[4930]: E1012 06:34:08.777246 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d98bdc8fd583201ab1584aac7917519706ec78f03690d51a889e04766922a12\": container with ID starting with 4d98bdc8fd583201ab1584aac7917519706ec78f03690d51a889e04766922a12 not found: ID does not exist" containerID="4d98bdc8fd583201ab1584aac7917519706ec78f03690d51a889e04766922a12" Oct 12 06:34:08 crc kubenswrapper[4930]: I1012 06:34:08.777278 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d98bdc8fd583201ab1584aac7917519706ec78f03690d51a889e04766922a12"} err="failed to get container status \"4d98bdc8fd583201ab1584aac7917519706ec78f03690d51a889e04766922a12\": rpc error: code = NotFound desc = could not find container \"4d98bdc8fd583201ab1584aac7917519706ec78f03690d51a889e04766922a12\": container with ID starting with 4d98bdc8fd583201ab1584aac7917519706ec78f03690d51a889e04766922a12 not found: ID does not exist" Oct 12 06:34:08 crc kubenswrapper[4930]: I1012 06:34:08.777302 4930 scope.go:117] "RemoveContainer" containerID="f4babd6a5b584066f54ed82c6a425cb6ac680876b49513e71a100f93105466a9" Oct 12 06:34:08 crc kubenswrapper[4930]: E1012 06:34:08.777533 4930 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f4babd6a5b584066f54ed82c6a425cb6ac680876b49513e71a100f93105466a9\": container with ID starting with f4babd6a5b584066f54ed82c6a425cb6ac680876b49513e71a100f93105466a9 not found: ID does not exist" containerID="f4babd6a5b584066f54ed82c6a425cb6ac680876b49513e71a100f93105466a9" Oct 12 06:34:08 crc kubenswrapper[4930]: I1012 06:34:08.777557 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4babd6a5b584066f54ed82c6a425cb6ac680876b49513e71a100f93105466a9"} err="failed to get container status \"f4babd6a5b584066f54ed82c6a425cb6ac680876b49513e71a100f93105466a9\": rpc error: code = NotFound desc = could not find container \"f4babd6a5b584066f54ed82c6a425cb6ac680876b49513e71a100f93105466a9\": container with ID starting with f4babd6a5b584066f54ed82c6a425cb6ac680876b49513e71a100f93105466a9 not found: ID does not exist" Oct 12 06:34:10 crc kubenswrapper[4930]: I1012 06:34:10.157152 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf840d74-04a1-45c3-b820-0556fe336c41" path="/var/lib/kubelet/pods/cf840d74-04a1-45c3-b820-0556fe336c41/volumes" Oct 12 06:34:11 crc kubenswrapper[4930]: I1012 06:34:11.135413 4930 scope.go:117] "RemoveContainer" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0" Oct 12 06:34:11 crc kubenswrapper[4930]: E1012 06:34:11.136027 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:34:25 crc kubenswrapper[4930]: I1012 06:34:25.136545 4930 scope.go:117] "RemoveContainer" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0" Oct 12 06:34:25 crc kubenswrapper[4930]: E1012 06:34:25.137874 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:34:38 crc kubenswrapper[4930]: I1012 06:34:38.140798 4930 scope.go:117] "RemoveContainer" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0" Oct 12 06:34:38 crc kubenswrapper[4930]: E1012 06:34:38.142893 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:34:52 crc kubenswrapper[4930]: I1012 06:34:52.136040 4930 scope.go:117] "RemoveContainer" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0" Oct 12 06:34:52 crc kubenswrapper[4930]: E1012 06:34:52.137177 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:35:06 crc kubenswrapper[4930]: I1012 06:35:06.136437 4930 scope.go:117] "RemoveContainer" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0" Oct 12 06:35:06 crc kubenswrapper[4930]: E1012 06:35:06.138124 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:35:21 crc kubenswrapper[4930]: I1012 06:35:21.135047 4930 scope.go:117] "RemoveContainer" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0" Oct 12 06:35:21 crc kubenswrapper[4930]: E1012 06:35:21.136907 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:35:33 crc kubenswrapper[4930]: I1012 06:35:33.136502 4930 scope.go:117] "RemoveContainer" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0" Oct 12 06:35:33 crc kubenswrapper[4930]: E1012 06:35:33.137870 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:35:45 crc kubenswrapper[4930]: I1012 06:35:45.136273 4930 scope.go:117] "RemoveContainer" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0" Oct 12 06:35:45 crc kubenswrapper[4930]: E1012 06:35:45.137402 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:35:59 crc kubenswrapper[4930]: I1012 06:35:59.136130 4930 scope.go:117] "RemoveContainer" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0" Oct 12 06:35:59 crc kubenswrapper[4930]: E1012 06:35:59.137273 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:36:14 crc kubenswrapper[4930]: I1012 06:36:14.135429 4930 scope.go:117] "RemoveContainer" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0" Oct 12 06:36:14 crc kubenswrapper[4930]: E1012 06:36:14.136356 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:36:28 crc kubenswrapper[4930]: I1012 06:36:28.142285 4930 scope.go:117] "RemoveContainer" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0" Oct 12 06:36:28 crc kubenswrapper[4930]: E1012 06:36:28.143384 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:36:43 crc kubenswrapper[4930]: I1012 06:36:43.136785 4930 scope.go:117] "RemoveContainer" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0" Oct 12 06:36:43 crc kubenswrapper[4930]: I1012 06:36:43.711820 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerStarted","Data":"d151d8c371342d4130e54ee25c023bdb58521d35b6a35844b76618c1e64a97fb"} Oct 12 06:37:36 crc kubenswrapper[4930]: I1012 06:37:36.746926 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tvfmm"] Oct 12 06:37:36 crc kubenswrapper[4930]: E1012 06:37:36.748154 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf840d74-04a1-45c3-b820-0556fe336c41" containerName="registry-server" Oct 12 06:37:36 crc kubenswrapper[4930]: I1012 06:37:36.748176 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf840d74-04a1-45c3-b820-0556fe336c41" containerName="registry-server" Oct 12 06:37:36 crc kubenswrapper[4930]: E1012 06:37:36.748235 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf840d74-04a1-45c3-b820-0556fe336c41" containerName="extract-content" Oct 12 06:37:36 crc kubenswrapper[4930]: I1012 06:37:36.748248 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf840d74-04a1-45c3-b820-0556fe336c41" containerName="extract-content" Oct 12 06:37:36 crc kubenswrapper[4930]: E1012 06:37:36.748275 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf840d74-04a1-45c3-b820-0556fe336c41" containerName="extract-utilities" Oct 12 06:37:36 crc kubenswrapper[4930]: I1012 06:37:36.748289 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf840d74-04a1-45c3-b820-0556fe336c41" containerName="extract-utilities" Oct 12 06:37:36 crc kubenswrapper[4930]: I1012 06:37:36.748650 4930 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="cf840d74-04a1-45c3-b820-0556fe336c41" containerName="registry-server" Oct 12 06:37:36 crc kubenswrapper[4930]: I1012 06:37:36.751144 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tvfmm" Oct 12 06:37:36 crc kubenswrapper[4930]: I1012 06:37:36.758263 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tvfmm"] Oct 12 06:37:36 crc kubenswrapper[4930]: I1012 06:37:36.937093 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72f0413a-ee99-40ce-b811-e8b094b86c5f-catalog-content\") pod \"redhat-operators-tvfmm\" (UID: \"72f0413a-ee99-40ce-b811-e8b094b86c5f\") " pod="openshift-marketplace/redhat-operators-tvfmm" Oct 12 06:37:36 crc kubenswrapper[4930]: I1012 06:37:36.937864 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72f0413a-ee99-40ce-b811-e8b094b86c5f-utilities\") pod \"redhat-operators-tvfmm\" (UID: \"72f0413a-ee99-40ce-b811-e8b094b86c5f\") " pod="openshift-marketplace/redhat-operators-tvfmm" Oct 12 06:37:36 crc kubenswrapper[4930]: I1012 06:37:36.938106 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d679w\" (UniqueName: \"kubernetes.io/projected/72f0413a-ee99-40ce-b811-e8b094b86c5f-kube-api-access-d679w\") pod \"redhat-operators-tvfmm\" (UID: \"72f0413a-ee99-40ce-b811-e8b094b86c5f\") " pod="openshift-marketplace/redhat-operators-tvfmm" Oct 12 06:37:37 crc kubenswrapper[4930]: I1012 06:37:37.040002 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72f0413a-ee99-40ce-b811-e8b094b86c5f-catalog-content\") pod \"redhat-operators-tvfmm\" (UID: \"72f0413a-ee99-40ce-b811-e8b094b86c5f\") " pod="openshift-marketplace/redhat-operators-tvfmm" Oct 12 06:37:37 crc kubenswrapper[4930]: I1012 06:37:37.040068 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72f0413a-ee99-40ce-b811-e8b094b86c5f-utilities\") pod \"redhat-operators-tvfmm\" (UID: \"72f0413a-ee99-40ce-b811-e8b094b86c5f\") " pod="openshift-marketplace/redhat-operators-tvfmm" Oct 12 06:37:37 crc kubenswrapper[4930]: I1012 06:37:37.040110 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d679w\" (UniqueName: \"kubernetes.io/projected/72f0413a-ee99-40ce-b811-e8b094b86c5f-kube-api-access-d679w\") pod \"redhat-operators-tvfmm\" (UID: \"72f0413a-ee99-40ce-b811-e8b094b86c5f\") " pod="openshift-marketplace/redhat-operators-tvfmm" Oct 12 06:37:37 crc kubenswrapper[4930]: I1012 06:37:37.040860 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72f0413a-ee99-40ce-b811-e8b094b86c5f-catalog-content\") pod \"redhat-operators-tvfmm\" (UID: \"72f0413a-ee99-40ce-b811-e8b094b86c5f\") " pod="openshift-marketplace/redhat-operators-tvfmm" Oct 12 06:37:37 crc kubenswrapper[4930]: I1012 06:37:37.040935 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72f0413a-ee99-40ce-b811-e8b094b86c5f-utilities\") pod \"redhat-operators-tvfmm\" (UID: 
\"72f0413a-ee99-40ce-b811-e8b094b86c5f\") " pod="openshift-marketplace/redhat-operators-tvfmm" Oct 12 06:37:37 crc kubenswrapper[4930]: I1012 06:37:37.065864 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d679w\" (UniqueName: \"kubernetes.io/projected/72f0413a-ee99-40ce-b811-e8b094b86c5f-kube-api-access-d679w\") pod \"redhat-operators-tvfmm\" (UID: \"72f0413a-ee99-40ce-b811-e8b094b86c5f\") " pod="openshift-marketplace/redhat-operators-tvfmm" Oct 12 06:37:37 crc kubenswrapper[4930]: I1012 06:37:37.084543 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tvfmm" Oct 12 06:37:37 crc kubenswrapper[4930]: I1012 06:37:37.592917 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tvfmm"] Oct 12 06:37:38 crc kubenswrapper[4930]: I1012 06:37:38.473105 4930 generic.go:334] "Generic (PLEG): container finished" podID="72f0413a-ee99-40ce-b811-e8b094b86c5f" containerID="156c36ad7fb2e5a94c585f3f36b6094f0046db1672066e7f732af81c7f4531cc" exitCode=0 Oct 12 06:37:38 crc kubenswrapper[4930]: I1012 06:37:38.473161 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tvfmm" event={"ID":"72f0413a-ee99-40ce-b811-e8b094b86c5f","Type":"ContainerDied","Data":"156c36ad7fb2e5a94c585f3f36b6094f0046db1672066e7f732af81c7f4531cc"} Oct 12 06:37:38 crc kubenswrapper[4930]: I1012 06:37:38.473513 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tvfmm" event={"ID":"72f0413a-ee99-40ce-b811-e8b094b86c5f","Type":"ContainerStarted","Data":"c65d866e87a0b0b84860effc6f7797f8db08f74dc3f0f927a2ca5454a6ff2c12"} Oct 12 06:37:39 crc kubenswrapper[4930]: I1012 06:37:39.497164 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tvfmm" event={"ID":"72f0413a-ee99-40ce-b811-e8b094b86c5f","Type":"ContainerStarted","Data":"bed68e0dfdb043c19daa43489c2a81cd3e2aea4b498750c6c4ca006b40150d44"} Oct 12 06:37:43 crc kubenswrapper[4930]: I1012 06:37:43.538556 4930 generic.go:334] "Generic (PLEG): container finished" podID="72f0413a-ee99-40ce-b811-e8b094b86c5f" containerID="bed68e0dfdb043c19daa43489c2a81cd3e2aea4b498750c6c4ca006b40150d44" exitCode=0 Oct 12 06:37:43 crc kubenswrapper[4930]: I1012 06:37:43.538633 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tvfmm" event={"ID":"72f0413a-ee99-40ce-b811-e8b094b86c5f","Type":"ContainerDied","Data":"bed68e0dfdb043c19daa43489c2a81cd3e2aea4b498750c6c4ca006b40150d44"} Oct 12 06:37:44 crc kubenswrapper[4930]: I1012 06:37:44.554924 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tvfmm" event={"ID":"72f0413a-ee99-40ce-b811-e8b094b86c5f","Type":"ContainerStarted","Data":"f5ea971c8122cd2539ab0446f8f3e84fd00f968d58de020685abb6e1be763961"} Oct 12 06:37:44 crc kubenswrapper[4930]: I1012 06:37:44.584694 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tvfmm" podStartSLOduration=3.141315012 podStartE2EDuration="8.584664496s" podCreationTimestamp="2025-10-12 06:37:36 +0000 UTC" firstStartedPulling="2025-10-12 06:37:38.476640169 +0000 UTC m=+3391.018741974" lastFinishedPulling="2025-10-12 06:37:43.919989693 +0000 UTC m=+3396.462091458" observedRunningTime="2025-10-12 06:37:44.581926118 +0000 UTC m=+3397.124027913" watchObservedRunningTime="2025-10-12 
06:37:44.584664496 +0000 UTC m=+3397.126766301" Oct 12 06:37:47 crc kubenswrapper[4930]: I1012 06:37:47.085332 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tvfmm" Oct 12 06:37:47 crc kubenswrapper[4930]: I1012 06:37:47.085711 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tvfmm" Oct 12 06:37:48 crc kubenswrapper[4930]: I1012 06:37:48.147563 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tvfmm" podUID="72f0413a-ee99-40ce-b811-e8b094b86c5f" containerName="registry-server" probeResult="failure" output=< Oct 12 06:37:48 crc kubenswrapper[4930]: timeout: failed to connect service ":50051" within 1s Oct 12 06:37:48 crc kubenswrapper[4930]: > Oct 12 06:37:57 crc kubenswrapper[4930]: I1012 06:37:57.131908 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tvfmm" Oct 12 06:37:57 crc kubenswrapper[4930]: I1012 06:37:57.202376 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tvfmm" Oct 12 06:37:57 crc kubenswrapper[4930]: I1012 06:37:57.366183 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tvfmm"] Oct 12 06:37:58 crc kubenswrapper[4930]: I1012 06:37:58.712459 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tvfmm" podUID="72f0413a-ee99-40ce-b811-e8b094b86c5f" containerName="registry-server" containerID="cri-o://f5ea971c8122cd2539ab0446f8f3e84fd00f968d58de020685abb6e1be763961" gracePeriod=2 Oct 12 06:37:59 crc kubenswrapper[4930]: I1012 06:37:59.213788 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tvfmm" Oct 12 06:37:59 crc kubenswrapper[4930]: I1012 06:37:59.229156 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d679w\" (UniqueName: \"kubernetes.io/projected/72f0413a-ee99-40ce-b811-e8b094b86c5f-kube-api-access-d679w\") pod \"72f0413a-ee99-40ce-b811-e8b094b86c5f\" (UID: \"72f0413a-ee99-40ce-b811-e8b094b86c5f\") " Oct 12 06:37:59 crc kubenswrapper[4930]: I1012 06:37:59.229349 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72f0413a-ee99-40ce-b811-e8b094b86c5f-catalog-content\") pod \"72f0413a-ee99-40ce-b811-e8b094b86c5f\" (UID: \"72f0413a-ee99-40ce-b811-e8b094b86c5f\") " Oct 12 06:37:59 crc kubenswrapper[4930]: I1012 06:37:59.242171 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72f0413a-ee99-40ce-b811-e8b094b86c5f-utilities\") pod \"72f0413a-ee99-40ce-b811-e8b094b86c5f\" (UID: \"72f0413a-ee99-40ce-b811-e8b094b86c5f\") " Oct 12 06:37:59 crc kubenswrapper[4930]: I1012 06:37:59.242872 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72f0413a-ee99-40ce-b811-e8b094b86c5f-kube-api-access-d679w" (OuterVolumeSpecName: "kube-api-access-d679w") pod "72f0413a-ee99-40ce-b811-e8b094b86c5f" (UID: "72f0413a-ee99-40ce-b811-e8b094b86c5f"). InnerVolumeSpecName "kube-api-access-d679w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:37:59 crc kubenswrapper[4930]: I1012 06:37:59.243150 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d679w\" (UniqueName: \"kubernetes.io/projected/72f0413a-ee99-40ce-b811-e8b094b86c5f-kube-api-access-d679w\") on node \"crc\" DevicePath \"\"" Oct 12 06:37:59 crc kubenswrapper[4930]: I1012 06:37:59.245799 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72f0413a-ee99-40ce-b811-e8b094b86c5f-utilities" (OuterVolumeSpecName: "utilities") pod "72f0413a-ee99-40ce-b811-e8b094b86c5f" (UID: "72f0413a-ee99-40ce-b811-e8b094b86c5f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:37:59 crc kubenswrapper[4930]: I1012 06:37:59.344826 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72f0413a-ee99-40ce-b811-e8b094b86c5f-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 06:37:59 crc kubenswrapper[4930]: I1012 06:37:59.413190 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72f0413a-ee99-40ce-b811-e8b094b86c5f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72f0413a-ee99-40ce-b811-e8b094b86c5f" (UID: "72f0413a-ee99-40ce-b811-e8b094b86c5f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:37:59 crc kubenswrapper[4930]: I1012 06:37:59.445808 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72f0413a-ee99-40ce-b811-e8b094b86c5f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 06:37:59 crc kubenswrapper[4930]: I1012 06:37:59.730245 4930 generic.go:334] "Generic (PLEG): container finished" podID="72f0413a-ee99-40ce-b811-e8b094b86c5f" containerID="f5ea971c8122cd2539ab0446f8f3e84fd00f968d58de020685abb6e1be763961" exitCode=0 Oct 12 06:37:59 crc kubenswrapper[4930]: I1012 06:37:59.730312 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tvfmm" event={"ID":"72f0413a-ee99-40ce-b811-e8b094b86c5f","Type":"ContainerDied","Data":"f5ea971c8122cd2539ab0446f8f3e84fd00f968d58de020685abb6e1be763961"} Oct 12 06:37:59 crc kubenswrapper[4930]: I1012 06:37:59.730348 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tvfmm" Oct 12 06:37:59 crc kubenswrapper[4930]: I1012 06:37:59.730363 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tvfmm" event={"ID":"72f0413a-ee99-40ce-b811-e8b094b86c5f","Type":"ContainerDied","Data":"c65d866e87a0b0b84860effc6f7797f8db08f74dc3f0f927a2ca5454a6ff2c12"} Oct 12 06:37:59 crc kubenswrapper[4930]: I1012 06:37:59.730395 4930 scope.go:117] "RemoveContainer" containerID="f5ea971c8122cd2539ab0446f8f3e84fd00f968d58de020685abb6e1be763961" Oct 12 06:37:59 crc kubenswrapper[4930]: I1012 06:37:59.774520 4930 scope.go:117] "RemoveContainer" containerID="bed68e0dfdb043c19daa43489c2a81cd3e2aea4b498750c6c4ca006b40150d44" Oct 12 06:37:59 crc kubenswrapper[4930]: I1012 06:37:59.809372 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tvfmm"] Oct 12 06:37:59 crc kubenswrapper[4930]: I1012 06:37:59.816766 4930 scope.go:117] "RemoveContainer" containerID="156c36ad7fb2e5a94c585f3f36b6094f0046db1672066e7f732af81c7f4531cc" Oct 12 06:37:59 crc kubenswrapper[4930]: I1012 06:37:59.827216 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tvfmm"] Oct 12 06:37:59 crc kubenswrapper[4930]: I1012 06:37:59.876557 4930 scope.go:117] "RemoveContainer" containerID="f5ea971c8122cd2539ab0446f8f3e84fd00f968d58de020685abb6e1be763961" Oct 12 06:37:59 crc kubenswrapper[4930]: E1012 06:37:59.877238 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5ea971c8122cd2539ab0446f8f3e84fd00f968d58de020685abb6e1be763961\": container with ID starting with f5ea971c8122cd2539ab0446f8f3e84fd00f968d58de020685abb6e1be763961 not found: ID does not exist" containerID="f5ea971c8122cd2539ab0446f8f3e84fd00f968d58de020685abb6e1be763961" Oct 12 06:37:59 crc kubenswrapper[4930]: I1012 06:37:59.877295 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5ea971c8122cd2539ab0446f8f3e84fd00f968d58de020685abb6e1be763961"} err="failed to get container status \"f5ea971c8122cd2539ab0446f8f3e84fd00f968d58de020685abb6e1be763961\": rpc error: code = NotFound desc = could not find container \"f5ea971c8122cd2539ab0446f8f3e84fd00f968d58de020685abb6e1be763961\": container with ID starting with f5ea971c8122cd2539ab0446f8f3e84fd00f968d58de020685abb6e1be763961 not found: ID does not exist" Oct 12 06:37:59 crc kubenswrapper[4930]: I1012 06:37:59.877327 4930 scope.go:117] "RemoveContainer" containerID="bed68e0dfdb043c19daa43489c2a81cd3e2aea4b498750c6c4ca006b40150d44" Oct 12 06:37:59 crc kubenswrapper[4930]: E1012 06:37:59.877728 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bed68e0dfdb043c19daa43489c2a81cd3e2aea4b498750c6c4ca006b40150d44\": container with ID starting with bed68e0dfdb043c19daa43489c2a81cd3e2aea4b498750c6c4ca006b40150d44 not found: ID does not exist" containerID="bed68e0dfdb043c19daa43489c2a81cd3e2aea4b498750c6c4ca006b40150d44" Oct 12 06:37:59 crc kubenswrapper[4930]: I1012 06:37:59.877773 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bed68e0dfdb043c19daa43489c2a81cd3e2aea4b498750c6c4ca006b40150d44"} err="failed to get container status \"bed68e0dfdb043c19daa43489c2a81cd3e2aea4b498750c6c4ca006b40150d44\": rpc error: code = NotFound desc = could not find container 
\"bed68e0dfdb043c19daa43489c2a81cd3e2aea4b498750c6c4ca006b40150d44\": container with ID starting with bed68e0dfdb043c19daa43489c2a81cd3e2aea4b498750c6c4ca006b40150d44 not found: ID does not exist" Oct 12 06:37:59 crc kubenswrapper[4930]: I1012 06:37:59.877799 4930 scope.go:117] "RemoveContainer" containerID="156c36ad7fb2e5a94c585f3f36b6094f0046db1672066e7f732af81c7f4531cc" Oct 12 06:37:59 crc kubenswrapper[4930]: E1012 06:37:59.878130 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"156c36ad7fb2e5a94c585f3f36b6094f0046db1672066e7f732af81c7f4531cc\": container with ID starting with 156c36ad7fb2e5a94c585f3f36b6094f0046db1672066e7f732af81c7f4531cc not found: ID does not exist" containerID="156c36ad7fb2e5a94c585f3f36b6094f0046db1672066e7f732af81c7f4531cc" Oct 12 06:37:59 crc kubenswrapper[4930]: I1012 06:37:59.878234 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"156c36ad7fb2e5a94c585f3f36b6094f0046db1672066e7f732af81c7f4531cc"} err="failed to get container status \"156c36ad7fb2e5a94c585f3f36b6094f0046db1672066e7f732af81c7f4531cc\": rpc error: code = NotFound desc = could not find container \"156c36ad7fb2e5a94c585f3f36b6094f0046db1672066e7f732af81c7f4531cc\": container with ID starting with 156c36ad7fb2e5a94c585f3f36b6094f0046db1672066e7f732af81c7f4531cc not found: ID does not exist" Oct 12 06:38:00 crc kubenswrapper[4930]: I1012 06:38:00.150447 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72f0413a-ee99-40ce-b811-e8b094b86c5f" path="/var/lib/kubelet/pods/72f0413a-ee99-40ce-b811-e8b094b86c5f/volumes" Oct 12 06:39:03 crc kubenswrapper[4930]: I1012 06:39:03.669544 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:39:03 crc kubenswrapper[4930]: I1012 06:39:03.670312 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:39:33 crc kubenswrapper[4930]: I1012 06:39:33.669279 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:39:33 crc kubenswrapper[4930]: I1012 06:39:33.669765 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:39:37 crc kubenswrapper[4930]: I1012 06:39:37.876471 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-82k5b"] Oct 12 06:39:37 crc kubenswrapper[4930]: E1012 06:39:37.877527 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72f0413a-ee99-40ce-b811-e8b094b86c5f" containerName="extract-content" 
Oct 12 06:39:37 crc kubenswrapper[4930]: I1012 06:39:37.877629 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="72f0413a-ee99-40ce-b811-e8b094b86c5f" containerName="extract-content"
Oct 12 06:39:37 crc kubenswrapper[4930]: E1012 06:39:37.877654 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72f0413a-ee99-40ce-b811-e8b094b86c5f" containerName="registry-server"
Oct 12 06:39:37 crc kubenswrapper[4930]: I1012 06:39:37.877662 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="72f0413a-ee99-40ce-b811-e8b094b86c5f" containerName="registry-server"
Oct 12 06:39:37 crc kubenswrapper[4930]: E1012 06:39:37.877677 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72f0413a-ee99-40ce-b811-e8b094b86c5f" containerName="extract-utilities"
Oct 12 06:39:37 crc kubenswrapper[4930]: I1012 06:39:37.877687 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="72f0413a-ee99-40ce-b811-e8b094b86c5f" containerName="extract-utilities"
Oct 12 06:39:37 crc kubenswrapper[4930]: I1012 06:39:37.877944 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="72f0413a-ee99-40ce-b811-e8b094b86c5f" containerName="registry-server"
Oct 12 06:39:37 crc kubenswrapper[4930]: I1012 06:39:37.879694 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-82k5b"
Oct 12 06:39:37 crc kubenswrapper[4930]: I1012 06:39:37.893638 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-82k5b"]
Oct 12 06:39:37 crc kubenswrapper[4930]: I1012 06:39:37.987931 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1264b467-e8fe-49f0-97d2-a0b920083a9b-utilities\") pod \"redhat-marketplace-82k5b\" (UID: \"1264b467-e8fe-49f0-97d2-a0b920083a9b\") " pod="openshift-marketplace/redhat-marketplace-82k5b"
Oct 12 06:39:37 crc kubenswrapper[4930]: I1012 06:39:37.988007 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hhcb\" (UniqueName: \"kubernetes.io/projected/1264b467-e8fe-49f0-97d2-a0b920083a9b-kube-api-access-7hhcb\") pod \"redhat-marketplace-82k5b\" (UID: \"1264b467-e8fe-49f0-97d2-a0b920083a9b\") " pod="openshift-marketplace/redhat-marketplace-82k5b"
Oct 12 06:39:37 crc kubenswrapper[4930]: I1012 06:39:37.988053 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1264b467-e8fe-49f0-97d2-a0b920083a9b-catalog-content\") pod \"redhat-marketplace-82k5b\" (UID: \"1264b467-e8fe-49f0-97d2-a0b920083a9b\") " pod="openshift-marketplace/redhat-marketplace-82k5b"
Oct 12 06:39:38 crc kubenswrapper[4930]: I1012 06:39:38.090855 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1264b467-e8fe-49f0-97d2-a0b920083a9b-utilities\") pod \"redhat-marketplace-82k5b\" (UID: \"1264b467-e8fe-49f0-97d2-a0b920083a9b\") " pod="openshift-marketplace/redhat-marketplace-82k5b"
Oct 12 06:39:38 crc kubenswrapper[4930]: I1012 06:39:38.090972 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hhcb\" (UniqueName: \"kubernetes.io/projected/1264b467-e8fe-49f0-97d2-a0b920083a9b-kube-api-access-7hhcb\") pod \"redhat-marketplace-82k5b\" (UID: \"1264b467-e8fe-49f0-97d2-a0b920083a9b\") " pod="openshift-marketplace/redhat-marketplace-82k5b"
Oct 12 06:39:38 crc kubenswrapper[4930]: I1012 06:39:38.091031 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1264b467-e8fe-49f0-97d2-a0b920083a9b-catalog-content\") pod \"redhat-marketplace-82k5b\" (UID: \"1264b467-e8fe-49f0-97d2-a0b920083a9b\") " pod="openshift-marketplace/redhat-marketplace-82k5b"
Oct 12 06:39:38 crc kubenswrapper[4930]: I1012 06:39:38.091469 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1264b467-e8fe-49f0-97d2-a0b920083a9b-utilities\") pod \"redhat-marketplace-82k5b\" (UID: \"1264b467-e8fe-49f0-97d2-a0b920083a9b\") " pod="openshift-marketplace/redhat-marketplace-82k5b"
Oct 12 06:39:38 crc kubenswrapper[4930]: I1012 06:39:38.091655 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1264b467-e8fe-49f0-97d2-a0b920083a9b-catalog-content\") pod \"redhat-marketplace-82k5b\" (UID: \"1264b467-e8fe-49f0-97d2-a0b920083a9b\") " pod="openshift-marketplace/redhat-marketplace-82k5b"
Oct 12 06:39:38 crc kubenswrapper[4930]: I1012 06:39:38.116487 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hhcb\" (UniqueName: \"kubernetes.io/projected/1264b467-e8fe-49f0-97d2-a0b920083a9b-kube-api-access-7hhcb\") pod \"redhat-marketplace-82k5b\" (UID: \"1264b467-e8fe-49f0-97d2-a0b920083a9b\") " pod="openshift-marketplace/redhat-marketplace-82k5b"
Oct 12 06:39:38 crc kubenswrapper[4930]: I1012 06:39:38.215931 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-82k5b"
Oct 12 06:39:38 crc kubenswrapper[4930]: I1012 06:39:38.752697 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-82k5b"]
Oct 12 06:39:38 crc kubenswrapper[4930]: I1012 06:39:38.975085 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82k5b" event={"ID":"1264b467-e8fe-49f0-97d2-a0b920083a9b","Type":"ContainerStarted","Data":"a694d4d982add618481832606c156115ad78ea0be52f87cfb1ea3332543c9c88"}
Oct 12 06:39:39 crc kubenswrapper[4930]: I1012 06:39:39.988368 4930 generic.go:334] "Generic (PLEG): container finished" podID="1264b467-e8fe-49f0-97d2-a0b920083a9b" containerID="13cd7a296910a49ad2ea05bb7660fbe9e47de8d24ad857fc25b4529514a4fa0d" exitCode=0
Oct 12 06:39:39 crc kubenswrapper[4930]: I1012 06:39:39.988548 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82k5b" event={"ID":"1264b467-e8fe-49f0-97d2-a0b920083a9b","Type":"ContainerDied","Data":"13cd7a296910a49ad2ea05bb7660fbe9e47de8d24ad857fc25b4529514a4fa0d"}
Oct 12 06:39:39 crc kubenswrapper[4930]: I1012 06:39:39.991555 4930 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 12 06:39:41 crc kubenswrapper[4930]: I1012 06:39:41.000434 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82k5b" event={"ID":"1264b467-e8fe-49f0-97d2-a0b920083a9b","Type":"ContainerStarted","Data":"3552433be7697672a3728cabebe5a34c6d9579f58856469410ad90e91c2a95ff"}
Oct 12 06:39:42 crc kubenswrapper[4930]: I1012 06:39:42.013914 4930 generic.go:334] "Generic (PLEG): container finished" podID="1264b467-e8fe-49f0-97d2-a0b920083a9b" containerID="3552433be7697672a3728cabebe5a34c6d9579f58856469410ad90e91c2a95ff" exitCode=0
Oct 12 06:39:42 crc kubenswrapper[4930]: I1012 06:39:42.013996 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82k5b" event={"ID":"1264b467-e8fe-49f0-97d2-a0b920083a9b","Type":"ContainerDied","Data":"3552433be7697672a3728cabebe5a34c6d9579f58856469410ad90e91c2a95ff"}
Oct 12 06:39:43 crc kubenswrapper[4930]: I1012 06:39:43.031682 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82k5b" event={"ID":"1264b467-e8fe-49f0-97d2-a0b920083a9b","Type":"ContainerStarted","Data":"2f839a7e6764761c9f30e3365d25c1d320165ac1b3dc2506c70dcda5b28df0a2"}
Oct 12 06:39:43 crc kubenswrapper[4930]: I1012 06:39:43.064568 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-82k5b" podStartSLOduration=3.455449076 podStartE2EDuration="6.064548448s" podCreationTimestamp="2025-10-12 06:39:37 +0000 UTC" firstStartedPulling="2025-10-12 06:39:39.991353461 +0000 UTC m=+3512.533455226" lastFinishedPulling="2025-10-12 06:39:42.600452793 +0000 UTC m=+3515.142554598" observedRunningTime="2025-10-12 06:39:43.054378896 +0000 UTC m=+3515.596480711" watchObservedRunningTime="2025-10-12 06:39:43.064548448 +0000 UTC m=+3515.606650223"
Oct 12 06:39:48 crc kubenswrapper[4930]: I1012 06:39:48.217076 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-82k5b"
Oct 12 06:39:48 crc kubenswrapper[4930]: I1012 06:39:48.217557 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-82k5b"
Oct 12 06:39:48 crc kubenswrapper[4930]: I1012 06:39:48.312291 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-82k5b"
Oct 12 06:39:49 crc kubenswrapper[4930]: I1012 06:39:49.178457 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-82k5b"
Oct 12 06:39:49 crc kubenswrapper[4930]: I1012 06:39:49.249243 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-82k5b"]
Oct 12 06:39:51 crc kubenswrapper[4930]: I1012 06:39:51.113619 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-82k5b" podUID="1264b467-e8fe-49f0-97d2-a0b920083a9b" containerName="registry-server" containerID="cri-o://2f839a7e6764761c9f30e3365d25c1d320165ac1b3dc2506c70dcda5b28df0a2" gracePeriod=2
Oct 12 06:39:51 crc kubenswrapper[4930]: I1012 06:39:51.650191 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-82k5b"
Oct 12 06:39:51 crc kubenswrapper[4930]: I1012 06:39:51.798210 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1264b467-e8fe-49f0-97d2-a0b920083a9b-catalog-content\") pod \"1264b467-e8fe-49f0-97d2-a0b920083a9b\" (UID: \"1264b467-e8fe-49f0-97d2-a0b920083a9b\") "
Oct 12 06:39:51 crc kubenswrapper[4930]: I1012 06:39:51.798350 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1264b467-e8fe-49f0-97d2-a0b920083a9b-utilities\") pod \"1264b467-e8fe-49f0-97d2-a0b920083a9b\" (UID: \"1264b467-e8fe-49f0-97d2-a0b920083a9b\") "
Oct 12 06:39:51 crc kubenswrapper[4930]: I1012 06:39:51.798610 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hhcb\" (UniqueName: \"kubernetes.io/projected/1264b467-e8fe-49f0-97d2-a0b920083a9b-kube-api-access-7hhcb\") pod \"1264b467-e8fe-49f0-97d2-a0b920083a9b\" (UID: \"1264b467-e8fe-49f0-97d2-a0b920083a9b\") "
Oct 12 06:39:51 crc kubenswrapper[4930]: I1012 06:39:51.799357 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1264b467-e8fe-49f0-97d2-a0b920083a9b-utilities" (OuterVolumeSpecName: "utilities") pod "1264b467-e8fe-49f0-97d2-a0b920083a9b" (UID: "1264b467-e8fe-49f0-97d2-a0b920083a9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 06:39:51 crc kubenswrapper[4930]: I1012 06:39:51.808962 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1264b467-e8fe-49f0-97d2-a0b920083a9b-kube-api-access-7hhcb" (OuterVolumeSpecName: "kube-api-access-7hhcb") pod "1264b467-e8fe-49f0-97d2-a0b920083a9b" (UID: "1264b467-e8fe-49f0-97d2-a0b920083a9b"). InnerVolumeSpecName "kube-api-access-7hhcb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 06:39:51 crc kubenswrapper[4930]: I1012 06:39:51.810909 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1264b467-e8fe-49f0-97d2-a0b920083a9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1264b467-e8fe-49f0-97d2-a0b920083a9b" (UID: "1264b467-e8fe-49f0-97d2-a0b920083a9b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 06:39:51 crc kubenswrapper[4930]: I1012 06:39:51.900679 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hhcb\" (UniqueName: \"kubernetes.io/projected/1264b467-e8fe-49f0-97d2-a0b920083a9b-kube-api-access-7hhcb\") on node \"crc\" DevicePath \"\""
Oct 12 06:39:51 crc kubenswrapper[4930]: I1012 06:39:51.900722 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1264b467-e8fe-49f0-97d2-a0b920083a9b-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 12 06:39:51 crc kubenswrapper[4930]: I1012 06:39:51.900754 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1264b467-e8fe-49f0-97d2-a0b920083a9b-utilities\") on node \"crc\" DevicePath \"\""
Oct 12 06:39:52 crc kubenswrapper[4930]: I1012 06:39:52.128582 4930 generic.go:334] "Generic (PLEG): container finished" podID="1264b467-e8fe-49f0-97d2-a0b920083a9b" containerID="2f839a7e6764761c9f30e3365d25c1d320165ac1b3dc2506c70dcda5b28df0a2" exitCode=0
Oct 12 06:39:52 crc kubenswrapper[4930]: I1012 06:39:52.128672 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-82k5b"
Oct 12 06:39:52 crc kubenswrapper[4930]: I1012 06:39:52.128699 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82k5b" event={"ID":"1264b467-e8fe-49f0-97d2-a0b920083a9b","Type":"ContainerDied","Data":"2f839a7e6764761c9f30e3365d25c1d320165ac1b3dc2506c70dcda5b28df0a2"}
Oct 12 06:39:52 crc kubenswrapper[4930]: I1012 06:39:52.129129 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82k5b" event={"ID":"1264b467-e8fe-49f0-97d2-a0b920083a9b","Type":"ContainerDied","Data":"a694d4d982add618481832606c156115ad78ea0be52f87cfb1ea3332543c9c88"}
Oct 12 06:39:52 crc kubenswrapper[4930]: I1012 06:39:52.129177 4930 scope.go:117] "RemoveContainer" containerID="2f839a7e6764761c9f30e3365d25c1d320165ac1b3dc2506c70dcda5b28df0a2"
Oct 12 06:39:52 crc kubenswrapper[4930]: I1012 06:39:52.179128 4930 scope.go:117] "RemoveContainer" containerID="3552433be7697672a3728cabebe5a34c6d9579f58856469410ad90e91c2a95ff"
Oct 12 06:39:52 crc kubenswrapper[4930]: I1012 06:39:52.187282 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-82k5b"]
Oct 12 06:39:52 crc kubenswrapper[4930]: I1012 06:39:52.198094 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-82k5b"]
Oct 12 06:39:52 crc kubenswrapper[4930]: I1012 06:39:52.218261 4930 scope.go:117] "RemoveContainer" containerID="13cd7a296910a49ad2ea05bb7660fbe9e47de8d24ad857fc25b4529514a4fa0d"
Oct 12 06:39:52 crc kubenswrapper[4930]: I1012 06:39:52.253565 4930 scope.go:117] "RemoveContainer" containerID="2f839a7e6764761c9f30e3365d25c1d320165ac1b3dc2506c70dcda5b28df0a2"
Oct 12 06:39:52 crc kubenswrapper[4930]: E1012 06:39:52.254486 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f839a7e6764761c9f30e3365d25c1d320165ac1b3dc2506c70dcda5b28df0a2\": container with ID starting with 2f839a7e6764761c9f30e3365d25c1d320165ac1b3dc2506c70dcda5b28df0a2 not found: ID does not exist" containerID="2f839a7e6764761c9f30e3365d25c1d320165ac1b3dc2506c70dcda5b28df0a2"
Oct 12 06:39:52 crc kubenswrapper[4930]: I1012 06:39:52.254702 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f839a7e6764761c9f30e3365d25c1d320165ac1b3dc2506c70dcda5b28df0a2"} err="failed to get container status \"2f839a7e6764761c9f30e3365d25c1d320165ac1b3dc2506c70dcda5b28df0a2\": rpc error: code = NotFound desc = could not find container \"2f839a7e6764761c9f30e3365d25c1d320165ac1b3dc2506c70dcda5b28df0a2\": container with ID starting with 2f839a7e6764761c9f30e3365d25c1d320165ac1b3dc2506c70dcda5b28df0a2 not found: ID does not exist"
Oct 12 06:39:52 crc kubenswrapper[4930]: I1012 06:39:52.254920 4930 scope.go:117] "RemoveContainer" containerID="3552433be7697672a3728cabebe5a34c6d9579f58856469410ad90e91c2a95ff"
Oct 12 06:39:52 crc kubenswrapper[4930]: E1012 06:39:52.255689 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3552433be7697672a3728cabebe5a34c6d9579f58856469410ad90e91c2a95ff\": container with ID starting with 3552433be7697672a3728cabebe5a34c6d9579f58856469410ad90e91c2a95ff not found: ID does not exist" containerID="3552433be7697672a3728cabebe5a34c6d9579f58856469410ad90e91c2a95ff"
Oct 12 06:39:52 crc kubenswrapper[4930]: I1012 06:39:52.255760 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3552433be7697672a3728cabebe5a34c6d9579f58856469410ad90e91c2a95ff"} err="failed to get container status \"3552433be7697672a3728cabebe5a34c6d9579f58856469410ad90e91c2a95ff\": rpc error: code = NotFound desc = could not find container \"3552433be7697672a3728cabebe5a34c6d9579f58856469410ad90e91c2a95ff\": container with ID starting with 3552433be7697672a3728cabebe5a34c6d9579f58856469410ad90e91c2a95ff not found: ID does not exist"
Oct 12 06:39:52 crc kubenswrapper[4930]: I1012 06:39:52.255802 4930 scope.go:117] "RemoveContainer" containerID="13cd7a296910a49ad2ea05bb7660fbe9e47de8d24ad857fc25b4529514a4fa0d"
Oct 12 06:39:52 crc kubenswrapper[4930]: E1012 06:39:52.256440 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13cd7a296910a49ad2ea05bb7660fbe9e47de8d24ad857fc25b4529514a4fa0d\": container with ID starting with 13cd7a296910a49ad2ea05bb7660fbe9e47de8d24ad857fc25b4529514a4fa0d not found: ID does not exist" containerID="13cd7a296910a49ad2ea05bb7660fbe9e47de8d24ad857fc25b4529514a4fa0d"
Oct 12 06:39:52 crc kubenswrapper[4930]: I1012 06:39:52.256626 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13cd7a296910a49ad2ea05bb7660fbe9e47de8d24ad857fc25b4529514a4fa0d"} err="failed to get container status \"13cd7a296910a49ad2ea05bb7660fbe9e47de8d24ad857fc25b4529514a4fa0d\": rpc error: code = NotFound desc = could not find container \"13cd7a296910a49ad2ea05bb7660fbe9e47de8d24ad857fc25b4529514a4fa0d\": container with ID starting with 13cd7a296910a49ad2ea05bb7660fbe9e47de8d24ad857fc25b4529514a4fa0d not found: ID does not exist"
Oct 12 06:39:54 crc kubenswrapper[4930]: I1012 06:39:54.149586 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1264b467-e8fe-49f0-97d2-a0b920083a9b" path="/var/lib/kubelet/pods/1264b467-e8fe-49f0-97d2-a0b920083a9b/volumes"
Oct 12 06:40:03 crc kubenswrapper[4930]: I1012 06:40:03.669285 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 12 06:40:03 crc kubenswrapper[4930]: I1012 06:40:03.670237 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 12 06:40:03 crc kubenswrapper[4930]: I1012 06:40:03.670344 4930 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf"
Oct 12 06:40:03 crc kubenswrapper[4930]: I1012 06:40:03.671814 4930 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d151d8c371342d4130e54ee25c023bdb58521d35b6a35844b76618c1e64a97fb"} pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 12 06:40:03 crc kubenswrapper[4930]: I1012 06:40:03.671975 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" containerID="cri-o://d151d8c371342d4130e54ee25c023bdb58521d35b6a35844b76618c1e64a97fb" gracePeriod=600
Oct 12 06:40:04 crc kubenswrapper[4930]: I1012 06:40:04.286233 4930 generic.go:334] "Generic (PLEG): container finished" podID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerID="d151d8c371342d4130e54ee25c023bdb58521d35b6a35844b76618c1e64a97fb" exitCode=0
Oct 12 06:40:04 crc kubenswrapper[4930]: I1012 06:40:04.286298 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerDied","Data":"d151d8c371342d4130e54ee25c023bdb58521d35b6a35844b76618c1e64a97fb"}
Oct 12 06:40:04 crc kubenswrapper[4930]: I1012 06:40:04.286624 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerStarted","Data":"eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed"}
Oct 12 06:40:04 crc kubenswrapper[4930]: I1012 06:40:04.286651 4930 scope.go:117] "RemoveContainer" containerID="8cf3fb7e8d14641e6ad26a172b6fdafcd1328d22aa4c129661b81d6d5e5a5cc0"
Oct 12 06:40:44 crc kubenswrapper[4930]: E1012 06:40:44.660085 4930 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.111:58296->38.102.83.111:46517: write tcp 38.102.83.111:58296->38.102.83.111:46517: write: broken pipe
Oct 12 06:42:33 crc kubenswrapper[4930]: I1012 06:42:33.669186 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 12 06:42:33 crc kubenswrapper[4930]: I1012 06:42:33.669830 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 12 06:42:55 crc kubenswrapper[4930]: I1012 06:42:55.412458 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jn52b"]
Oct 12 06:42:55 crc kubenswrapper[4930]: E1012 06:42:55.415101 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1264b467-e8fe-49f0-97d2-a0b920083a9b" containerName="extract-utilities"
Oct 12 06:42:55 crc kubenswrapper[4930]: I1012 06:42:55.415147 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="1264b467-e8fe-49f0-97d2-a0b920083a9b" containerName="extract-utilities"
Oct 12 06:42:55 crc kubenswrapper[4930]: E1012 06:42:55.415213 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1264b467-e8fe-49f0-97d2-a0b920083a9b" containerName="registry-server"
Oct 12 06:42:55 crc kubenswrapper[4930]: I1012 06:42:55.415229 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="1264b467-e8fe-49f0-97d2-a0b920083a9b" containerName="registry-server"
Oct 12 06:42:55 crc kubenswrapper[4930]: E1012 06:42:55.415273 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1264b467-e8fe-49f0-97d2-a0b920083a9b" containerName="extract-content"
Oct 12 06:42:55 crc kubenswrapper[4930]: I1012 06:42:55.415287 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="1264b467-e8fe-49f0-97d2-a0b920083a9b" containerName="extract-content"
Oct 12 06:42:55 crc kubenswrapper[4930]: I1012 06:42:55.415722 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="1264b467-e8fe-49f0-97d2-a0b920083a9b" containerName="registry-server"
Oct 12 06:42:55 crc kubenswrapper[4930]: I1012 06:42:55.418884 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jn52b"
Oct 12 06:42:55 crc kubenswrapper[4930]: I1012 06:42:55.434619 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jn52b"]
Oct 12 06:42:55 crc kubenswrapper[4930]: I1012 06:42:55.540066 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc9gm\" (UniqueName: \"kubernetes.io/projected/d00a10ec-bd70-428f-8c50-1831c5aff9f3-kube-api-access-dc9gm\") pod \"certified-operators-jn52b\" (UID: \"d00a10ec-bd70-428f-8c50-1831c5aff9f3\") " pod="openshift-marketplace/certified-operators-jn52b"
Oct 12 06:42:55 crc kubenswrapper[4930]: I1012 06:42:55.540147 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d00a10ec-bd70-428f-8c50-1831c5aff9f3-utilities\") pod \"certified-operators-jn52b\" (UID: \"d00a10ec-bd70-428f-8c50-1831c5aff9f3\") " pod="openshift-marketplace/certified-operators-jn52b"
Oct 12 06:42:55 crc kubenswrapper[4930]: I1012 06:42:55.540830 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d00a10ec-bd70-428f-8c50-1831c5aff9f3-catalog-content\") pod \"certified-operators-jn52b\" (UID: \"d00a10ec-bd70-428f-8c50-1831c5aff9f3\") " pod="openshift-marketplace/certified-operators-jn52b"
Oct 12 06:42:55 crc kubenswrapper[4930]: I1012 06:42:55.643435 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc9gm\" (UniqueName: \"kubernetes.io/projected/d00a10ec-bd70-428f-8c50-1831c5aff9f3-kube-api-access-dc9gm\") pod \"certified-operators-jn52b\" (UID: \"d00a10ec-bd70-428f-8c50-1831c5aff9f3\") " pod="openshift-marketplace/certified-operators-jn52b"
Oct 12 06:42:55 crc kubenswrapper[4930]: I1012 06:42:55.643511 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d00a10ec-bd70-428f-8c50-1831c5aff9f3-utilities\") pod \"certified-operators-jn52b\" (UID: \"d00a10ec-bd70-428f-8c50-1831c5aff9f3\") " pod="openshift-marketplace/certified-operators-jn52b"
Oct 12 06:42:55 crc kubenswrapper[4930]: I1012 06:42:55.643576 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d00a10ec-bd70-428f-8c50-1831c5aff9f3-catalog-content\") pod \"certified-operators-jn52b\" (UID: \"d00a10ec-bd70-428f-8c50-1831c5aff9f3\") " pod="openshift-marketplace/certified-operators-jn52b"
Oct 12 06:42:55 crc kubenswrapper[4930]: I1012 06:42:55.644054 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d00a10ec-bd70-428f-8c50-1831c5aff9f3-catalog-content\") pod \"certified-operators-jn52b\" (UID: \"d00a10ec-bd70-428f-8c50-1831c5aff9f3\") " pod="openshift-marketplace/certified-operators-jn52b"
Oct 12 06:42:55 crc kubenswrapper[4930]: I1012 06:42:55.644197 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d00a10ec-bd70-428f-8c50-1831c5aff9f3-utilities\") pod \"certified-operators-jn52b\" (UID: \"d00a10ec-bd70-428f-8c50-1831c5aff9f3\") " pod="openshift-marketplace/certified-operators-jn52b"
Oct 12 06:42:55 crc kubenswrapper[4930]: I1012 06:42:55.672865 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc9gm\" (UniqueName: \"kubernetes.io/projected/d00a10ec-bd70-428f-8c50-1831c5aff9f3-kube-api-access-dc9gm\") pod \"certified-operators-jn52b\" (UID: \"d00a10ec-bd70-428f-8c50-1831c5aff9f3\") " pod="openshift-marketplace/certified-operators-jn52b"
Oct 12 06:42:55 crc kubenswrapper[4930]: I1012 06:42:55.760014 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jn52b"
Oct 12 06:42:56 crc kubenswrapper[4930]: I1012 06:42:56.276115 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jn52b"]
Oct 12 06:42:56 crc kubenswrapper[4930]: I1012 06:42:56.406055 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn52b" event={"ID":"d00a10ec-bd70-428f-8c50-1831c5aff9f3","Type":"ContainerStarted","Data":"8a03ad9779a9da32f554dc2896d7c9f82592161a43658314b0e81c3e66a4d9f3"}
Oct 12 06:42:57 crc kubenswrapper[4930]: I1012 06:42:57.419503 4930 generic.go:334] "Generic (PLEG): container finished" podID="d00a10ec-bd70-428f-8c50-1831c5aff9f3" containerID="20f646b638b9d6fcbef957221d551f4f9503056043c35918b90fc346d47cd6f8" exitCode=0
Oct 12 06:42:57 crc kubenswrapper[4930]: I1012 06:42:57.419724 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn52b" event={"ID":"d00a10ec-bd70-428f-8c50-1831c5aff9f3","Type":"ContainerDied","Data":"20f646b638b9d6fcbef957221d551f4f9503056043c35918b90fc346d47cd6f8"}
Oct 12 06:42:58 crc kubenswrapper[4930]: I1012 06:42:58.431876 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn52b" event={"ID":"d00a10ec-bd70-428f-8c50-1831c5aff9f3","Type":"ContainerStarted","Data":"c77b4011abdcbbb7e3fd07e87b05c373fcbbc3fc0369c27aef9529fdfbc0018c"}
Oct 12 06:43:00 crc kubenswrapper[4930]: I1012 06:43:00.459775 4930 generic.go:334] "Generic (PLEG): container finished" podID="d00a10ec-bd70-428f-8c50-1831c5aff9f3" containerID="c77b4011abdcbbb7e3fd07e87b05c373fcbbc3fc0369c27aef9529fdfbc0018c" exitCode=0
Oct 12 06:43:00 crc kubenswrapper[4930]: I1012 06:43:00.459847 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn52b" event={"ID":"d00a10ec-bd70-428f-8c50-1831c5aff9f3","Type":"ContainerDied","Data":"c77b4011abdcbbb7e3fd07e87b05c373fcbbc3fc0369c27aef9529fdfbc0018c"}
Oct 12 06:43:01 crc kubenswrapper[4930]: I1012 06:43:01.474252 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn52b" event={"ID":"d00a10ec-bd70-428f-8c50-1831c5aff9f3","Type":"ContainerStarted","Data":"94977d248fda5b6be4799ebeb83e94d2b9b00d544d094b7854d5651a5d16fac4"}
Oct 12 06:43:01 crc kubenswrapper[4930]: I1012 06:43:01.509079 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jn52b" podStartSLOduration=3.036666889 podStartE2EDuration="6.50905485s" podCreationTimestamp="2025-10-12 06:42:55 +0000 UTC" firstStartedPulling="2025-10-12 06:42:57.422659838 +0000 UTC m=+3709.964761613" lastFinishedPulling="2025-10-12 06:43:00.895047769 +0000 UTC m=+3713.437149574" observedRunningTime="2025-10-12 06:43:01.505567214 +0000 UTC m=+3714.047668989" watchObservedRunningTime="2025-10-12 06:43:01.50905485 +0000 UTC m=+3714.051156625"
Oct 12 06:43:03 crc kubenswrapper[4930]: I1012 06:43:03.669767 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 12 06:43:03 crc kubenswrapper[4930]: I1012 06:43:03.670345 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 12 06:43:05 crc kubenswrapper[4930]: I1012 06:43:05.761796 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jn52b"
Oct 12 06:43:05 crc kubenswrapper[4930]: I1012 06:43:05.762141 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jn52b"
Oct 12 06:43:06 crc kubenswrapper[4930]: I1012 06:43:06.822116 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-jn52b" podUID="d00a10ec-bd70-428f-8c50-1831c5aff9f3" containerName="registry-server" probeResult="failure" output=<
Oct 12 06:43:06 crc kubenswrapper[4930]: timeout: failed to connect service ":50051" within 1s
Oct 12 06:43:06 crc kubenswrapper[4930]: >
Oct 12 06:43:15 crc kubenswrapper[4930]: I1012 06:43:15.829091 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jn52b"
Oct 12 06:43:15 crc kubenswrapper[4930]: I1012 06:43:15.926450 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jn52b"
Oct 12 06:43:16 crc kubenswrapper[4930]: I1012 06:43:16.087934 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jn52b"]
Oct 12 06:43:17 crc kubenswrapper[4930]: I1012 06:43:17.642411 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jn52b" podUID="d00a10ec-bd70-428f-8c50-1831c5aff9f3" containerName="registry-server" containerID="cri-o://94977d248fda5b6be4799ebeb83e94d2b9b00d544d094b7854d5651a5d16fac4" gracePeriod=2
Oct 12 06:43:18 crc kubenswrapper[4930]: I1012 06:43:18.134290 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jn52b"
Oct 12 06:43:18 crc kubenswrapper[4930]: I1012 06:43:18.258759 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d00a10ec-bd70-428f-8c50-1831c5aff9f3-catalog-content\") pod \"d00a10ec-bd70-428f-8c50-1831c5aff9f3\" (UID: \"d00a10ec-bd70-428f-8c50-1831c5aff9f3\") "
Oct 12 06:43:18 crc kubenswrapper[4930]: I1012 06:43:18.258822 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc9gm\" (UniqueName: \"kubernetes.io/projected/d00a10ec-bd70-428f-8c50-1831c5aff9f3-kube-api-access-dc9gm\") pod \"d00a10ec-bd70-428f-8c50-1831c5aff9f3\" (UID: \"d00a10ec-bd70-428f-8c50-1831c5aff9f3\") "
Oct 12 06:43:18 crc kubenswrapper[4930]: I1012 06:43:18.258963 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d00a10ec-bd70-428f-8c50-1831c5aff9f3-utilities\") pod \"d00a10ec-bd70-428f-8c50-1831c5aff9f3\" (UID: \"d00a10ec-bd70-428f-8c50-1831c5aff9f3\") "
Oct 12 06:43:18 crc kubenswrapper[4930]: I1012 06:43:18.259604 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d00a10ec-bd70-428f-8c50-1831c5aff9f3-utilities" (OuterVolumeSpecName: "utilities") pod "d00a10ec-bd70-428f-8c50-1831c5aff9f3" (UID: "d00a10ec-bd70-428f-8c50-1831c5aff9f3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 06:43:18 crc kubenswrapper[4930]: I1012 06:43:18.264511 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d00a10ec-bd70-428f-8c50-1831c5aff9f3-kube-api-access-dc9gm" (OuterVolumeSpecName: "kube-api-access-dc9gm") pod "d00a10ec-bd70-428f-8c50-1831c5aff9f3" (UID: "d00a10ec-bd70-428f-8c50-1831c5aff9f3"). InnerVolumeSpecName "kube-api-access-dc9gm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 06:43:18 crc kubenswrapper[4930]: I1012 06:43:18.301257 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d00a10ec-bd70-428f-8c50-1831c5aff9f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d00a10ec-bd70-428f-8c50-1831c5aff9f3" (UID: "d00a10ec-bd70-428f-8c50-1831c5aff9f3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 12 06:43:18 crc kubenswrapper[4930]: I1012 06:43:18.361292 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d00a10ec-bd70-428f-8c50-1831c5aff9f3-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 12 06:43:18 crc kubenswrapper[4930]: I1012 06:43:18.361331 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc9gm\" (UniqueName: \"kubernetes.io/projected/d00a10ec-bd70-428f-8c50-1831c5aff9f3-kube-api-access-dc9gm\") on node \"crc\" DevicePath \"\""
Oct 12 06:43:18 crc kubenswrapper[4930]: I1012 06:43:18.361342 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d00a10ec-bd70-428f-8c50-1831c5aff9f3-utilities\") on node \"crc\" DevicePath \"\""
Oct 12 06:43:18 crc kubenswrapper[4930]: I1012 06:43:18.653531 4930 generic.go:334] "Generic (PLEG): container finished" podID="d00a10ec-bd70-428f-8c50-1831c5aff9f3" containerID="94977d248fda5b6be4799ebeb83e94d2b9b00d544d094b7854d5651a5d16fac4" exitCode=0
Oct 12 06:43:18 crc kubenswrapper[4930]: I1012 06:43:18.653570 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn52b" event={"ID":"d00a10ec-bd70-428f-8c50-1831c5aff9f3","Type":"ContainerDied","Data":"94977d248fda5b6be4799ebeb83e94d2b9b00d544d094b7854d5651a5d16fac4"}
Oct 12 06:43:18 crc kubenswrapper[4930]: I1012 06:43:18.653630 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jn52b" event={"ID":"d00a10ec-bd70-428f-8c50-1831c5aff9f3","Type":"ContainerDied","Data":"8a03ad9779a9da32f554dc2896d7c9f82592161a43658314b0e81c3e66a4d9f3"}
Oct 12 06:43:18 crc kubenswrapper[4930]: I1012 06:43:18.653653 4930 scope.go:117] "RemoveContainer" containerID="94977d248fda5b6be4799ebeb83e94d2b9b00d544d094b7854d5651a5d16fac4"
Oct 12 06:43:18 crc kubenswrapper[4930]: I1012 06:43:18.653585 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jn52b"
Oct 12 06:43:18 crc kubenswrapper[4930]: I1012 06:43:18.672835 4930 scope.go:117] "RemoveContainer" containerID="c77b4011abdcbbb7e3fd07e87b05c373fcbbc3fc0369c27aef9529fdfbc0018c"
Oct 12 06:43:18 crc kubenswrapper[4930]: I1012 06:43:18.686827 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jn52b"]
Oct 12 06:43:18 crc kubenswrapper[4930]: I1012 06:43:18.695486 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jn52b"]
Oct 12 06:43:18 crc kubenswrapper[4930]: I1012 06:43:18.712116 4930 scope.go:117] "RemoveContainer" containerID="20f646b638b9d6fcbef957221d551f4f9503056043c35918b90fc346d47cd6f8"
Oct 12 06:43:18 crc kubenswrapper[4930]: I1012 06:43:18.741440 4930 scope.go:117] "RemoveContainer" containerID="94977d248fda5b6be4799ebeb83e94d2b9b00d544d094b7854d5651a5d16fac4"
Oct 12 06:43:18 crc kubenswrapper[4930]: E1012 06:43:18.741949 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94977d248fda5b6be4799ebeb83e94d2b9b00d544d094b7854d5651a5d16fac4\": container with ID starting with 94977d248fda5b6be4799ebeb83e94d2b9b00d544d094b7854d5651a5d16fac4 not found: ID does not exist" containerID="94977d248fda5b6be4799ebeb83e94d2b9b00d544d094b7854d5651a5d16fac4"
Oct 12 06:43:18 crc kubenswrapper[4930]: I1012 06:43:18.741989 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94977d248fda5b6be4799ebeb83e94d2b9b00d544d094b7854d5651a5d16fac4"} err="failed to get container status \"94977d248fda5b6be4799ebeb83e94d2b9b00d544d094b7854d5651a5d16fac4\": rpc error: code = NotFound desc = could not find container \"94977d248fda5b6be4799ebeb83e94d2b9b00d544d094b7854d5651a5d16fac4\": container with ID starting with 94977d248fda5b6be4799ebeb83e94d2b9b00d544d094b7854d5651a5d16fac4 not found: ID does not exist"
Oct 12 06:43:18 crc kubenswrapper[4930]: I1012 06:43:18.742015 4930 scope.go:117] "RemoveContainer" containerID="c77b4011abdcbbb7e3fd07e87b05c373fcbbc3fc0369c27aef9529fdfbc0018c"
Oct 12 06:43:18 crc kubenswrapper[4930]: E1012 06:43:18.742434 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c77b4011abdcbbb7e3fd07e87b05c373fcbbc3fc0369c27aef9529fdfbc0018c\": container with ID starting with c77b4011abdcbbb7e3fd07e87b05c373fcbbc3fc0369c27aef9529fdfbc0018c not found: ID does not exist" containerID="c77b4011abdcbbb7e3fd07e87b05c373fcbbc3fc0369c27aef9529fdfbc0018c"
Oct 12 06:43:18 crc kubenswrapper[4930]: I1012 06:43:18.742477 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c77b4011abdcbbb7e3fd07e87b05c373fcbbc3fc0369c27aef9529fdfbc0018c"} err="failed to get container status \"c77b4011abdcbbb7e3fd07e87b05c373fcbbc3fc0369c27aef9529fdfbc0018c\": rpc error: code = NotFound desc = could not find container \"c77b4011abdcbbb7e3fd07e87b05c373fcbbc3fc0369c27aef9529fdfbc0018c\": container with ID starting with c77b4011abdcbbb7e3fd07e87b05c373fcbbc3fc0369c27aef9529fdfbc0018c not found: ID does not exist"
Oct 12 06:43:18 crc kubenswrapper[4930]: I1012 06:43:18.742503 4930 scope.go:117] "RemoveContainer" containerID="20f646b638b9d6fcbef957221d551f4f9503056043c35918b90fc346d47cd6f8"
Oct 12 06:43:18 crc kubenswrapper[4930]: E1012 06:43:18.742800 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20f646b638b9d6fcbef957221d551f4f9503056043c35918b90fc346d47cd6f8\": container with ID starting with 20f646b638b9d6fcbef957221d551f4f9503056043c35918b90fc346d47cd6f8 not found: ID does not exist" containerID="20f646b638b9d6fcbef957221d551f4f9503056043c35918b90fc346d47cd6f8"
Oct 12 06:43:18 crc kubenswrapper[4930]: I1012 06:43:18.742828 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20f646b638b9d6fcbef957221d551f4f9503056043c35918b90fc346d47cd6f8"} err="failed to get container status \"20f646b638b9d6fcbef957221d551f4f9503056043c35918b90fc346d47cd6f8\": rpc error: code = NotFound desc = could not find container \"20f646b638b9d6fcbef957221d551f4f9503056043c35918b90fc346d47cd6f8\": container with ID starting with 20f646b638b9d6fcbef957221d551f4f9503056043c35918b90fc346d47cd6f8 not found: ID does not exist"
Oct 12 06:43:20 crc kubenswrapper[4930]: I1012 06:43:20.156397 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d00a10ec-bd70-428f-8c50-1831c5aff9f3" path="/var/lib/kubelet/pods/d00a10ec-bd70-428f-8c50-1831c5aff9f3/volumes"
Oct 12 06:43:33 crc kubenswrapper[4930]: I1012 06:43:33.668882 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 12 06:43:33 crc kubenswrapper[4930]: I1012 06:43:33.669350 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 12 06:43:33 crc kubenswrapper[4930]: I1012 06:43:33.669399 4930 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf"
Oct 12 06:43:33 crc kubenswrapper[4930]: I1012 06:43:33.670196 4930 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed"} pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 12 06:43:33 crc kubenswrapper[4930]: I1012 06:43:33.670253 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" containerID="cri-o://eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed" gracePeriod=600
Oct 12 06:43:33 crc kubenswrapper[4930]: E1012 06:43:33.799336 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:43:33 crc kubenswrapper[4930]: I1012 06:43:33.856136 4930 generic.go:334] "Generic (PLEG): container finished" podID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed" exitCode=0
Oct 12 06:43:33 crc kubenswrapper[4930]: I1012 06:43:33.856204 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerDied","Data":"eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed"}
Oct 12 06:43:33 crc kubenswrapper[4930]: I1012 06:43:33.856260 4930 scope.go:117] "RemoveContainer" containerID="d151d8c371342d4130e54ee25c023bdb58521d35b6a35844b76618c1e64a97fb"
Oct 12 06:43:33 crc kubenswrapper[4930]: I1012 06:43:33.857352 4930 scope.go:117] "RemoveContainer" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed"
Oct 12 06:43:33 crc kubenswrapper[4930]: E1012 06:43:33.857725 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:43:48 crc kubenswrapper[4930]: I1012 06:43:48.135558 4930 scope.go:117] "RemoveContainer" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed"
Oct 12 06:43:48 crc kubenswrapper[4930]: E1012 06:43:48.137427 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:44:02 crc kubenswrapper[4930]: I1012 06:44:02.136222 4930 scope.go:117] "RemoveContainer" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed"
Oct 12 06:44:02 crc kubenswrapper[4930]: E1012 06:44:02.137230 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:44:17 crc kubenswrapper[4930]: I1012 06:44:17.134997 4930 scope.go:117] "RemoveContainer" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed"
Oct 12 06:44:17 crc kubenswrapper[4930]: E1012 06:44:17.135665 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:44:29 crc kubenswrapper[4930]: I1012 06:44:29.136172 4930 scope.go:117] "RemoveContainer" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed"
Oct 12 06:44:29 crc kubenswrapper[4930]: E1012 06:44:29.137341 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:44:43 crc kubenswrapper[4930]: I1012 06:44:43.135694 4930 scope.go:117] "RemoveContainer" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed" Oct 12 06:44:43 crc kubenswrapper[4930]: E1012 06:44:43.137190 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:44:51 crc kubenswrapper[4930]: E1012 06:44:51.968201 4930 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.111:42202->38.102.83.111:46517: write tcp 38.102.83.111:42202->38.102.83.111:46517: write: broken pipe Oct 12 06:44:55 crc kubenswrapper[4930]: I1012 06:44:55.136733 4930 scope.go:117] "RemoveContainer" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed" Oct 12 06:44:55 crc kubenswrapper[4930]: E1012 06:44:55.137946 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:45:00 crc kubenswrapper[4930]: I1012 06:45:00.155157 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337525-scrpn"] Oct 12 06:45:00 crc kubenswrapper[4930]: E1012 06:45:00.156197 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d00a10ec-bd70-428f-8c50-1831c5aff9f3" containerName="extract-content" Oct 12 06:45:00 crc kubenswrapper[4930]: I1012 06:45:00.156217 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="d00a10ec-bd70-428f-8c50-1831c5aff9f3" containerName="extract-content" Oct 12 06:45:00 crc kubenswrapper[4930]: E1012 06:45:00.156265 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d00a10ec-bd70-428f-8c50-1831c5aff9f3" containerName="registry-server" Oct 12 06:45:00 crc kubenswrapper[4930]: I1012 06:45:00.156277 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="d00a10ec-bd70-428f-8c50-1831c5aff9f3" containerName="registry-server" Oct 12 06:45:00 crc kubenswrapper[4930]: E1012 06:45:00.156324 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d00a10ec-bd70-428f-8c50-1831c5aff9f3" containerName="extract-utilities" Oct 12 06:45:00 crc kubenswrapper[4930]: I1012 06:45:00.156336 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="d00a10ec-bd70-428f-8c50-1831c5aff9f3" containerName="extract-utilities" Oct 12 06:45:00 crc kubenswrapper[4930]: I1012 06:45:00.156671 4930 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d00a10ec-bd70-428f-8c50-1831c5aff9f3" containerName="registry-server"
Oct 12 06:45:00 crc kubenswrapper[4930]: I1012 06:45:00.157883 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337525-scrpn"
Oct 12 06:45:00 crc kubenswrapper[4930]: I1012 06:45:00.159773 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 12 06:45:00 crc kubenswrapper[4930]: I1012 06:45:00.160782 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 12 06:45:00 crc kubenswrapper[4930]: I1012 06:45:00.173834 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337525-scrpn"]
Oct 12 06:45:00 crc kubenswrapper[4930]: I1012 06:45:00.190896 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps962\" (UniqueName: \"kubernetes.io/projected/7ee28797-be75-4bad-9eca-3cd0444df482-kube-api-access-ps962\") pod \"collect-profiles-29337525-scrpn\" (UID: \"7ee28797-be75-4bad-9eca-3cd0444df482\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337525-scrpn"
Oct 12 06:45:00 crc kubenswrapper[4930]: I1012 06:45:00.191045 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ee28797-be75-4bad-9eca-3cd0444df482-config-volume\") pod \"collect-profiles-29337525-scrpn\" (UID: \"7ee28797-be75-4bad-9eca-3cd0444df482\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337525-scrpn"
Oct 12 06:45:00 crc kubenswrapper[4930]: I1012 06:45:00.191229 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7ee28797-be75-4bad-9eca-3cd0444df482-secret-volume\") pod \"collect-profiles-29337525-scrpn\" (UID: \"7ee28797-be75-4bad-9eca-3cd0444df482\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337525-scrpn"
Oct 12 06:45:00 crc kubenswrapper[4930]: I1012 06:45:00.293233 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps962\" (UniqueName: \"kubernetes.io/projected/7ee28797-be75-4bad-9eca-3cd0444df482-kube-api-access-ps962\") pod \"collect-profiles-29337525-scrpn\" (UID: \"7ee28797-be75-4bad-9eca-3cd0444df482\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337525-scrpn"
Oct 12 06:45:00 crc kubenswrapper[4930]: I1012 06:45:00.293298 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ee28797-be75-4bad-9eca-3cd0444df482-config-volume\") pod \"collect-profiles-29337525-scrpn\" (UID: \"7ee28797-be75-4bad-9eca-3cd0444df482\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337525-scrpn"
Oct 12 06:45:00 crc kubenswrapper[4930]: I1012 06:45:00.293371 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7ee28797-be75-4bad-9eca-3cd0444df482-secret-volume\") pod \"collect-profiles-29337525-scrpn\" (UID: \"7ee28797-be75-4bad-9eca-3cd0444df482\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337525-scrpn"
Oct 12 06:45:00 crc kubenswrapper[4930]: I1012 06:45:00.294449 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ee28797-be75-4bad-9eca-3cd0444df482-config-volume\") pod \"collect-profiles-29337525-scrpn\" (UID: \"7ee28797-be75-4bad-9eca-3cd0444df482\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337525-scrpn"
Oct 12 06:45:00 crc kubenswrapper[4930]: I1012 06:45:00.308997 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7ee28797-be75-4bad-9eca-3cd0444df482-secret-volume\") pod \"collect-profiles-29337525-scrpn\" (UID: \"7ee28797-be75-4bad-9eca-3cd0444df482\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337525-scrpn"
Oct 12 06:45:00 crc kubenswrapper[4930]: I1012 06:45:00.315836 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps962\" (UniqueName: \"kubernetes.io/projected/7ee28797-be75-4bad-9eca-3cd0444df482-kube-api-access-ps962\") pod \"collect-profiles-29337525-scrpn\" (UID: \"7ee28797-be75-4bad-9eca-3cd0444df482\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337525-scrpn"
Oct 12 06:45:00 crc kubenswrapper[4930]: I1012 06:45:00.492214 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337525-scrpn"
Oct 12 06:45:01 crc kubenswrapper[4930]: I1012 06:45:01.057411 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337525-scrpn"]
Oct 12 06:45:01 crc kubenswrapper[4930]: I1012 06:45:01.993753 4930 generic.go:334] "Generic (PLEG): container finished" podID="7ee28797-be75-4bad-9eca-3cd0444df482" containerID="11554e68f262e4e73ef00742fe9a0dccd03e99000b7e0d421c52e82192b6ec03" exitCode=0
Oct 12 06:45:01 crc kubenswrapper[4930]: I1012 06:45:01.993827 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337525-scrpn" event={"ID":"7ee28797-be75-4bad-9eca-3cd0444df482","Type":"ContainerDied","Data":"11554e68f262e4e73ef00742fe9a0dccd03e99000b7e0d421c52e82192b6ec03"}
Oct 12 06:45:01 crc kubenswrapper[4930]: I1012 06:45:01.994055 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337525-scrpn" event={"ID":"7ee28797-be75-4bad-9eca-3cd0444df482","Type":"ContainerStarted","Data":"2b10adec8d56fffca3db0c44d6fa9fef1d8e3f2a130c87f3fa691aafbb01045a"}
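[Annotation] Every record above follows one fixed shape: a journald prefix (timestamp, host, unit[pid]) wrapping a klog header (severity letter, MMDD, wall time, thread id, source file:line) and a message. A minimal parsing sketch for lines in this shape; the regex and field names are my own, not any official format specification:

import re

# Matches one journald line wrapping a kubelet klog record, e.g.
#   Oct 12 06:45:00 crc kubenswrapper[4930]: I1012 06:45:00.157883 4930 util.go:30] "No sandbox ..."
KLOG = re.compile(
    r'^(?P<stamp>\w{3} +\d+ [\d:]{8}) (?P<host>\S+) kubenswrapper\[(?P<pid>\d+)\]: '
    r'(?P<sev>[IWEF])(?P<mmdd>\d{4}) (?P<time>[\d:.]+) +\d+ (?P<src>[\w.]+:\d+)\] (?P<msg>.*)$'
)

def parse(line: str):
    """Return a dict for a klog record, or None for continuation lines (e.g. multi-line probe output)."""
    m = KLOG.match(line)
    if m is None:
        return None
    rec = m.groupdict()
    rec["sev"] = {"I": "info", "W": "warning", "E": "error", "F": "fatal"}[rec["sev"]]
    return rec

if __name__ == "__main__":
    import sys
    for line in sys.stdin:
        rec = parse(line.rstrip("\n"))
        if rec and rec["sev"] == "error":
            print(rec["time"], rec["src"], rec["msg"][:100])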
Oct 12 06:45:03 crc kubenswrapper[4930]: I1012 06:45:03.373415 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337525-scrpn"
Oct 12 06:45:03 crc kubenswrapper[4930]: I1012 06:45:03.561375 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps962\" (UniqueName: \"kubernetes.io/projected/7ee28797-be75-4bad-9eca-3cd0444df482-kube-api-access-ps962\") pod \"7ee28797-be75-4bad-9eca-3cd0444df482\" (UID: \"7ee28797-be75-4bad-9eca-3cd0444df482\") "
Oct 12 06:45:03 crc kubenswrapper[4930]: I1012 06:45:03.561707 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ee28797-be75-4bad-9eca-3cd0444df482-config-volume\") pod \"7ee28797-be75-4bad-9eca-3cd0444df482\" (UID: \"7ee28797-be75-4bad-9eca-3cd0444df482\") "
Oct 12 06:45:03 crc kubenswrapper[4930]: I1012 06:45:03.561771 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7ee28797-be75-4bad-9eca-3cd0444df482-secret-volume\") pod \"7ee28797-be75-4bad-9eca-3cd0444df482\" (UID: \"7ee28797-be75-4bad-9eca-3cd0444df482\") "
Oct 12 06:45:03 crc kubenswrapper[4930]: I1012 06:45:03.562359 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ee28797-be75-4bad-9eca-3cd0444df482-config-volume" (OuterVolumeSpecName: "config-volume") pod "7ee28797-be75-4bad-9eca-3cd0444df482" (UID: "7ee28797-be75-4bad-9eca-3cd0444df482"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 12 06:45:03 crc kubenswrapper[4930]: I1012 06:45:03.562725 4930 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ee28797-be75-4bad-9eca-3cd0444df482-config-volume\") on node \"crc\" DevicePath \"\""
Oct 12 06:45:03 crc kubenswrapper[4930]: I1012 06:45:03.567430 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ee28797-be75-4bad-9eca-3cd0444df482-kube-api-access-ps962" (OuterVolumeSpecName: "kube-api-access-ps962") pod "7ee28797-be75-4bad-9eca-3cd0444df482" (UID: "7ee28797-be75-4bad-9eca-3cd0444df482"). InnerVolumeSpecName "kube-api-access-ps962". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 12 06:45:03 crc kubenswrapper[4930]: I1012 06:45:03.567466 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ee28797-be75-4bad-9eca-3cd0444df482-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7ee28797-be75-4bad-9eca-3cd0444df482" (UID: "7ee28797-be75-4bad-9eca-3cd0444df482"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 12 06:45:03 crc kubenswrapper[4930]: I1012 06:45:03.665107 4930 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7ee28797-be75-4bad-9eca-3cd0444df482-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 12 06:45:03 crc kubenswrapper[4930]: I1012 06:45:03.665149 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ps962\" (UniqueName: \"kubernetes.io/projected/7ee28797-be75-4bad-9eca-3cd0444df482-kube-api-access-ps962\") on node \"crc\" DevicePath \"\""
Oct 12 06:45:04 crc kubenswrapper[4930]: I1012 06:45:04.022588 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337525-scrpn" event={"ID":"7ee28797-be75-4bad-9eca-3cd0444df482","Type":"ContainerDied","Data":"2b10adec8d56fffca3db0c44d6fa9fef1d8e3f2a130c87f3fa691aafbb01045a"}
Oct 12 06:45:04 crc kubenswrapper[4930]: I1012 06:45:04.022989 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b10adec8d56fffca3db0c44d6fa9fef1d8e3f2a130c87f3fa691aafbb01045a"
Oct 12 06:45:04 crc kubenswrapper[4930]: I1012 06:45:04.022645 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337525-scrpn"
Oct 12 06:45:04 crc kubenswrapper[4930]: I1012 06:45:04.475873 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337480-zt7xb"]
Oct 12 06:45:04 crc kubenswrapper[4930]: I1012 06:45:04.486203 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337480-zt7xb"]
Oct 12 06:45:06 crc kubenswrapper[4930]: I1012 06:45:06.162034 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5518b25e-e018-492d-926b-37943813b054" path="/var/lib/kubelet/pods/5518b25e-e018-492d-926b-37943813b054/volumes"
Oct 12 06:45:07 crc kubenswrapper[4930]: I1012 06:45:07.135959 4930 scope.go:117] "RemoveContainer" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed"
Oct 12 06:45:07 crc kubenswrapper[4930]: E1012 06:45:07.136543 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:45:22 crc kubenswrapper[4930]: I1012 06:45:22.135481 4930 scope.go:117] "RemoveContainer" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed"
Oct 12 06:45:22 crc kubenswrapper[4930]: E1012 06:45:22.136623 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:45:28 crc kubenswrapper[4930]: I1012 06:45:28.100472 4930 scope.go:117] "RemoveContainer" containerID="2ff1d3b5d7d499248e998ca9cbe4ffc2ee3adc45747c49d8875a4ddf19d940bc"
Oct 12 06:45:34 crc kubenswrapper[4930]: I1012 06:45:34.135640 4930 scope.go:117] "RemoveContainer" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed"
Oct 12 06:45:34 crc kubenswrapper[4930]: E1012 06:45:34.136588 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:45:46 crc kubenswrapper[4930]: I1012 06:45:46.135929 4930 scope.go:117] "RemoveContainer" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed"
Oct 12 06:45:46 crc kubenswrapper[4930]: E1012 06:45:46.136588 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:45:49 crc kubenswrapper[4930]: E1012 06:45:49.551389 4930 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.111:57978->38.102.83.111:46517: write tcp 38.102.83.111:57978->38.102.83.111:46517: write: broken pipe
Oct 12 06:45:58 crc kubenswrapper[4930]: I1012 06:45:58.146939 4930 scope.go:117] "RemoveContainer" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed"
Oct 12 06:45:58 crc kubenswrapper[4930]: E1012 06:45:58.147806 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:46:12 crc kubenswrapper[4930]: I1012 06:46:12.136468 4930 scope.go:117] "RemoveContainer" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed"
Oct 12 06:46:12 crc kubenswrapper[4930]: E1012 06:46:12.137691 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:46:24 crc kubenswrapper[4930]: I1012 06:46:24.135786 4930 scope.go:117] "RemoveContainer" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed"
pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:46:37 crc kubenswrapper[4930]: I1012 06:46:37.135877 4930 scope.go:117] "RemoveContainer" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed" Oct 12 06:46:37 crc kubenswrapper[4930]: E1012 06:46:37.136965 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:46:50 crc kubenswrapper[4930]: I1012 06:46:50.136665 4930 scope.go:117] "RemoveContainer" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed" Oct 12 06:46:50 crc kubenswrapper[4930]: E1012 06:46:50.138057 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:47:01 crc kubenswrapper[4930]: I1012 06:47:01.136241 4930 scope.go:117] "RemoveContainer" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed" Oct 12 06:47:01 crc kubenswrapper[4930]: E1012 06:47:01.137231 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:47:13 crc kubenswrapper[4930]: I1012 06:47:13.136282 4930 scope.go:117] "RemoveContainer" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed" Oct 12 06:47:13 crc kubenswrapper[4930]: E1012 06:47:13.137730 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:47:26 crc kubenswrapper[4930]: I1012 06:47:26.135936 4930 scope.go:117] "RemoveContainer" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed" Oct 12 06:47:26 crc kubenswrapper[4930]: E1012 06:47:26.136999 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:47:41 crc kubenswrapper[4930]: I1012 06:47:41.135812 4930 
scope.go:117] "RemoveContainer" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed" Oct 12 06:47:41 crc kubenswrapper[4930]: E1012 06:47:41.136653 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:47:54 crc kubenswrapper[4930]: I1012 06:47:54.135773 4930 scope.go:117] "RemoveContainer" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed" Oct 12 06:47:54 crc kubenswrapper[4930]: E1012 06:47:54.136913 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:48:08 crc kubenswrapper[4930]: I1012 06:48:08.147786 4930 scope.go:117] "RemoveContainer" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed" Oct 12 06:48:08 crc kubenswrapper[4930]: E1012 06:48:08.149394 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:48:23 crc kubenswrapper[4930]: I1012 06:48:23.135545 4930 scope.go:117] "RemoveContainer" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed" Oct 12 06:48:23 crc kubenswrapper[4930]: E1012 06:48:23.137095 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:48:35 crc kubenswrapper[4930]: I1012 06:48:35.135511 4930 scope.go:117] "RemoveContainer" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed" Oct 12 06:48:35 crc kubenswrapper[4930]: I1012 06:48:35.752033 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerStarted","Data":"f7ce11f1c8d9ed9dc321ae1afbc9191e4181abb37c1c7af64b20a7b2dd6a019a"} Oct 12 06:49:50 crc kubenswrapper[4930]: I1012 06:49:50.160156 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hwj69"] Oct 12 06:49:50 crc kubenswrapper[4930]: E1012 06:49:50.161571 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ee28797-be75-4bad-9eca-3cd0444df482" containerName="collect-profiles" Oct 12 06:49:50 crc kubenswrapper[4930]: 
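[Annotation] From 06:45:07 to 06:48:23 the same pair of records repeats: the sync loop asks to restart machine-config-daemon and the pod worker refuses because the crash-loop back-off window is still open. The restart finally goes through at 06:48:35. A sketch of the delay schedule that produces the "back-off 5m0s" in the error text, using the upstream kubelet defaults (10s initial, doubling, 5m cap) rather than anything read from this node's configuration:

# Back-of-the-envelope model of kubelet's CrashLoopBackOff delay.
# The constants are the upstream defaults, an assumption here.
INITIAL, CAP = 10.0, 300.0

def crashloop_delay(restarts: int) -> float:
    """Delay (seconds) before restart attempt number `restarts` (0-based)."""
    return min(INITIAL * (2 ** restarts), CAP)

# After five failures the delay saturates at the 5m0s quoted in the
# "back-off 5m0s restarting failed container" records above; the
# intermediate "Error syncing pod, skipping" lines are sync-loop retries
# made while that window is still open.
for n in range(7):
    print(n, crashloop_delay(n))   # 10, 20, 40, 80, 160, 300, 300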
Oct 12 06:49:50 crc kubenswrapper[4930]: I1012 06:49:50.160156 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hwj69"]
Oct 12 06:49:50 crc kubenswrapper[4930]: E1012 06:49:50.161571 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ee28797-be75-4bad-9eca-3cd0444df482" containerName="collect-profiles"
Oct 12 06:49:50 crc kubenswrapper[4930]: I1012 06:49:50.161589 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ee28797-be75-4bad-9eca-3cd0444df482" containerName="collect-profiles"
Oct 12 06:49:50 crc kubenswrapper[4930]: I1012 06:49:50.161878 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ee28797-be75-4bad-9eca-3cd0444df482" containerName="collect-profiles"
Oct 12 06:49:50 crc kubenswrapper[4930]: I1012 06:49:50.163640 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hwj69"
Oct 12 06:49:50 crc kubenswrapper[4930]: I1012 06:49:50.178824 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hwj69"]
Oct 12 06:49:50 crc kubenswrapper[4930]: I1012 06:49:50.320795 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj2pb\" (UniqueName: \"kubernetes.io/projected/2da69f1e-f4a8-4e1e-b97c-2bb20e302725-kube-api-access-rj2pb\") pod \"redhat-marketplace-hwj69\" (UID: \"2da69f1e-f4a8-4e1e-b97c-2bb20e302725\") " pod="openshift-marketplace/redhat-marketplace-hwj69"
Oct 12 06:49:50 crc kubenswrapper[4930]: I1012 06:49:50.320953 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2da69f1e-f4a8-4e1e-b97c-2bb20e302725-utilities\") pod \"redhat-marketplace-hwj69\" (UID: \"2da69f1e-f4a8-4e1e-b97c-2bb20e302725\") " pod="openshift-marketplace/redhat-marketplace-hwj69"
Oct 12 06:49:50 crc kubenswrapper[4930]: I1012 06:49:50.321020 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2da69f1e-f4a8-4e1e-b97c-2bb20e302725-catalog-content\") pod \"redhat-marketplace-hwj69\" (UID: \"2da69f1e-f4a8-4e1e-b97c-2bb20e302725\") " pod="openshift-marketplace/redhat-marketplace-hwj69"
Oct 12 06:49:50 crc kubenswrapper[4930]: I1012 06:49:50.422456 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj2pb\" (UniqueName: \"kubernetes.io/projected/2da69f1e-f4a8-4e1e-b97c-2bb20e302725-kube-api-access-rj2pb\") pod \"redhat-marketplace-hwj69\" (UID: \"2da69f1e-f4a8-4e1e-b97c-2bb20e302725\") " pod="openshift-marketplace/redhat-marketplace-hwj69"
Oct 12 06:49:50 crc kubenswrapper[4930]: I1012 06:49:50.422592 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2da69f1e-f4a8-4e1e-b97c-2bb20e302725-utilities\") pod \"redhat-marketplace-hwj69\" (UID: \"2da69f1e-f4a8-4e1e-b97c-2bb20e302725\") " pod="openshift-marketplace/redhat-marketplace-hwj69"
Oct 12 06:49:50 crc kubenswrapper[4930]: I1012 06:49:50.422659 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2da69f1e-f4a8-4e1e-b97c-2bb20e302725-catalog-content\") pod \"redhat-marketplace-hwj69\" (UID: \"2da69f1e-f4a8-4e1e-b97c-2bb20e302725\") " pod="openshift-marketplace/redhat-marketplace-hwj69"
Oct 12 06:49:50 crc kubenswrapper[4930]: I1012 06:49:50.423170 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2da69f1e-f4a8-4e1e-b97c-2bb20e302725-catalog-content\") pod \"redhat-marketplace-hwj69\" (UID: \"2da69f1e-f4a8-4e1e-b97c-2bb20e302725\") " pod="openshift-marketplace/redhat-marketplace-hwj69"
Oct 12 06:49:50 crc kubenswrapper[4930]: I1012 06:49:50.423480 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2da69f1e-f4a8-4e1e-b97c-2bb20e302725-utilities\") pod \"redhat-marketplace-hwj69\" (UID: \"2da69f1e-f4a8-4e1e-b97c-2bb20e302725\") " pod="openshift-marketplace/redhat-marketplace-hwj69"
Oct 12 06:49:50 crc kubenswrapper[4930]: I1012 06:49:50.453195 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj2pb\" (UniqueName: \"kubernetes.io/projected/2da69f1e-f4a8-4e1e-b97c-2bb20e302725-kube-api-access-rj2pb\") pod \"redhat-marketplace-hwj69\" (UID: \"2da69f1e-f4a8-4e1e-b97c-2bb20e302725\") " pod="openshift-marketplace/redhat-marketplace-hwj69"
Oct 12 06:49:50 crc kubenswrapper[4930]: I1012 06:49:50.496835 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hwj69"
Oct 12 06:49:51 crc kubenswrapper[4930]: I1012 06:49:51.017714 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hwj69"]
Oct 12 06:49:51 crc kubenswrapper[4930]: I1012 06:49:51.763972 4930 generic.go:334] "Generic (PLEG): container finished" podID="2da69f1e-f4a8-4e1e-b97c-2bb20e302725" containerID="2e60da73f2dde06491b58691ad8200f56960307ad12140d7a7c982e0bd1700a7" exitCode=0
Oct 12 06:49:51 crc kubenswrapper[4930]: I1012 06:49:51.764041 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hwj69" event={"ID":"2da69f1e-f4a8-4e1e-b97c-2bb20e302725","Type":"ContainerDied","Data":"2e60da73f2dde06491b58691ad8200f56960307ad12140d7a7c982e0bd1700a7"}
Oct 12 06:49:51 crc kubenswrapper[4930]: I1012 06:49:51.764356 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hwj69" event={"ID":"2da69f1e-f4a8-4e1e-b97c-2bb20e302725","Type":"ContainerStarted","Data":"eb307b362d4ec503a9000a655709eb9128da7b1a6d0a634e316d472539d7e9be"}
Oct 12 06:49:51 crc kubenswrapper[4930]: I1012 06:49:51.767254 4930 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 12 06:49:52 crc kubenswrapper[4930]: I1012 06:49:52.783070 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hwj69" event={"ID":"2da69f1e-f4a8-4e1e-b97c-2bb20e302725","Type":"ContainerStarted","Data":"591077c9af9e2fa31e92639d573a9316ba878a1021330862ca50c865e2cd5526"}
Oct 12 06:49:53 crc kubenswrapper[4930]: I1012 06:49:53.798816 4930 generic.go:334] "Generic (PLEG): container finished" podID="2da69f1e-f4a8-4e1e-b97c-2bb20e302725" containerID="591077c9af9e2fa31e92639d573a9316ba878a1021330862ca50c865e2cd5526" exitCode=0
Oct 12 06:49:53 crc kubenswrapper[4930]: I1012 06:49:53.798914 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hwj69" event={"ID":"2da69f1e-f4a8-4e1e-b97c-2bb20e302725","Type":"ContainerDied","Data":"591077c9af9e2fa31e92639d573a9316ba878a1021330862ca50c865e2cd5526"}
Oct 12 06:49:53 crc kubenswrapper[4930]: I1012 06:49:53.799149 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hwj69" event={"ID":"2da69f1e-f4a8-4e1e-b97c-2bb20e302725","Type":"ContainerStarted","Data":"325d15625ff71de9683ed5ccd2ffdf8b89875fc42179791958f9957bc7f01979"}
Oct 12 06:49:53 crc kubenswrapper[4930]: I1012 06:49:53.835078 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hwj69" podStartSLOduration=2.196593812 podStartE2EDuration="3.835048677s" podCreationTimestamp="2025-10-12 06:49:50 +0000 UTC" firstStartedPulling="2025-10-12 06:49:51.766938135 +0000 UTC m=+4124.309039910" lastFinishedPulling="2025-10-12 06:49:53.40539298 +0000 UTC m=+4125.947494775" observedRunningTime="2025-10-12 06:49:53.820812945 +0000 UTC m=+4126.362914740" watchObservedRunningTime="2025-10-12 06:49:53.835048677 +0000 UTC m=+4126.377150472"
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:50:03 crc kubenswrapper[4930]: I1012 06:50:03.567165 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2da69f1e-f4a8-4e1e-b97c-2bb20e302725-kube-api-access-rj2pb" (OuterVolumeSpecName: "kube-api-access-rj2pb") pod "2da69f1e-f4a8-4e1e-b97c-2bb20e302725" (UID: "2da69f1e-f4a8-4e1e-b97c-2bb20e302725"). InnerVolumeSpecName "kube-api-access-rj2pb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:50:03 crc kubenswrapper[4930]: I1012 06:50:03.576265 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2da69f1e-f4a8-4e1e-b97c-2bb20e302725-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2da69f1e-f4a8-4e1e-b97c-2bb20e302725" (UID: "2da69f1e-f4a8-4e1e-b97c-2bb20e302725"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:50:03 crc kubenswrapper[4930]: I1012 06:50:03.661352 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rj2pb\" (UniqueName: \"kubernetes.io/projected/2da69f1e-f4a8-4e1e-b97c-2bb20e302725-kube-api-access-rj2pb\") on node \"crc\" DevicePath \"\"" Oct 12 06:50:03 crc kubenswrapper[4930]: I1012 06:50:03.661383 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2da69f1e-f4a8-4e1e-b97c-2bb20e302725-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 06:50:03 crc kubenswrapper[4930]: I1012 06:50:03.661393 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2da69f1e-f4a8-4e1e-b97c-2bb20e302725-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 06:50:03 crc kubenswrapper[4930]: I1012 06:50:03.931625 4930 generic.go:334] "Generic (PLEG): container finished" podID="2da69f1e-f4a8-4e1e-b97c-2bb20e302725" containerID="325d15625ff71de9683ed5ccd2ffdf8b89875fc42179791958f9957bc7f01979" exitCode=0 Oct 12 06:50:03 crc kubenswrapper[4930]: I1012 06:50:03.931710 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hwj69" event={"ID":"2da69f1e-f4a8-4e1e-b97c-2bb20e302725","Type":"ContainerDied","Data":"325d15625ff71de9683ed5ccd2ffdf8b89875fc42179791958f9957bc7f01979"} Oct 12 06:50:03 crc kubenswrapper[4930]: I1012 06:50:03.931801 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hwj69" event={"ID":"2da69f1e-f4a8-4e1e-b97c-2bb20e302725","Type":"ContainerDied","Data":"eb307b362d4ec503a9000a655709eb9128da7b1a6d0a634e316d472539d7e9be"} Oct 12 06:50:03 crc kubenswrapper[4930]: I1012 06:50:03.931845 4930 scope.go:117] "RemoveContainer" containerID="325d15625ff71de9683ed5ccd2ffdf8b89875fc42179791958f9957bc7f01979" Oct 12 06:50:03 crc kubenswrapper[4930]: I1012 06:50:03.932091 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hwj69" Oct 12 06:50:03 crc kubenswrapper[4930]: I1012 06:50:03.972867 4930 scope.go:117] "RemoveContainer" containerID="591077c9af9e2fa31e92639d573a9316ba878a1021330862ca50c865e2cd5526" Oct 12 06:50:03 crc kubenswrapper[4930]: I1012 06:50:03.973034 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hwj69"] Oct 12 06:50:03 crc kubenswrapper[4930]: I1012 06:50:03.984911 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hwj69"] Oct 12 06:50:04 crc kubenswrapper[4930]: I1012 06:50:04.000822 4930 scope.go:117] "RemoveContainer" containerID="2e60da73f2dde06491b58691ad8200f56960307ad12140d7a7c982e0bd1700a7" Oct 12 06:50:04 crc kubenswrapper[4930]: I1012 06:50:04.060122 4930 scope.go:117] "RemoveContainer" containerID="325d15625ff71de9683ed5ccd2ffdf8b89875fc42179791958f9957bc7f01979" Oct 12 06:50:04 crc kubenswrapper[4930]: E1012 06:50:04.060704 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"325d15625ff71de9683ed5ccd2ffdf8b89875fc42179791958f9957bc7f01979\": container with ID starting with 325d15625ff71de9683ed5ccd2ffdf8b89875fc42179791958f9957bc7f01979 not found: ID does not exist" containerID="325d15625ff71de9683ed5ccd2ffdf8b89875fc42179791958f9957bc7f01979" Oct 12 06:50:04 crc kubenswrapper[4930]: I1012 06:50:04.060790 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"325d15625ff71de9683ed5ccd2ffdf8b89875fc42179791958f9957bc7f01979"} err="failed to get container status \"325d15625ff71de9683ed5ccd2ffdf8b89875fc42179791958f9957bc7f01979\": rpc error: code = NotFound desc = could not find container \"325d15625ff71de9683ed5ccd2ffdf8b89875fc42179791958f9957bc7f01979\": container with ID starting with 325d15625ff71de9683ed5ccd2ffdf8b89875fc42179791958f9957bc7f01979 not found: ID does not exist" Oct 12 06:50:04 crc kubenswrapper[4930]: I1012 06:50:04.060823 4930 scope.go:117] "RemoveContainer" containerID="591077c9af9e2fa31e92639d573a9316ba878a1021330862ca50c865e2cd5526" Oct 12 06:50:04 crc kubenswrapper[4930]: E1012 06:50:04.061494 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"591077c9af9e2fa31e92639d573a9316ba878a1021330862ca50c865e2cd5526\": container with ID starting with 591077c9af9e2fa31e92639d573a9316ba878a1021330862ca50c865e2cd5526 not found: ID does not exist" containerID="591077c9af9e2fa31e92639d573a9316ba878a1021330862ca50c865e2cd5526" Oct 12 06:50:04 crc kubenswrapper[4930]: I1012 06:50:04.061557 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"591077c9af9e2fa31e92639d573a9316ba878a1021330862ca50c865e2cd5526"} err="failed to get container status \"591077c9af9e2fa31e92639d573a9316ba878a1021330862ca50c865e2cd5526\": rpc error: code = NotFound desc = could not find container \"591077c9af9e2fa31e92639d573a9316ba878a1021330862ca50c865e2cd5526\": container with ID starting with 591077c9af9e2fa31e92639d573a9316ba878a1021330862ca50c865e2cd5526 not found: ID does not exist" Oct 12 06:50:04 crc kubenswrapper[4930]: I1012 06:50:04.061597 4930 scope.go:117] "RemoveContainer" containerID="2e60da73f2dde06491b58691ad8200f56960307ad12140d7a7c982e0bd1700a7" Oct 12 06:50:04 crc kubenswrapper[4930]: E1012 06:50:04.062008 4930 log.go:32] "ContainerStatus from runtime service 
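[Annotation] The DeleteContainer calls above race against the runtime's own cleanup: by the time the kubelet re-issues RemoveContainer, CRI-O has already dropped the container, so ContainerStatus comes back NotFound and the kubelet logs the error and moves on. A sketch of that idempotent-cleanup pattern in general form (the runtime client and exception type are stand-ins, not the kubelet's real types):

class NotFoundError(Exception):
    """Stand-in for the runtime's NotFound RPC error."""

def remove_container(runtime, container_id: str) -> None:
    """Delete a container, treating 'already gone' as success."""
    try:
        runtime.remove(container_id)
    except NotFoundError:
        # The desired end state (container absent) already holds,
        # so swallow the error instead of failing the cleanup.
        pass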
failed" err="rpc error: code = NotFound desc = could not find container \"2e60da73f2dde06491b58691ad8200f56960307ad12140d7a7c982e0bd1700a7\": container with ID starting with 2e60da73f2dde06491b58691ad8200f56960307ad12140d7a7c982e0bd1700a7 not found: ID does not exist" containerID="2e60da73f2dde06491b58691ad8200f56960307ad12140d7a7c982e0bd1700a7" Oct 12 06:50:04 crc kubenswrapper[4930]: I1012 06:50:04.062048 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e60da73f2dde06491b58691ad8200f56960307ad12140d7a7c982e0bd1700a7"} err="failed to get container status \"2e60da73f2dde06491b58691ad8200f56960307ad12140d7a7c982e0bd1700a7\": rpc error: code = NotFound desc = could not find container \"2e60da73f2dde06491b58691ad8200f56960307ad12140d7a7c982e0bd1700a7\": container with ID starting with 2e60da73f2dde06491b58691ad8200f56960307ad12140d7a7c982e0bd1700a7 not found: ID does not exist" Oct 12 06:50:04 crc kubenswrapper[4930]: I1012 06:50:04.159660 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2da69f1e-f4a8-4e1e-b97c-2bb20e302725" path="/var/lib/kubelet/pods/2da69f1e-f4a8-4e1e-b97c-2bb20e302725/volumes" Oct 12 06:51:03 crc kubenswrapper[4930]: I1012 06:51:03.669800 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:51:03 crc kubenswrapper[4930]: I1012 06:51:03.670435 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:51:33 crc kubenswrapper[4930]: I1012 06:51:33.669825 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:51:33 crc kubenswrapper[4930]: I1012 06:51:33.670275 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:51:40 crc kubenswrapper[4930]: I1012 06:51:40.519058 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5k9hr"] Oct 12 06:51:40 crc kubenswrapper[4930]: E1012 06:51:40.519808 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da69f1e-f4a8-4e1e-b97c-2bb20e302725" containerName="registry-server" Oct 12 06:51:40 crc kubenswrapper[4930]: I1012 06:51:40.519820 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da69f1e-f4a8-4e1e-b97c-2bb20e302725" containerName="registry-server" Oct 12 06:51:40 crc kubenswrapper[4930]: E1012 06:51:40.519831 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da69f1e-f4a8-4e1e-b97c-2bb20e302725" containerName="extract-content" Oct 12 06:51:40 crc kubenswrapper[4930]: I1012 06:51:40.519837 4930 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="2da69f1e-f4a8-4e1e-b97c-2bb20e302725" containerName="extract-content" Oct 12 06:51:40 crc kubenswrapper[4930]: E1012 06:51:40.519869 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da69f1e-f4a8-4e1e-b97c-2bb20e302725" containerName="extract-utilities" Oct 12 06:51:40 crc kubenswrapper[4930]: I1012 06:51:40.519876 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da69f1e-f4a8-4e1e-b97c-2bb20e302725" containerName="extract-utilities" Oct 12 06:51:40 crc kubenswrapper[4930]: I1012 06:51:40.520068 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="2da69f1e-f4a8-4e1e-b97c-2bb20e302725" containerName="registry-server" Oct 12 06:51:40 crc kubenswrapper[4930]: I1012 06:51:40.521398 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5k9hr" Oct 12 06:51:40 crc kubenswrapper[4930]: I1012 06:51:40.537847 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5k9hr"] Oct 12 06:51:40 crc kubenswrapper[4930]: I1012 06:51:40.630377 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t996s\" (UniqueName: \"kubernetes.io/projected/ae38942d-f53e-48f3-a81b-ed43467ef540-kube-api-access-t996s\") pod \"redhat-operators-5k9hr\" (UID: \"ae38942d-f53e-48f3-a81b-ed43467ef540\") " pod="openshift-marketplace/redhat-operators-5k9hr" Oct 12 06:51:40 crc kubenswrapper[4930]: I1012 06:51:40.630433 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae38942d-f53e-48f3-a81b-ed43467ef540-catalog-content\") pod \"redhat-operators-5k9hr\" (UID: \"ae38942d-f53e-48f3-a81b-ed43467ef540\") " pod="openshift-marketplace/redhat-operators-5k9hr" Oct 12 06:51:40 crc kubenswrapper[4930]: I1012 06:51:40.630726 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae38942d-f53e-48f3-a81b-ed43467ef540-utilities\") pod \"redhat-operators-5k9hr\" (UID: \"ae38942d-f53e-48f3-a81b-ed43467ef540\") " pod="openshift-marketplace/redhat-operators-5k9hr" Oct 12 06:51:40 crc kubenswrapper[4930]: I1012 06:51:40.733090 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae38942d-f53e-48f3-a81b-ed43467ef540-catalog-content\") pod \"redhat-operators-5k9hr\" (UID: \"ae38942d-f53e-48f3-a81b-ed43467ef540\") " pod="openshift-marketplace/redhat-operators-5k9hr" Oct 12 06:51:40 crc kubenswrapper[4930]: I1012 06:51:40.733216 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae38942d-f53e-48f3-a81b-ed43467ef540-utilities\") pod \"redhat-operators-5k9hr\" (UID: \"ae38942d-f53e-48f3-a81b-ed43467ef540\") " pod="openshift-marketplace/redhat-operators-5k9hr" Oct 12 06:51:40 crc kubenswrapper[4930]: I1012 06:51:40.733301 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t996s\" (UniqueName: \"kubernetes.io/projected/ae38942d-f53e-48f3-a81b-ed43467ef540-kube-api-access-t996s\") pod \"redhat-operators-5k9hr\" (UID: \"ae38942d-f53e-48f3-a81b-ed43467ef540\") " pod="openshift-marketplace/redhat-operators-5k9hr" Oct 12 06:51:40 crc kubenswrapper[4930]: I1012 06:51:40.733851 4930 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae38942d-f53e-48f3-a81b-ed43467ef540-catalog-content\") pod \"redhat-operators-5k9hr\" (UID: \"ae38942d-f53e-48f3-a81b-ed43467ef540\") " pod="openshift-marketplace/redhat-operators-5k9hr" Oct 12 06:51:40 crc kubenswrapper[4930]: I1012 06:51:40.733857 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae38942d-f53e-48f3-a81b-ed43467ef540-utilities\") pod \"redhat-operators-5k9hr\" (UID: \"ae38942d-f53e-48f3-a81b-ed43467ef540\") " pod="openshift-marketplace/redhat-operators-5k9hr" Oct 12 06:51:40 crc kubenswrapper[4930]: I1012 06:51:40.765462 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t996s\" (UniqueName: \"kubernetes.io/projected/ae38942d-f53e-48f3-a81b-ed43467ef540-kube-api-access-t996s\") pod \"redhat-operators-5k9hr\" (UID: \"ae38942d-f53e-48f3-a81b-ed43467ef540\") " pod="openshift-marketplace/redhat-operators-5k9hr" Oct 12 06:51:40 crc kubenswrapper[4930]: I1012 06:51:40.839545 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5k9hr" Oct 12 06:51:41 crc kubenswrapper[4930]: I1012 06:51:41.364078 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5k9hr"] Oct 12 06:51:42 crc kubenswrapper[4930]: I1012 06:51:42.149255 4930 generic.go:334] "Generic (PLEG): container finished" podID="ae38942d-f53e-48f3-a81b-ed43467ef540" containerID="995f14289f3880f02e8c58e49759a69ef499fc4c5b771602ae28ce19e2416e96" exitCode=0 Oct 12 06:51:42 crc kubenswrapper[4930]: I1012 06:51:42.151041 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5k9hr" event={"ID":"ae38942d-f53e-48f3-a81b-ed43467ef540","Type":"ContainerDied","Data":"995f14289f3880f02e8c58e49759a69ef499fc4c5b771602ae28ce19e2416e96"} Oct 12 06:51:42 crc kubenswrapper[4930]: I1012 06:51:42.151126 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5k9hr" event={"ID":"ae38942d-f53e-48f3-a81b-ed43467ef540","Type":"ContainerStarted","Data":"859d549571b849b6be9779e35877c30e2f766ae3ae15a3785f4e3a1dca10f742"} Oct 12 06:51:43 crc kubenswrapper[4930]: I1012 06:51:43.107844 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ffcsh"] Oct 12 06:51:43 crc kubenswrapper[4930]: I1012 06:51:43.111992 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ffcsh" Oct 12 06:51:43 crc kubenswrapper[4930]: I1012 06:51:43.136667 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ffcsh"] Oct 12 06:51:43 crc kubenswrapper[4930]: I1012 06:51:43.182199 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6479a0ef-f6ea-476e-80b7-ded4c0a56451-catalog-content\") pod \"community-operators-ffcsh\" (UID: \"6479a0ef-f6ea-476e-80b7-ded4c0a56451\") " pod="openshift-marketplace/community-operators-ffcsh" Oct 12 06:51:43 crc kubenswrapper[4930]: I1012 06:51:43.182358 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6479a0ef-f6ea-476e-80b7-ded4c0a56451-utilities\") pod \"community-operators-ffcsh\" (UID: \"6479a0ef-f6ea-476e-80b7-ded4c0a56451\") " pod="openshift-marketplace/community-operators-ffcsh" Oct 12 06:51:43 crc kubenswrapper[4930]: I1012 06:51:43.182532 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zllgm\" (UniqueName: \"kubernetes.io/projected/6479a0ef-f6ea-476e-80b7-ded4c0a56451-kube-api-access-zllgm\") pod \"community-operators-ffcsh\" (UID: \"6479a0ef-f6ea-476e-80b7-ded4c0a56451\") " pod="openshift-marketplace/community-operators-ffcsh" Oct 12 06:51:43 crc kubenswrapper[4930]: I1012 06:51:43.284337 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6479a0ef-f6ea-476e-80b7-ded4c0a56451-catalog-content\") pod \"community-operators-ffcsh\" (UID: \"6479a0ef-f6ea-476e-80b7-ded4c0a56451\") " pod="openshift-marketplace/community-operators-ffcsh" Oct 12 06:51:43 crc kubenswrapper[4930]: I1012 06:51:43.284420 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6479a0ef-f6ea-476e-80b7-ded4c0a56451-utilities\") pod \"community-operators-ffcsh\" (UID: \"6479a0ef-f6ea-476e-80b7-ded4c0a56451\") " pod="openshift-marketplace/community-operators-ffcsh" Oct 12 06:51:43 crc kubenswrapper[4930]: I1012 06:51:43.284486 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zllgm\" (UniqueName: \"kubernetes.io/projected/6479a0ef-f6ea-476e-80b7-ded4c0a56451-kube-api-access-zllgm\") pod \"community-operators-ffcsh\" (UID: \"6479a0ef-f6ea-476e-80b7-ded4c0a56451\") " pod="openshift-marketplace/community-operators-ffcsh" Oct 12 06:51:43 crc kubenswrapper[4930]: I1012 06:51:43.285014 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6479a0ef-f6ea-476e-80b7-ded4c0a56451-catalog-content\") pod \"community-operators-ffcsh\" (UID: \"6479a0ef-f6ea-476e-80b7-ded4c0a56451\") " pod="openshift-marketplace/community-operators-ffcsh" Oct 12 06:51:43 crc kubenswrapper[4930]: I1012 06:51:43.285037 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6479a0ef-f6ea-476e-80b7-ded4c0a56451-utilities\") pod \"community-operators-ffcsh\" (UID: \"6479a0ef-f6ea-476e-80b7-ded4c0a56451\") " pod="openshift-marketplace/community-operators-ffcsh" Oct 12 06:51:43 crc kubenswrapper[4930]: I1012 06:51:43.318388 4930 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zllgm\" (UniqueName: \"kubernetes.io/projected/6479a0ef-f6ea-476e-80b7-ded4c0a56451-kube-api-access-zllgm\") pod \"community-operators-ffcsh\" (UID: \"6479a0ef-f6ea-476e-80b7-ded4c0a56451\") " pod="openshift-marketplace/community-operators-ffcsh" Oct 12 06:51:43 crc kubenswrapper[4930]: I1012 06:51:43.435678 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ffcsh" Oct 12 06:51:43 crc kubenswrapper[4930]: I1012 06:51:43.989501 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ffcsh"] Oct 12 06:51:44 crc kubenswrapper[4930]: W1012 06:51:44.011921 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6479a0ef_f6ea_476e_80b7_ded4c0a56451.slice/crio-a17add444d9f207408feb9f08618a54826737d1c3ee0a3d33e4e83f65f381d8d WatchSource:0}: Error finding container a17add444d9f207408feb9f08618a54826737d1c3ee0a3d33e4e83f65f381d8d: Status 404 returned error can't find the container with id a17add444d9f207408feb9f08618a54826737d1c3ee0a3d33e4e83f65f381d8d Oct 12 06:51:44 crc kubenswrapper[4930]: I1012 06:51:44.198863 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffcsh" event={"ID":"6479a0ef-f6ea-476e-80b7-ded4c0a56451","Type":"ContainerStarted","Data":"a17add444d9f207408feb9f08618a54826737d1c3ee0a3d33e4e83f65f381d8d"} Oct 12 06:51:44 crc kubenswrapper[4930]: I1012 06:51:44.201491 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5k9hr" event={"ID":"ae38942d-f53e-48f3-a81b-ed43467ef540","Type":"ContainerStarted","Data":"76786a1927b5c16e9ab1742697dc7c12aae386f4b2f33c4ba8bc3ac5dd6137c0"} Oct 12 06:51:45 crc kubenswrapper[4930]: I1012 06:51:45.218831 4930 generic.go:334] "Generic (PLEG): container finished" podID="6479a0ef-f6ea-476e-80b7-ded4c0a56451" containerID="dc6f382b15af3fbf4e552dc0b3fdb692e3582b82c816f3740129e4c41ae5df89" exitCode=0 Oct 12 06:51:45 crc kubenswrapper[4930]: I1012 06:51:45.221464 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffcsh" event={"ID":"6479a0ef-f6ea-476e-80b7-ded4c0a56451","Type":"ContainerDied","Data":"dc6f382b15af3fbf4e552dc0b3fdb692e3582b82c816f3740129e4c41ae5df89"} Oct 12 06:51:46 crc kubenswrapper[4930]: I1012 06:51:46.236164 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffcsh" event={"ID":"6479a0ef-f6ea-476e-80b7-ded4c0a56451","Type":"ContainerStarted","Data":"c48812d604e73240278376e0faa61ad6cfb681342235e2fce858cf762e8e6e21"} Oct 12 06:51:48 crc kubenswrapper[4930]: I1012 06:51:48.264568 4930 generic.go:334] "Generic (PLEG): container finished" podID="ae38942d-f53e-48f3-a81b-ed43467ef540" containerID="76786a1927b5c16e9ab1742697dc7c12aae386f4b2f33c4ba8bc3ac5dd6137c0" exitCode=0 Oct 12 06:51:48 crc kubenswrapper[4930]: I1012 06:51:48.264812 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5k9hr" event={"ID":"ae38942d-f53e-48f3-a81b-ed43467ef540","Type":"ContainerDied","Data":"76786a1927b5c16e9ab1742697dc7c12aae386f4b2f33c4ba8bc3ac5dd6137c0"} Oct 12 06:51:48 crc kubenswrapper[4930]: I1012 06:51:48.269371 4930 generic.go:334] "Generic (PLEG): container finished" podID="6479a0ef-f6ea-476e-80b7-ded4c0a56451" 
containerID="c48812d604e73240278376e0faa61ad6cfb681342235e2fce858cf762e8e6e21" exitCode=0 Oct 12 06:51:48 crc kubenswrapper[4930]: I1012 06:51:48.269411 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffcsh" event={"ID":"6479a0ef-f6ea-476e-80b7-ded4c0a56451","Type":"ContainerDied","Data":"c48812d604e73240278376e0faa61ad6cfb681342235e2fce858cf762e8e6e21"} Oct 12 06:51:49 crc kubenswrapper[4930]: I1012 06:51:49.283630 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffcsh" event={"ID":"6479a0ef-f6ea-476e-80b7-ded4c0a56451","Type":"ContainerStarted","Data":"61f53b2259893b5e93b20055ae795f2b0aec16a445b4873e1848f8b3a1f49fe1"} Oct 12 06:51:49 crc kubenswrapper[4930]: I1012 06:51:49.287432 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5k9hr" event={"ID":"ae38942d-f53e-48f3-a81b-ed43467ef540","Type":"ContainerStarted","Data":"c2de068e3c232ce8cbbaed47ba84db976a7884ae99223d28732de9b3ab02e076"} Oct 12 06:51:49 crc kubenswrapper[4930]: I1012 06:51:49.316386 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ffcsh" podStartSLOduration=2.821833663 podStartE2EDuration="6.316365113s" podCreationTimestamp="2025-10-12 06:51:43 +0000 UTC" firstStartedPulling="2025-10-12 06:51:45.223861317 +0000 UTC m=+4237.765963092" lastFinishedPulling="2025-10-12 06:51:48.718392777 +0000 UTC m=+4241.260494542" observedRunningTime="2025-10-12 06:51:49.305061984 +0000 UTC m=+4241.847163759" watchObservedRunningTime="2025-10-12 06:51:49.316365113 +0000 UTC m=+4241.858466888" Oct 12 06:51:49 crc kubenswrapper[4930]: I1012 06:51:49.336080 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5k9hr" podStartSLOduration=2.751846475 podStartE2EDuration="9.33605197s" podCreationTimestamp="2025-10-12 06:51:40 +0000 UTC" firstStartedPulling="2025-10-12 06:51:42.151678254 +0000 UTC m=+4234.693780039" lastFinishedPulling="2025-10-12 06:51:48.735883739 +0000 UTC m=+4241.277985534" observedRunningTime="2025-10-12 06:51:49.324975416 +0000 UTC m=+4241.867077181" watchObservedRunningTime="2025-10-12 06:51:49.33605197 +0000 UTC m=+4241.878153735" Oct 12 06:51:50 crc kubenswrapper[4930]: I1012 06:51:50.839882 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5k9hr" Oct 12 06:51:50 crc kubenswrapper[4930]: I1012 06:51:50.840199 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5k9hr" Oct 12 06:51:51 crc kubenswrapper[4930]: I1012 06:51:51.903111 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5k9hr" podUID="ae38942d-f53e-48f3-a81b-ed43467ef540" containerName="registry-server" probeResult="failure" output=< Oct 12 06:51:51 crc kubenswrapper[4930]: timeout: failed to connect service ":50051" within 1s Oct 12 06:51:51 crc kubenswrapper[4930]: > Oct 12 06:51:53 crc kubenswrapper[4930]: I1012 06:51:53.436928 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ffcsh" Oct 12 06:51:53 crc kubenswrapper[4930]: I1012 06:51:53.437201 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ffcsh" Oct 12 06:51:53 crc kubenswrapper[4930]: I1012 06:51:53.522428 
4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ffcsh" Oct 12 06:51:54 crc kubenswrapper[4930]: I1012 06:51:54.414621 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ffcsh" Oct 12 06:51:54 crc kubenswrapper[4930]: I1012 06:51:54.497510 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ffcsh"] Oct 12 06:51:56 crc kubenswrapper[4930]: I1012 06:51:56.367583 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ffcsh" podUID="6479a0ef-f6ea-476e-80b7-ded4c0a56451" containerName="registry-server" containerID="cri-o://61f53b2259893b5e93b20055ae795f2b0aec16a445b4873e1848f8b3a1f49fe1" gracePeriod=2 Oct 12 06:51:56 crc kubenswrapper[4930]: I1012 06:51:56.872375 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ffcsh" Oct 12 06:51:57 crc kubenswrapper[4930]: I1012 06:51:57.059777 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zllgm\" (UniqueName: \"kubernetes.io/projected/6479a0ef-f6ea-476e-80b7-ded4c0a56451-kube-api-access-zllgm\") pod \"6479a0ef-f6ea-476e-80b7-ded4c0a56451\" (UID: \"6479a0ef-f6ea-476e-80b7-ded4c0a56451\") " Oct 12 06:51:57 crc kubenswrapper[4930]: I1012 06:51:57.060523 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6479a0ef-f6ea-476e-80b7-ded4c0a56451-catalog-content\") pod \"6479a0ef-f6ea-476e-80b7-ded4c0a56451\" (UID: \"6479a0ef-f6ea-476e-80b7-ded4c0a56451\") " Oct 12 06:51:57 crc kubenswrapper[4930]: I1012 06:51:57.060616 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6479a0ef-f6ea-476e-80b7-ded4c0a56451-utilities\") pod \"6479a0ef-f6ea-476e-80b7-ded4c0a56451\" (UID: \"6479a0ef-f6ea-476e-80b7-ded4c0a56451\") " Oct 12 06:51:57 crc kubenswrapper[4930]: I1012 06:51:57.061870 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6479a0ef-f6ea-476e-80b7-ded4c0a56451-utilities" (OuterVolumeSpecName: "utilities") pod "6479a0ef-f6ea-476e-80b7-ded4c0a56451" (UID: "6479a0ef-f6ea-476e-80b7-ded4c0a56451"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:51:57 crc kubenswrapper[4930]: I1012 06:51:57.072428 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6479a0ef-f6ea-476e-80b7-ded4c0a56451-kube-api-access-zllgm" (OuterVolumeSpecName: "kube-api-access-zllgm") pod "6479a0ef-f6ea-476e-80b7-ded4c0a56451" (UID: "6479a0ef-f6ea-476e-80b7-ded4c0a56451"). InnerVolumeSpecName "kube-api-access-zllgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:51:57 crc kubenswrapper[4930]: I1012 06:51:57.109605 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6479a0ef-f6ea-476e-80b7-ded4c0a56451-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6479a0ef-f6ea-476e-80b7-ded4c0a56451" (UID: "6479a0ef-f6ea-476e-80b7-ded4c0a56451"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:51:57 crc kubenswrapper[4930]: I1012 06:51:57.163052 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zllgm\" (UniqueName: \"kubernetes.io/projected/6479a0ef-f6ea-476e-80b7-ded4c0a56451-kube-api-access-zllgm\") on node \"crc\" DevicePath \"\"" Oct 12 06:51:57 crc kubenswrapper[4930]: I1012 06:51:57.163090 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6479a0ef-f6ea-476e-80b7-ded4c0a56451-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 06:51:57 crc kubenswrapper[4930]: I1012 06:51:57.163102 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6479a0ef-f6ea-476e-80b7-ded4c0a56451-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 06:51:57 crc kubenswrapper[4930]: I1012 06:51:57.385402 4930 generic.go:334] "Generic (PLEG): container finished" podID="6479a0ef-f6ea-476e-80b7-ded4c0a56451" containerID="61f53b2259893b5e93b20055ae795f2b0aec16a445b4873e1848f8b3a1f49fe1" exitCode=0 Oct 12 06:51:57 crc kubenswrapper[4930]: I1012 06:51:57.385465 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffcsh" event={"ID":"6479a0ef-f6ea-476e-80b7-ded4c0a56451","Type":"ContainerDied","Data":"61f53b2259893b5e93b20055ae795f2b0aec16a445b4873e1848f8b3a1f49fe1"} Oct 12 06:51:57 crc kubenswrapper[4930]: I1012 06:51:57.385541 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffcsh" event={"ID":"6479a0ef-f6ea-476e-80b7-ded4c0a56451","Type":"ContainerDied","Data":"a17add444d9f207408feb9f08618a54826737d1c3ee0a3d33e4e83f65f381d8d"} Oct 12 06:51:57 crc kubenswrapper[4930]: I1012 06:51:57.385539 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ffcsh" Oct 12 06:51:57 crc kubenswrapper[4930]: I1012 06:51:57.385565 4930 scope.go:117] "RemoveContainer" containerID="61f53b2259893b5e93b20055ae795f2b0aec16a445b4873e1848f8b3a1f49fe1" Oct 12 06:51:57 crc kubenswrapper[4930]: I1012 06:51:57.419921 4930 scope.go:117] "RemoveContainer" containerID="c48812d604e73240278376e0faa61ad6cfb681342235e2fce858cf762e8e6e21" Oct 12 06:51:57 crc kubenswrapper[4930]: I1012 06:51:57.459662 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ffcsh"] Oct 12 06:51:57 crc kubenswrapper[4930]: I1012 06:51:57.465458 4930 scope.go:117] "RemoveContainer" containerID="dc6f382b15af3fbf4e552dc0b3fdb692e3582b82c816f3740129e4c41ae5df89" Oct 12 06:51:57 crc kubenswrapper[4930]: I1012 06:51:57.477009 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ffcsh"] Oct 12 06:51:57 crc kubenswrapper[4930]: I1012 06:51:57.512533 4930 scope.go:117] "RemoveContainer" containerID="61f53b2259893b5e93b20055ae795f2b0aec16a445b4873e1848f8b3a1f49fe1" Oct 12 06:51:57 crc kubenswrapper[4930]: E1012 06:51:57.513581 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61f53b2259893b5e93b20055ae795f2b0aec16a445b4873e1848f8b3a1f49fe1\": container with ID starting with 61f53b2259893b5e93b20055ae795f2b0aec16a445b4873e1848f8b3a1f49fe1 not found: ID does not exist" containerID="61f53b2259893b5e93b20055ae795f2b0aec16a445b4873e1848f8b3a1f49fe1" Oct 12 06:51:57 crc kubenswrapper[4930]: I1012 06:51:57.513626 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61f53b2259893b5e93b20055ae795f2b0aec16a445b4873e1848f8b3a1f49fe1"} err="failed to get container status \"61f53b2259893b5e93b20055ae795f2b0aec16a445b4873e1848f8b3a1f49fe1\": rpc error: code = NotFound desc = could not find container \"61f53b2259893b5e93b20055ae795f2b0aec16a445b4873e1848f8b3a1f49fe1\": container with ID starting with 61f53b2259893b5e93b20055ae795f2b0aec16a445b4873e1848f8b3a1f49fe1 not found: ID does not exist" Oct 12 06:51:57 crc kubenswrapper[4930]: I1012 06:51:57.513653 4930 scope.go:117] "RemoveContainer" containerID="c48812d604e73240278376e0faa61ad6cfb681342235e2fce858cf762e8e6e21" Oct 12 06:51:57 crc kubenswrapper[4930]: E1012 06:51:57.514007 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c48812d604e73240278376e0faa61ad6cfb681342235e2fce858cf762e8e6e21\": container with ID starting with c48812d604e73240278376e0faa61ad6cfb681342235e2fce858cf762e8e6e21 not found: ID does not exist" containerID="c48812d604e73240278376e0faa61ad6cfb681342235e2fce858cf762e8e6e21" Oct 12 06:51:57 crc kubenswrapper[4930]: I1012 06:51:57.514027 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c48812d604e73240278376e0faa61ad6cfb681342235e2fce858cf762e8e6e21"} err="failed to get container status \"c48812d604e73240278376e0faa61ad6cfb681342235e2fce858cf762e8e6e21\": rpc error: code = NotFound desc = could not find container \"c48812d604e73240278376e0faa61ad6cfb681342235e2fce858cf762e8e6e21\": container with ID starting with c48812d604e73240278376e0faa61ad6cfb681342235e2fce858cf762e8e6e21 not found: ID does not exist" Oct 12 06:51:57 crc kubenswrapper[4930]: I1012 06:51:57.514039 4930 scope.go:117] "RemoveContainer" 
containerID="dc6f382b15af3fbf4e552dc0b3fdb692e3582b82c816f3740129e4c41ae5df89" Oct 12 06:51:57 crc kubenswrapper[4930]: E1012 06:51:57.514383 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc6f382b15af3fbf4e552dc0b3fdb692e3582b82c816f3740129e4c41ae5df89\": container with ID starting with dc6f382b15af3fbf4e552dc0b3fdb692e3582b82c816f3740129e4c41ae5df89 not found: ID does not exist" containerID="dc6f382b15af3fbf4e552dc0b3fdb692e3582b82c816f3740129e4c41ae5df89" Oct 12 06:51:57 crc kubenswrapper[4930]: I1012 06:51:57.514485 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6f382b15af3fbf4e552dc0b3fdb692e3582b82c816f3740129e4c41ae5df89"} err="failed to get container status \"dc6f382b15af3fbf4e552dc0b3fdb692e3582b82c816f3740129e4c41ae5df89\": rpc error: code = NotFound desc = could not find container \"dc6f382b15af3fbf4e552dc0b3fdb692e3582b82c816f3740129e4c41ae5df89\": container with ID starting with dc6f382b15af3fbf4e552dc0b3fdb692e3582b82c816f3740129e4c41ae5df89 not found: ID does not exist" Oct 12 06:51:58 crc kubenswrapper[4930]: I1012 06:51:58.150458 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6479a0ef-f6ea-476e-80b7-ded4c0a56451" path="/var/lib/kubelet/pods/6479a0ef-f6ea-476e-80b7-ded4c0a56451/volumes" Oct 12 06:52:01 crc kubenswrapper[4930]: I1012 06:52:01.919064 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5k9hr" podUID="ae38942d-f53e-48f3-a81b-ed43467ef540" containerName="registry-server" probeResult="failure" output=< Oct 12 06:52:01 crc kubenswrapper[4930]: timeout: failed to connect service ":50051" within 1s Oct 12 06:52:01 crc kubenswrapper[4930]: > Oct 12 06:52:03 crc kubenswrapper[4930]: I1012 06:52:03.669436 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:52:03 crc kubenswrapper[4930]: I1012 06:52:03.669848 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:52:03 crc kubenswrapper[4930]: I1012 06:52:03.669915 4930 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 06:52:03 crc kubenswrapper[4930]: I1012 06:52:03.671102 4930 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f7ce11f1c8d9ed9dc321ae1afbc9191e4181abb37c1c7af64b20a7b2dd6a019a"} pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 06:52:03 crc kubenswrapper[4930]: I1012 06:52:03.671202 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" 
containerID="cri-o://f7ce11f1c8d9ed9dc321ae1afbc9191e4181abb37c1c7af64b20a7b2dd6a019a" gracePeriod=600 Oct 12 06:52:04 crc kubenswrapper[4930]: I1012 06:52:04.472449 4930 generic.go:334] "Generic (PLEG): container finished" podID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerID="f7ce11f1c8d9ed9dc321ae1afbc9191e4181abb37c1c7af64b20a7b2dd6a019a" exitCode=0 Oct 12 06:52:04 crc kubenswrapper[4930]: I1012 06:52:04.472551 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerDied","Data":"f7ce11f1c8d9ed9dc321ae1afbc9191e4181abb37c1c7af64b20a7b2dd6a019a"} Oct 12 06:52:04 crc kubenswrapper[4930]: I1012 06:52:04.472791 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerStarted","Data":"da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0"} Oct 12 06:52:04 crc kubenswrapper[4930]: I1012 06:52:04.472819 4930 scope.go:117] "RemoveContainer" containerID="eb75ef60b15b6f921365d8663e4e5999d5843d674724de96ce9347f0001236ed" Oct 12 06:52:10 crc kubenswrapper[4930]: I1012 06:52:10.895284 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5k9hr" Oct 12 06:52:11 crc kubenswrapper[4930]: I1012 06:52:11.039429 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5k9hr" Oct 12 06:52:11 crc kubenswrapper[4930]: I1012 06:52:11.728601 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5k9hr"] Oct 12 06:52:12 crc kubenswrapper[4930]: I1012 06:52:12.565093 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5k9hr" podUID="ae38942d-f53e-48f3-a81b-ed43467ef540" containerName="registry-server" containerID="cri-o://c2de068e3c232ce8cbbaed47ba84db976a7884ae99223d28732de9b3ab02e076" gracePeriod=2 Oct 12 06:52:13 crc kubenswrapper[4930]: I1012 06:52:13.133504 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5k9hr" Oct 12 06:52:13 crc kubenswrapper[4930]: I1012 06:52:13.216383 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae38942d-f53e-48f3-a81b-ed43467ef540-catalog-content\") pod \"ae38942d-f53e-48f3-a81b-ed43467ef540\" (UID: \"ae38942d-f53e-48f3-a81b-ed43467ef540\") " Oct 12 06:52:13 crc kubenswrapper[4930]: I1012 06:52:13.216492 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t996s\" (UniqueName: \"kubernetes.io/projected/ae38942d-f53e-48f3-a81b-ed43467ef540-kube-api-access-t996s\") pod \"ae38942d-f53e-48f3-a81b-ed43467ef540\" (UID: \"ae38942d-f53e-48f3-a81b-ed43467ef540\") " Oct 12 06:52:13 crc kubenswrapper[4930]: I1012 06:52:13.216585 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae38942d-f53e-48f3-a81b-ed43467ef540-utilities\") pod \"ae38942d-f53e-48f3-a81b-ed43467ef540\" (UID: \"ae38942d-f53e-48f3-a81b-ed43467ef540\") " Oct 12 06:52:13 crc kubenswrapper[4930]: I1012 06:52:13.217277 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae38942d-f53e-48f3-a81b-ed43467ef540-utilities" (OuterVolumeSpecName: "utilities") pod "ae38942d-f53e-48f3-a81b-ed43467ef540" (UID: "ae38942d-f53e-48f3-a81b-ed43467ef540"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:52:13 crc kubenswrapper[4930]: I1012 06:52:13.218242 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae38942d-f53e-48f3-a81b-ed43467ef540-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 06:52:13 crc kubenswrapper[4930]: I1012 06:52:13.222233 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae38942d-f53e-48f3-a81b-ed43467ef540-kube-api-access-t996s" (OuterVolumeSpecName: "kube-api-access-t996s") pod "ae38942d-f53e-48f3-a81b-ed43467ef540" (UID: "ae38942d-f53e-48f3-a81b-ed43467ef540"). InnerVolumeSpecName "kube-api-access-t996s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:52:13 crc kubenswrapper[4930]: I1012 06:52:13.302029 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae38942d-f53e-48f3-a81b-ed43467ef540-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae38942d-f53e-48f3-a81b-ed43467ef540" (UID: "ae38942d-f53e-48f3-a81b-ed43467ef540"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:52:13 crc kubenswrapper[4930]: I1012 06:52:13.319441 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae38942d-f53e-48f3-a81b-ed43467ef540-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 06:52:13 crc kubenswrapper[4930]: I1012 06:52:13.319466 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t996s\" (UniqueName: \"kubernetes.io/projected/ae38942d-f53e-48f3-a81b-ed43467ef540-kube-api-access-t996s\") on node \"crc\" DevicePath \"\"" Oct 12 06:52:13 crc kubenswrapper[4930]: I1012 06:52:13.582836 4930 generic.go:334] "Generic (PLEG): container finished" podID="ae38942d-f53e-48f3-a81b-ed43467ef540" containerID="c2de068e3c232ce8cbbaed47ba84db976a7884ae99223d28732de9b3ab02e076" exitCode=0 Oct 12 06:52:13 crc kubenswrapper[4930]: I1012 06:52:13.582873 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5k9hr" Oct 12 06:52:13 crc kubenswrapper[4930]: I1012 06:52:13.582912 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5k9hr" event={"ID":"ae38942d-f53e-48f3-a81b-ed43467ef540","Type":"ContainerDied","Data":"c2de068e3c232ce8cbbaed47ba84db976a7884ae99223d28732de9b3ab02e076"} Oct 12 06:52:13 crc kubenswrapper[4930]: I1012 06:52:13.583316 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5k9hr" event={"ID":"ae38942d-f53e-48f3-a81b-ed43467ef540","Type":"ContainerDied","Data":"859d549571b849b6be9779e35877c30e2f766ae3ae15a3785f4e3a1dca10f742"} Oct 12 06:52:13 crc kubenswrapper[4930]: I1012 06:52:13.583364 4930 scope.go:117] "RemoveContainer" containerID="c2de068e3c232ce8cbbaed47ba84db976a7884ae99223d28732de9b3ab02e076" Oct 12 06:52:13 crc kubenswrapper[4930]: I1012 06:52:13.629546 4930 scope.go:117] "RemoveContainer" containerID="76786a1927b5c16e9ab1742697dc7c12aae386f4b2f33c4ba8bc3ac5dd6137c0" Oct 12 06:52:13 crc kubenswrapper[4930]: I1012 06:52:13.637075 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5k9hr"] Oct 12 06:52:13 crc kubenswrapper[4930]: I1012 06:52:13.647454 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5k9hr"] Oct 12 06:52:13 crc kubenswrapper[4930]: I1012 06:52:13.660320 4930 scope.go:117] "RemoveContainer" containerID="995f14289f3880f02e8c58e49759a69ef499fc4c5b771602ae28ce19e2416e96" Oct 12 06:52:13 crc kubenswrapper[4930]: I1012 06:52:13.756322 4930 scope.go:117] "RemoveContainer" containerID="c2de068e3c232ce8cbbaed47ba84db976a7884ae99223d28732de9b3ab02e076" Oct 12 06:52:13 crc kubenswrapper[4930]: E1012 06:52:13.756887 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2de068e3c232ce8cbbaed47ba84db976a7884ae99223d28732de9b3ab02e076\": container with ID starting with c2de068e3c232ce8cbbaed47ba84db976a7884ae99223d28732de9b3ab02e076 not found: ID does not exist" containerID="c2de068e3c232ce8cbbaed47ba84db976a7884ae99223d28732de9b3ab02e076" Oct 12 06:52:13 crc kubenswrapper[4930]: I1012 06:52:13.756941 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2de068e3c232ce8cbbaed47ba84db976a7884ae99223d28732de9b3ab02e076"} err="failed to get container status \"c2de068e3c232ce8cbbaed47ba84db976a7884ae99223d28732de9b3ab02e076\": 
rpc error: code = NotFound desc = could not find container \"c2de068e3c232ce8cbbaed47ba84db976a7884ae99223d28732de9b3ab02e076\": container with ID starting with c2de068e3c232ce8cbbaed47ba84db976a7884ae99223d28732de9b3ab02e076 not found: ID does not exist" Oct 12 06:52:13 crc kubenswrapper[4930]: I1012 06:52:13.756979 4930 scope.go:117] "RemoveContainer" containerID="76786a1927b5c16e9ab1742697dc7c12aae386f4b2f33c4ba8bc3ac5dd6137c0" Oct 12 06:52:13 crc kubenswrapper[4930]: E1012 06:52:13.757604 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76786a1927b5c16e9ab1742697dc7c12aae386f4b2f33c4ba8bc3ac5dd6137c0\": container with ID starting with 76786a1927b5c16e9ab1742697dc7c12aae386f4b2f33c4ba8bc3ac5dd6137c0 not found: ID does not exist" containerID="76786a1927b5c16e9ab1742697dc7c12aae386f4b2f33c4ba8bc3ac5dd6137c0" Oct 12 06:52:13 crc kubenswrapper[4930]: I1012 06:52:13.757656 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76786a1927b5c16e9ab1742697dc7c12aae386f4b2f33c4ba8bc3ac5dd6137c0"} err="failed to get container status \"76786a1927b5c16e9ab1742697dc7c12aae386f4b2f33c4ba8bc3ac5dd6137c0\": rpc error: code = NotFound desc = could not find container \"76786a1927b5c16e9ab1742697dc7c12aae386f4b2f33c4ba8bc3ac5dd6137c0\": container with ID starting with 76786a1927b5c16e9ab1742697dc7c12aae386f4b2f33c4ba8bc3ac5dd6137c0 not found: ID does not exist" Oct 12 06:52:13 crc kubenswrapper[4930]: I1012 06:52:13.757699 4930 scope.go:117] "RemoveContainer" containerID="995f14289f3880f02e8c58e49759a69ef499fc4c5b771602ae28ce19e2416e96" Oct 12 06:52:13 crc kubenswrapper[4930]: E1012 06:52:13.759188 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"995f14289f3880f02e8c58e49759a69ef499fc4c5b771602ae28ce19e2416e96\": container with ID starting with 995f14289f3880f02e8c58e49759a69ef499fc4c5b771602ae28ce19e2416e96 not found: ID does not exist" containerID="995f14289f3880f02e8c58e49759a69ef499fc4c5b771602ae28ce19e2416e96" Oct 12 06:52:13 crc kubenswrapper[4930]: I1012 06:52:13.759240 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"995f14289f3880f02e8c58e49759a69ef499fc4c5b771602ae28ce19e2416e96"} err="failed to get container status \"995f14289f3880f02e8c58e49759a69ef499fc4c5b771602ae28ce19e2416e96\": rpc error: code = NotFound desc = could not find container \"995f14289f3880f02e8c58e49759a69ef499fc4c5b771602ae28ce19e2416e96\": container with ID starting with 995f14289f3880f02e8c58e49759a69ef499fc4c5b771602ae28ce19e2416e96 not found: ID does not exist" Oct 12 06:52:14 crc kubenswrapper[4930]: I1012 06:52:14.152801 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae38942d-f53e-48f3-a81b-ed43467ef540" path="/var/lib/kubelet/pods/ae38942d-f53e-48f3-a81b-ed43467ef540/volumes" Oct 12 06:53:23 crc kubenswrapper[4930]: I1012 06:53:23.879024 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-frzvb"] Oct 12 06:53:23 crc kubenswrapper[4930]: E1012 06:53:23.880382 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae38942d-f53e-48f3-a81b-ed43467ef540" containerName="registry-server" Oct 12 06:53:23 crc kubenswrapper[4930]: I1012 06:53:23.880409 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae38942d-f53e-48f3-a81b-ed43467ef540" containerName="registry-server" Oct 12 
06:53:23 crc kubenswrapper[4930]: E1012 06:53:23.880430 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae38942d-f53e-48f3-a81b-ed43467ef540" containerName="extract-content" Oct 12 06:53:23 crc kubenswrapper[4930]: I1012 06:53:23.880442 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae38942d-f53e-48f3-a81b-ed43467ef540" containerName="extract-content" Oct 12 06:53:23 crc kubenswrapper[4930]: E1012 06:53:23.880469 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae38942d-f53e-48f3-a81b-ed43467ef540" containerName="extract-utilities" Oct 12 06:53:23 crc kubenswrapper[4930]: I1012 06:53:23.880481 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae38942d-f53e-48f3-a81b-ed43467ef540" containerName="extract-utilities" Oct 12 06:53:23 crc kubenswrapper[4930]: E1012 06:53:23.880505 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6479a0ef-f6ea-476e-80b7-ded4c0a56451" containerName="extract-content" Oct 12 06:53:23 crc kubenswrapper[4930]: I1012 06:53:23.880517 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="6479a0ef-f6ea-476e-80b7-ded4c0a56451" containerName="extract-content" Oct 12 06:53:23 crc kubenswrapper[4930]: E1012 06:53:23.880539 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6479a0ef-f6ea-476e-80b7-ded4c0a56451" containerName="registry-server" Oct 12 06:53:23 crc kubenswrapper[4930]: I1012 06:53:23.880551 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="6479a0ef-f6ea-476e-80b7-ded4c0a56451" containerName="registry-server" Oct 12 06:53:23 crc kubenswrapper[4930]: E1012 06:53:23.880593 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6479a0ef-f6ea-476e-80b7-ded4c0a56451" containerName="extract-utilities" Oct 12 06:53:23 crc kubenswrapper[4930]: I1012 06:53:23.880605 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="6479a0ef-f6ea-476e-80b7-ded4c0a56451" containerName="extract-utilities" Oct 12 06:53:23 crc kubenswrapper[4930]: I1012 06:53:23.881010 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae38942d-f53e-48f3-a81b-ed43467ef540" containerName="registry-server" Oct 12 06:53:23 crc kubenswrapper[4930]: I1012 06:53:23.881043 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="6479a0ef-f6ea-476e-80b7-ded4c0a56451" containerName="registry-server" Oct 12 06:53:23 crc kubenswrapper[4930]: I1012 06:53:23.885917 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-frzvb" Oct 12 06:53:23 crc kubenswrapper[4930]: I1012 06:53:23.916921 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-frzvb"] Oct 12 06:53:23 crc kubenswrapper[4930]: I1012 06:53:23.956328 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krdgs\" (UniqueName: \"kubernetes.io/projected/51dbda11-5c26-400b-9c49-c518594c0f78-kube-api-access-krdgs\") pod \"certified-operators-frzvb\" (UID: \"51dbda11-5c26-400b-9c49-c518594c0f78\") " pod="openshift-marketplace/certified-operators-frzvb" Oct 12 06:53:23 crc kubenswrapper[4930]: I1012 06:53:23.956414 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51dbda11-5c26-400b-9c49-c518594c0f78-utilities\") pod \"certified-operators-frzvb\" (UID: \"51dbda11-5c26-400b-9c49-c518594c0f78\") " pod="openshift-marketplace/certified-operators-frzvb" Oct 12 06:53:23 crc kubenswrapper[4930]: I1012 06:53:23.956671 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51dbda11-5c26-400b-9c49-c518594c0f78-catalog-content\") pod \"certified-operators-frzvb\" (UID: \"51dbda11-5c26-400b-9c49-c518594c0f78\") " pod="openshift-marketplace/certified-operators-frzvb" Oct 12 06:53:24 crc kubenswrapper[4930]: I1012 06:53:24.058858 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51dbda11-5c26-400b-9c49-c518594c0f78-catalog-content\") pod \"certified-operators-frzvb\" (UID: \"51dbda11-5c26-400b-9c49-c518594c0f78\") " pod="openshift-marketplace/certified-operators-frzvb" Oct 12 06:53:24 crc kubenswrapper[4930]: I1012 06:53:24.059054 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krdgs\" (UniqueName: \"kubernetes.io/projected/51dbda11-5c26-400b-9c49-c518594c0f78-kube-api-access-krdgs\") pod \"certified-operators-frzvb\" (UID: \"51dbda11-5c26-400b-9c49-c518594c0f78\") " pod="openshift-marketplace/certified-operators-frzvb" Oct 12 06:53:24 crc kubenswrapper[4930]: I1012 06:53:24.059112 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51dbda11-5c26-400b-9c49-c518594c0f78-utilities\") pod \"certified-operators-frzvb\" (UID: \"51dbda11-5c26-400b-9c49-c518594c0f78\") " pod="openshift-marketplace/certified-operators-frzvb" Oct 12 06:53:24 crc kubenswrapper[4930]: I1012 06:53:24.059469 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51dbda11-5c26-400b-9c49-c518594c0f78-catalog-content\") pod \"certified-operators-frzvb\" (UID: \"51dbda11-5c26-400b-9c49-c518594c0f78\") " pod="openshift-marketplace/certified-operators-frzvb" Oct 12 06:53:24 crc kubenswrapper[4930]: I1012 06:53:24.059617 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51dbda11-5c26-400b-9c49-c518594c0f78-utilities\") pod \"certified-operators-frzvb\" (UID: \"51dbda11-5c26-400b-9c49-c518594c0f78\") " pod="openshift-marketplace/certified-operators-frzvb" Oct 12 06:53:24 crc kubenswrapper[4930]: I1012 06:53:24.110024 4930 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-krdgs\" (UniqueName: \"kubernetes.io/projected/51dbda11-5c26-400b-9c49-c518594c0f78-kube-api-access-krdgs\") pod \"certified-operators-frzvb\" (UID: \"51dbda11-5c26-400b-9c49-c518594c0f78\") " pod="openshift-marketplace/certified-operators-frzvb" Oct 12 06:53:24 crc kubenswrapper[4930]: I1012 06:53:24.223003 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-frzvb" Oct 12 06:53:24 crc kubenswrapper[4930]: I1012 06:53:24.811589 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-frzvb"] Oct 12 06:53:24 crc kubenswrapper[4930]: W1012 06:53:24.816620 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51dbda11_5c26_400b_9c49_c518594c0f78.slice/crio-457012d8400e1e6c9ed01c22d35d2083be9861932e2cd477db192edc8a981dbd WatchSource:0}: Error finding container 457012d8400e1e6c9ed01c22d35d2083be9861932e2cd477db192edc8a981dbd: Status 404 returned error can't find the container with id 457012d8400e1e6c9ed01c22d35d2083be9861932e2cd477db192edc8a981dbd Oct 12 06:53:25 crc kubenswrapper[4930]: I1012 06:53:25.493141 4930 generic.go:334] "Generic (PLEG): container finished" podID="51dbda11-5c26-400b-9c49-c518594c0f78" containerID="9ffce0a230e759813c0a9b854e59a0181a58374344364bb2036a157d5906dfc7" exitCode=0 Oct 12 06:53:25 crc kubenswrapper[4930]: I1012 06:53:25.493289 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frzvb" event={"ID":"51dbda11-5c26-400b-9c49-c518594c0f78","Type":"ContainerDied","Data":"9ffce0a230e759813c0a9b854e59a0181a58374344364bb2036a157d5906dfc7"} Oct 12 06:53:25 crc kubenswrapper[4930]: I1012 06:53:25.493554 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frzvb" event={"ID":"51dbda11-5c26-400b-9c49-c518594c0f78","Type":"ContainerStarted","Data":"457012d8400e1e6c9ed01c22d35d2083be9861932e2cd477db192edc8a981dbd"} Oct 12 06:53:26 crc kubenswrapper[4930]: I1012 06:53:26.506336 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frzvb" event={"ID":"51dbda11-5c26-400b-9c49-c518594c0f78","Type":"ContainerStarted","Data":"e36c59aa07fdc25175f7f9f20577663fdf6ee00f9285a32fceef4db40388992c"} Oct 12 06:53:28 crc kubenswrapper[4930]: I1012 06:53:28.530927 4930 generic.go:334] "Generic (PLEG): container finished" podID="51dbda11-5c26-400b-9c49-c518594c0f78" containerID="e36c59aa07fdc25175f7f9f20577663fdf6ee00f9285a32fceef4db40388992c" exitCode=0 Oct 12 06:53:28 crc kubenswrapper[4930]: I1012 06:53:28.530992 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frzvb" event={"ID":"51dbda11-5c26-400b-9c49-c518594c0f78","Type":"ContainerDied","Data":"e36c59aa07fdc25175f7f9f20577663fdf6ee00f9285a32fceef4db40388992c"} Oct 12 06:53:29 crc kubenswrapper[4930]: I1012 06:53:29.546713 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frzvb" event={"ID":"51dbda11-5c26-400b-9c49-c518594c0f78","Type":"ContainerStarted","Data":"8ef2d6a2287a4094c9eb488bc46dc1b3b912fd0fc6896e89f0374da5c7c086f8"} Oct 12 06:53:29 crc kubenswrapper[4930]: I1012 06:53:29.562802 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-frzvb" 
podStartSLOduration=3.16048685 podStartE2EDuration="6.562780709s" podCreationTimestamp="2025-10-12 06:53:23 +0000 UTC" firstStartedPulling="2025-10-12 06:53:25.496248206 +0000 UTC m=+4338.038350011" lastFinishedPulling="2025-10-12 06:53:28.898542075 +0000 UTC m=+4341.440643870" observedRunningTime="2025-10-12 06:53:29.562205354 +0000 UTC m=+4342.104307139" watchObservedRunningTime="2025-10-12 06:53:29.562780709 +0000 UTC m=+4342.104882474" Oct 12 06:53:34 crc kubenswrapper[4930]: I1012 06:53:34.241623 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-frzvb" Oct 12 06:53:34 crc kubenswrapper[4930]: I1012 06:53:34.243685 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-frzvb" Oct 12 06:53:34 crc kubenswrapper[4930]: I1012 06:53:34.319345 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-frzvb" Oct 12 06:53:34 crc kubenswrapper[4930]: I1012 06:53:34.658103 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-frzvb" Oct 12 06:53:34 crc kubenswrapper[4930]: I1012 06:53:34.710520 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-frzvb"] Oct 12 06:53:36 crc kubenswrapper[4930]: I1012 06:53:36.642005 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-frzvb" podUID="51dbda11-5c26-400b-9c49-c518594c0f78" containerName="registry-server" containerID="cri-o://8ef2d6a2287a4094c9eb488bc46dc1b3b912fd0fc6896e89f0374da5c7c086f8" gracePeriod=2 Oct 12 06:53:37 crc kubenswrapper[4930]: I1012 06:53:37.201029 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-frzvb" Oct 12 06:53:37 crc kubenswrapper[4930]: I1012 06:53:37.249165 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51dbda11-5c26-400b-9c49-c518594c0f78-catalog-content\") pod \"51dbda11-5c26-400b-9c49-c518594c0f78\" (UID: \"51dbda11-5c26-400b-9c49-c518594c0f78\") " Oct 12 06:53:37 crc kubenswrapper[4930]: I1012 06:53:37.249264 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51dbda11-5c26-400b-9c49-c518594c0f78-utilities\") pod \"51dbda11-5c26-400b-9c49-c518594c0f78\" (UID: \"51dbda11-5c26-400b-9c49-c518594c0f78\") " Oct 12 06:53:37 crc kubenswrapper[4930]: I1012 06:53:37.249407 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krdgs\" (UniqueName: \"kubernetes.io/projected/51dbda11-5c26-400b-9c49-c518594c0f78-kube-api-access-krdgs\") pod \"51dbda11-5c26-400b-9c49-c518594c0f78\" (UID: \"51dbda11-5c26-400b-9c49-c518594c0f78\") " Oct 12 06:53:37 crc kubenswrapper[4930]: I1012 06:53:37.250348 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51dbda11-5c26-400b-9c49-c518594c0f78-utilities" (OuterVolumeSpecName: "utilities") pod "51dbda11-5c26-400b-9c49-c518594c0f78" (UID: "51dbda11-5c26-400b-9c49-c518594c0f78"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:53:37 crc kubenswrapper[4930]: I1012 06:53:37.250885 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51dbda11-5c26-400b-9c49-c518594c0f78-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 06:53:37 crc kubenswrapper[4930]: I1012 06:53:37.257309 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51dbda11-5c26-400b-9c49-c518594c0f78-kube-api-access-krdgs" (OuterVolumeSpecName: "kube-api-access-krdgs") pod "51dbda11-5c26-400b-9c49-c518594c0f78" (UID: "51dbda11-5c26-400b-9c49-c518594c0f78"). InnerVolumeSpecName "kube-api-access-krdgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 06:53:37 crc kubenswrapper[4930]: I1012 06:53:37.304985 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51dbda11-5c26-400b-9c49-c518594c0f78-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51dbda11-5c26-400b-9c49-c518594c0f78" (UID: "51dbda11-5c26-400b-9c49-c518594c0f78"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 06:53:37 crc kubenswrapper[4930]: I1012 06:53:37.353414 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krdgs\" (UniqueName: \"kubernetes.io/projected/51dbda11-5c26-400b-9c49-c518594c0f78-kube-api-access-krdgs\") on node \"crc\" DevicePath \"\"" Oct 12 06:53:37 crc kubenswrapper[4930]: I1012 06:53:37.353457 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51dbda11-5c26-400b-9c49-c518594c0f78-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 06:53:37 crc kubenswrapper[4930]: I1012 06:53:37.651709 4930 generic.go:334] "Generic (PLEG): container finished" podID="51dbda11-5c26-400b-9c49-c518594c0f78" containerID="8ef2d6a2287a4094c9eb488bc46dc1b3b912fd0fc6896e89f0374da5c7c086f8" exitCode=0 Oct 12 06:53:37 crc kubenswrapper[4930]: I1012 06:53:37.651771 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frzvb" event={"ID":"51dbda11-5c26-400b-9c49-c518594c0f78","Type":"ContainerDied","Data":"8ef2d6a2287a4094c9eb488bc46dc1b3b912fd0fc6896e89f0374da5c7c086f8"} Oct 12 06:53:37 crc kubenswrapper[4930]: I1012 06:53:37.652040 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frzvb" event={"ID":"51dbda11-5c26-400b-9c49-c518594c0f78","Type":"ContainerDied","Data":"457012d8400e1e6c9ed01c22d35d2083be9861932e2cd477db192edc8a981dbd"} Oct 12 06:53:37 crc kubenswrapper[4930]: I1012 06:53:37.652070 4930 scope.go:117] "RemoveContainer" containerID="8ef2d6a2287a4094c9eb488bc46dc1b3b912fd0fc6896e89f0374da5c7c086f8" Oct 12 06:53:37 crc kubenswrapper[4930]: I1012 06:53:37.651787 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-frzvb" Oct 12 06:53:37 crc kubenswrapper[4930]: I1012 06:53:37.676366 4930 scope.go:117] "RemoveContainer" containerID="e36c59aa07fdc25175f7f9f20577663fdf6ee00f9285a32fceef4db40388992c" Oct 12 06:53:37 crc kubenswrapper[4930]: I1012 06:53:37.681427 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-frzvb"] Oct 12 06:53:37 crc kubenswrapper[4930]: I1012 06:53:37.689499 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-frzvb"] Oct 12 06:53:37 crc kubenswrapper[4930]: I1012 06:53:37.698321 4930 scope.go:117] "RemoveContainer" containerID="9ffce0a230e759813c0a9b854e59a0181a58374344364bb2036a157d5906dfc7" Oct 12 06:53:37 crc kubenswrapper[4930]: I1012 06:53:37.742878 4930 scope.go:117] "RemoveContainer" containerID="8ef2d6a2287a4094c9eb488bc46dc1b3b912fd0fc6896e89f0374da5c7c086f8" Oct 12 06:53:37 crc kubenswrapper[4930]: E1012 06:53:37.743400 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ef2d6a2287a4094c9eb488bc46dc1b3b912fd0fc6896e89f0374da5c7c086f8\": container with ID starting with 8ef2d6a2287a4094c9eb488bc46dc1b3b912fd0fc6896e89f0374da5c7c086f8 not found: ID does not exist" containerID="8ef2d6a2287a4094c9eb488bc46dc1b3b912fd0fc6896e89f0374da5c7c086f8" Oct 12 06:53:37 crc kubenswrapper[4930]: I1012 06:53:37.743445 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ef2d6a2287a4094c9eb488bc46dc1b3b912fd0fc6896e89f0374da5c7c086f8"} err="failed to get container status \"8ef2d6a2287a4094c9eb488bc46dc1b3b912fd0fc6896e89f0374da5c7c086f8\": rpc error: code = NotFound desc = could not find container \"8ef2d6a2287a4094c9eb488bc46dc1b3b912fd0fc6896e89f0374da5c7c086f8\": container with ID starting with 8ef2d6a2287a4094c9eb488bc46dc1b3b912fd0fc6896e89f0374da5c7c086f8 not found: ID does not exist" Oct 12 06:53:37 crc kubenswrapper[4930]: I1012 06:53:37.743473 4930 scope.go:117] "RemoveContainer" containerID="e36c59aa07fdc25175f7f9f20577663fdf6ee00f9285a32fceef4db40388992c" Oct 12 06:53:37 crc kubenswrapper[4930]: E1012 06:53:37.743989 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e36c59aa07fdc25175f7f9f20577663fdf6ee00f9285a32fceef4db40388992c\": container with ID starting with e36c59aa07fdc25175f7f9f20577663fdf6ee00f9285a32fceef4db40388992c not found: ID does not exist" containerID="e36c59aa07fdc25175f7f9f20577663fdf6ee00f9285a32fceef4db40388992c" Oct 12 06:53:37 crc kubenswrapper[4930]: I1012 06:53:37.744030 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e36c59aa07fdc25175f7f9f20577663fdf6ee00f9285a32fceef4db40388992c"} err="failed to get container status \"e36c59aa07fdc25175f7f9f20577663fdf6ee00f9285a32fceef4db40388992c\": rpc error: code = NotFound desc = could not find container \"e36c59aa07fdc25175f7f9f20577663fdf6ee00f9285a32fceef4db40388992c\": container with ID starting with e36c59aa07fdc25175f7f9f20577663fdf6ee00f9285a32fceef4db40388992c not found: ID does not exist" Oct 12 06:53:37 crc kubenswrapper[4930]: I1012 06:53:37.744058 4930 scope.go:117] "RemoveContainer" containerID="9ffce0a230e759813c0a9b854e59a0181a58374344364bb2036a157d5906dfc7" Oct 12 06:53:37 crc kubenswrapper[4930]: E1012 06:53:37.744384 4930 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"9ffce0a230e759813c0a9b854e59a0181a58374344364bb2036a157d5906dfc7\": container with ID starting with 9ffce0a230e759813c0a9b854e59a0181a58374344364bb2036a157d5906dfc7 not found: ID does not exist" containerID="9ffce0a230e759813c0a9b854e59a0181a58374344364bb2036a157d5906dfc7" Oct 12 06:53:37 crc kubenswrapper[4930]: I1012 06:53:37.744431 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ffce0a230e759813c0a9b854e59a0181a58374344364bb2036a157d5906dfc7"} err="failed to get container status \"9ffce0a230e759813c0a9b854e59a0181a58374344364bb2036a157d5906dfc7\": rpc error: code = NotFound desc = could not find container \"9ffce0a230e759813c0a9b854e59a0181a58374344364bb2036a157d5906dfc7\": container with ID starting with 9ffce0a230e759813c0a9b854e59a0181a58374344364bb2036a157d5906dfc7 not found: ID does not exist" Oct 12 06:53:38 crc kubenswrapper[4930]: I1012 06:53:38.148832 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51dbda11-5c26-400b-9c49-c518594c0f78" path="/var/lib/kubelet/pods/51dbda11-5c26-400b-9c49-c518594c0f78/volumes" Oct 12 06:54:33 crc kubenswrapper[4930]: I1012 06:54:33.669289 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:54:33 crc kubenswrapper[4930]: I1012 06:54:33.670013 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:55:03 crc kubenswrapper[4930]: I1012 06:55:03.669119 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:55:03 crc kubenswrapper[4930]: I1012 06:55:03.669660 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:55:33 crc kubenswrapper[4930]: I1012 06:55:33.669547 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 06:55:33 crc kubenswrapper[4930]: I1012 06:55:33.670047 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 06:55:33 crc kubenswrapper[4930]: I1012 06:55:33.670101 4930 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 06:55:33 crc kubenswrapper[4930]: I1012 06:55:33.670918 4930 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0"} pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 06:55:33 crc kubenswrapper[4930]: I1012 06:55:33.670982 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" containerID="cri-o://da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0" gracePeriod=600 Oct 12 06:55:33 crc kubenswrapper[4930]: E1012 06:55:33.796430 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:55:34 crc kubenswrapper[4930]: I1012 06:55:34.195936 4930 generic.go:334] "Generic (PLEG): container finished" podID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0" exitCode=0 Oct 12 06:55:34 crc kubenswrapper[4930]: I1012 06:55:34.207634 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerDied","Data":"da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0"} Oct 12 06:55:34 crc kubenswrapper[4930]: I1012 06:55:34.207684 4930 scope.go:117] "RemoveContainer" containerID="f7ce11f1c8d9ed9dc321ae1afbc9191e4181abb37c1c7af64b20a7b2dd6a019a" Oct 12 06:55:34 crc kubenswrapper[4930]: I1012 06:55:34.208472 4930 scope.go:117] "RemoveContainer" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0" Oct 12 06:55:34 crc kubenswrapper[4930]: E1012 06:55:34.208944 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:55:46 crc kubenswrapper[4930]: I1012 06:55:46.135383 4930 scope.go:117] "RemoveContainer" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0" Oct 12 06:55:46 crc kubenswrapper[4930]: E1012 06:55:46.136219 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:55:57 crc 
kubenswrapper[4930]: I1012 06:55:57.135663 4930 scope.go:117] "RemoveContainer" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0"
Oct 12 06:55:57 crc kubenswrapper[4930]: E1012 06:55:57.136675 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:56:11 crc kubenswrapper[4930]: I1012 06:56:11.135562 4930 scope.go:117] "RemoveContainer" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0"
Oct 12 06:56:11 crc kubenswrapper[4930]: E1012 06:56:11.136727 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:56:25 crc kubenswrapper[4930]: I1012 06:56:25.135991 4930 scope.go:117] "RemoveContainer" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0"
Oct 12 06:56:25 crc kubenswrapper[4930]: E1012 06:56:25.137048 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:56:28 crc kubenswrapper[4930]: E1012 06:56:28.653794 4930 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.111:45728->38.102.83.111:46517: write tcp 38.102.83.111:45728->38.102.83.111:46517: write: broken pipe
Oct 12 06:56:37 crc kubenswrapper[4930]: I1012 06:56:37.136011 4930 scope.go:117] "RemoveContainer" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0"
Oct 12 06:56:37 crc kubenswrapper[4930]: E1012 06:56:37.137150 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:56:50 crc kubenswrapper[4930]: I1012 06:56:50.137586 4930 scope.go:117] "RemoveContainer" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0"
Oct 12 06:56:50 crc kubenswrapper[4930]: E1012 06:56:50.138941 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:57:01 crc kubenswrapper[4930]: I1012 06:57:01.136125 4930 scope.go:117] "RemoveContainer" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0"
Oct 12 06:57:01 crc kubenswrapper[4930]: E1012 06:57:01.137083 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:57:13 crc kubenswrapper[4930]: I1012 06:57:13.135449 4930 scope.go:117] "RemoveContainer" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0"
Oct 12 06:57:13 crc kubenswrapper[4930]: E1012 06:57:13.136682 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:57:25 crc kubenswrapper[4930]: I1012 06:57:25.136501 4930 scope.go:117] "RemoveContainer" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0"
Oct 12 06:57:25 crc kubenswrapper[4930]: E1012 06:57:25.137566 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:57:39 crc kubenswrapper[4930]: I1012 06:57:39.135927 4930 scope.go:117] "RemoveContainer" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0"
Oct 12 06:57:39 crc kubenswrapper[4930]: E1012 06:57:39.138132 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:57:53 crc kubenswrapper[4930]: I1012 06:57:53.136130 4930 scope.go:117] "RemoveContainer" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0"
Oct 12 06:57:53 crc kubenswrapper[4930]: E1012 06:57:53.137171 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:58:08 crc kubenswrapper[4930]: I1012 06:58:08.144998 4930 scope.go:117] "RemoveContainer" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0"
Oct 12 06:58:08 crc kubenswrapper[4930]: E1012 06:58:08.145964 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:58:22 crc kubenswrapper[4930]: I1012 06:58:22.135638 4930 scope.go:117] "RemoveContainer" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0"
Oct 12 06:58:22 crc kubenswrapper[4930]: E1012 06:58:22.137402 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:58:33 crc kubenswrapper[4930]: I1012 06:58:33.136288 4930 scope.go:117] "RemoveContainer" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0"
Oct 12 06:58:33 crc kubenswrapper[4930]: E1012 06:58:33.137327 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:58:46 crc kubenswrapper[4930]: I1012 06:58:46.136440 4930 scope.go:117] "RemoveContainer" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0"
Oct 12 06:58:46 crc kubenswrapper[4930]: E1012 06:58:46.138273 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 06:58:59 crc kubenswrapper[4930]: I1012 06:58:59.135934 4930 scope.go:117] "RemoveContainer" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0"
Oct 12 06:58:59 crc kubenswrapper[4930]: E1012 06:58:59.136994 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:59:27 crc kubenswrapper[4930]: I1012 06:59:27.136450 4930 scope.go:117] "RemoveContainer" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0" Oct 12 06:59:27 crc kubenswrapper[4930]: E1012 06:59:27.137689 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:59:41 crc kubenswrapper[4930]: I1012 06:59:41.135659 4930 scope.go:117] "RemoveContainer" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0" Oct 12 06:59:41 crc kubenswrapper[4930]: E1012 06:59:41.136687 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 06:59:56 crc kubenswrapper[4930]: I1012 06:59:56.138841 4930 scope.go:117] "RemoveContainer" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0" Oct 12 06:59:56 crc kubenswrapper[4930]: E1012 06:59:56.140509 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:00:00 crc kubenswrapper[4930]: I1012 07:00:00.169680 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337540-vtwg4"] Oct 12 07:00:00 crc kubenswrapper[4930]: E1012 07:00:00.171064 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51dbda11-5c26-400b-9c49-c518594c0f78" containerName="registry-server" Oct 12 07:00:00 crc kubenswrapper[4930]: I1012 07:00:00.171089 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="51dbda11-5c26-400b-9c49-c518594c0f78" containerName="registry-server" Oct 12 07:00:00 crc kubenswrapper[4930]: E1012 07:00:00.171118 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51dbda11-5c26-400b-9c49-c518594c0f78" containerName="extract-utilities" Oct 12 07:00:00 crc kubenswrapper[4930]: I1012 07:00:00.171131 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="51dbda11-5c26-400b-9c49-c518594c0f78" containerName="extract-utilities" Oct 12 07:00:00 crc kubenswrapper[4930]: E1012 07:00:00.171157 4930 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="51dbda11-5c26-400b-9c49-c518594c0f78" containerName="extract-content" Oct 12 07:00:00 crc kubenswrapper[4930]: I1012 07:00:00.171170 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="51dbda11-5c26-400b-9c49-c518594c0f78" containerName="extract-content" Oct 12 07:00:00 crc kubenswrapper[4930]: I1012 07:00:00.171787 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="51dbda11-5c26-400b-9c49-c518594c0f78" containerName="registry-server" Oct 12 07:00:00 crc kubenswrapper[4930]: I1012 07:00:00.173300 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337540-vtwg4" Oct 12 07:00:00 crc kubenswrapper[4930]: I1012 07:00:00.175790 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 12 07:00:00 crc kubenswrapper[4930]: I1012 07:00:00.176414 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 12 07:00:00 crc kubenswrapper[4930]: I1012 07:00:00.186121 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337540-vtwg4"] Oct 12 07:00:00 crc kubenswrapper[4930]: I1012 07:00:00.268579 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jslvr\" (UniqueName: \"kubernetes.io/projected/4240ce6a-969c-43cb-bcbd-869f9f642535-kube-api-access-jslvr\") pod \"collect-profiles-29337540-vtwg4\" (UID: \"4240ce6a-969c-43cb-bcbd-869f9f642535\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337540-vtwg4" Oct 12 07:00:00 crc kubenswrapper[4930]: I1012 07:00:00.269120 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4240ce6a-969c-43cb-bcbd-869f9f642535-secret-volume\") pod \"collect-profiles-29337540-vtwg4\" (UID: \"4240ce6a-969c-43cb-bcbd-869f9f642535\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337540-vtwg4" Oct 12 07:00:00 crc kubenswrapper[4930]: I1012 07:00:00.269336 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4240ce6a-969c-43cb-bcbd-869f9f642535-config-volume\") pod \"collect-profiles-29337540-vtwg4\" (UID: \"4240ce6a-969c-43cb-bcbd-869f9f642535\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337540-vtwg4" Oct 12 07:00:00 crc kubenswrapper[4930]: I1012 07:00:00.371754 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jslvr\" (UniqueName: \"kubernetes.io/projected/4240ce6a-969c-43cb-bcbd-869f9f642535-kube-api-access-jslvr\") pod \"collect-profiles-29337540-vtwg4\" (UID: \"4240ce6a-969c-43cb-bcbd-869f9f642535\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337540-vtwg4" Oct 12 07:00:00 crc kubenswrapper[4930]: I1012 07:00:00.372134 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4240ce6a-969c-43cb-bcbd-869f9f642535-secret-volume\") pod \"collect-profiles-29337540-vtwg4\" (UID: \"4240ce6a-969c-43cb-bcbd-869f9f642535\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337540-vtwg4" Oct 12 07:00:00 crc kubenswrapper[4930]: I1012 07:00:00.372357 4930 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4240ce6a-969c-43cb-bcbd-869f9f642535-config-volume\") pod \"collect-profiles-29337540-vtwg4\" (UID: \"4240ce6a-969c-43cb-bcbd-869f9f642535\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337540-vtwg4" Oct 12 07:00:00 crc kubenswrapper[4930]: I1012 07:00:00.373992 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4240ce6a-969c-43cb-bcbd-869f9f642535-config-volume\") pod \"collect-profiles-29337540-vtwg4\" (UID: \"4240ce6a-969c-43cb-bcbd-869f9f642535\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337540-vtwg4" Oct 12 07:00:00 crc kubenswrapper[4930]: I1012 07:00:00.378437 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4240ce6a-969c-43cb-bcbd-869f9f642535-secret-volume\") pod \"collect-profiles-29337540-vtwg4\" (UID: \"4240ce6a-969c-43cb-bcbd-869f9f642535\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337540-vtwg4" Oct 12 07:00:00 crc kubenswrapper[4930]: I1012 07:00:00.397753 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jslvr\" (UniqueName: \"kubernetes.io/projected/4240ce6a-969c-43cb-bcbd-869f9f642535-kube-api-access-jslvr\") pod \"collect-profiles-29337540-vtwg4\" (UID: \"4240ce6a-969c-43cb-bcbd-869f9f642535\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337540-vtwg4" Oct 12 07:00:00 crc kubenswrapper[4930]: I1012 07:00:00.522062 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337540-vtwg4" Oct 12 07:00:01 crc kubenswrapper[4930]: I1012 07:00:01.058988 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337540-vtwg4"] Oct 12 07:00:01 crc kubenswrapper[4930]: I1012 07:00:01.511672 4930 generic.go:334] "Generic (PLEG): container finished" podID="4240ce6a-969c-43cb-bcbd-869f9f642535" containerID="c16fbda0672bfb1acdb4870fb8bf97365d557193cc5e0ae1077f97ed5eaf6cdb" exitCode=0 Oct 12 07:00:01 crc kubenswrapper[4930]: I1012 07:00:01.511986 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337540-vtwg4" event={"ID":"4240ce6a-969c-43cb-bcbd-869f9f642535","Type":"ContainerDied","Data":"c16fbda0672bfb1acdb4870fb8bf97365d557193cc5e0ae1077f97ed5eaf6cdb"} Oct 12 07:00:01 crc kubenswrapper[4930]: I1012 07:00:01.513188 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337540-vtwg4" event={"ID":"4240ce6a-969c-43cb-bcbd-869f9f642535","Type":"ContainerStarted","Data":"77ec40cecb6fb735f13c58c315b7267c00712fc31737c608cc7784792fb226dc"} Oct 12 07:00:02 crc kubenswrapper[4930]: I1012 07:00:02.945949 4930 util.go:48] "No ready sandbox for pod can be found. 
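Annotation: each volume for the collect-profiles pod goes through the same three steps above (operationExecutor.VerifyControllerAttachedVolume, MountVolume started, MountVolume.SetUp succeeded), and the gap between the last two is the per-volume setup cost. A small sketch pairing them up, assuming the same journal format; note these lines quote volume names with escaped \" characters, which the regexes below match literally.

import re

# The journal lines quote volume names as \"name\"; match the optional backslash.
STARTED = re.compile(r'MountVolume started for volume \\?"(?P<vol>[^"\\]+)')
SUCCEEDED = re.compile(r'MountVolume\.SetUp succeeded for volume \\?"(?P<vol>[^"\\]+)')
STAMP = re.compile(r'[IEW]\d{4} (\d{2}:\d{2}:\d{2}\.\d+)')

def mount_timeline(lines):
    """Map volume name -> (mount-started, setup-succeeded) wall-clock times."""
    begun, done = {}, {}
    for line in lines:
        t = STAMP.search(line)
        if not t:
            continue
        if (m := STARTED.search(line)):
            begun.setdefault(m['vol'], t.group(1))
        elif (m := SUCCEEDED.search(line)):
            done[m['vol']] = t.group(1)
    return {v: (begun[v], done[v]) for v in begun.keys() & done.keys()}

For the pod above this gives, for example, kube-api-access-jslvr started at 07:00:00.371754 and ready at 07:00:00.397753, i.e. about 26 ms of setup.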
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337540-vtwg4" Oct 12 07:00:03 crc kubenswrapper[4930]: I1012 07:00:03.039699 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4240ce6a-969c-43cb-bcbd-869f9f642535-secret-volume\") pod \"4240ce6a-969c-43cb-bcbd-869f9f642535\" (UID: \"4240ce6a-969c-43cb-bcbd-869f9f642535\") " Oct 12 07:00:03 crc kubenswrapper[4930]: I1012 07:00:03.039873 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jslvr\" (UniqueName: \"kubernetes.io/projected/4240ce6a-969c-43cb-bcbd-869f9f642535-kube-api-access-jslvr\") pod \"4240ce6a-969c-43cb-bcbd-869f9f642535\" (UID: \"4240ce6a-969c-43cb-bcbd-869f9f642535\") " Oct 12 07:00:03 crc kubenswrapper[4930]: I1012 07:00:03.039912 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4240ce6a-969c-43cb-bcbd-869f9f642535-config-volume\") pod \"4240ce6a-969c-43cb-bcbd-869f9f642535\" (UID: \"4240ce6a-969c-43cb-bcbd-869f9f642535\") " Oct 12 07:00:03 crc kubenswrapper[4930]: I1012 07:00:03.040503 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4240ce6a-969c-43cb-bcbd-869f9f642535-config-volume" (OuterVolumeSpecName: "config-volume") pod "4240ce6a-969c-43cb-bcbd-869f9f642535" (UID: "4240ce6a-969c-43cb-bcbd-869f9f642535"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:00:03 crc kubenswrapper[4930]: I1012 07:00:03.041180 4930 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4240ce6a-969c-43cb-bcbd-869f9f642535-config-volume\") on node \"crc\" DevicePath \"\"" Oct 12 07:00:03 crc kubenswrapper[4930]: I1012 07:00:03.049372 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4240ce6a-969c-43cb-bcbd-869f9f642535-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4240ce6a-969c-43cb-bcbd-869f9f642535" (UID: "4240ce6a-969c-43cb-bcbd-869f9f642535"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:00:03 crc kubenswrapper[4930]: I1012 07:00:03.049431 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4240ce6a-969c-43cb-bcbd-869f9f642535-kube-api-access-jslvr" (OuterVolumeSpecName: "kube-api-access-jslvr") pod "4240ce6a-969c-43cb-bcbd-869f9f642535" (UID: "4240ce6a-969c-43cb-bcbd-869f9f642535"). InnerVolumeSpecName "kube-api-access-jslvr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:00:03 crc kubenswrapper[4930]: I1012 07:00:03.142993 4930 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4240ce6a-969c-43cb-bcbd-869f9f642535-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 12 07:00:03 crc kubenswrapper[4930]: I1012 07:00:03.143023 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jslvr\" (UniqueName: \"kubernetes.io/projected/4240ce6a-969c-43cb-bcbd-869f9f642535-kube-api-access-jslvr\") on node \"crc\" DevicePath \"\"" Oct 12 07:00:03 crc kubenswrapper[4930]: I1012 07:00:03.536658 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337540-vtwg4" event={"ID":"4240ce6a-969c-43cb-bcbd-869f9f642535","Type":"ContainerDied","Data":"77ec40cecb6fb735f13c58c315b7267c00712fc31737c608cc7784792fb226dc"} Oct 12 07:00:03 crc kubenswrapper[4930]: I1012 07:00:03.537101 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77ec40cecb6fb735f13c58c315b7267c00712fc31737c608cc7784792fb226dc" Oct 12 07:00:03 crc kubenswrapper[4930]: I1012 07:00:03.536778 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337540-vtwg4" Oct 12 07:00:04 crc kubenswrapper[4930]: I1012 07:00:04.015491 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337495-8dkz5"] Oct 12 07:00:04 crc kubenswrapper[4930]: I1012 07:00:04.024396 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337495-8dkz5"] Oct 12 07:00:04 crc kubenswrapper[4930]: I1012 07:00:04.175539 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14f6608c-f601-4659-aef3-4764a8727ae8" path="/var/lib/kubelet/pods/14f6608c-f601-4659-aef3-4764a8727ae8/volumes" Oct 12 07:00:09 crc kubenswrapper[4930]: I1012 07:00:09.135668 4930 scope.go:117] "RemoveContainer" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0" Oct 12 07:00:09 crc kubenswrapper[4930]: E1012 07:00:09.136840 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:00:20 crc kubenswrapper[4930]: I1012 07:00:20.136194 4930 scope.go:117] "RemoveContainer" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0" Oct 12 07:00:20 crc kubenswrapper[4930]: E1012 07:00:20.137187 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:00:28 crc kubenswrapper[4930]: I1012 07:00:28.652973 4930 scope.go:117] "RemoveContainer" containerID="7a9a7f19b821a58a80c8eca69cb140dbf1a2febb15cb6baedec0eb613e93c210" Oct 12 
07:00:33 crc kubenswrapper[4930]: I1012 07:00:33.138246 4930 scope.go:117] "RemoveContainer" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0" Oct 12 07:00:33 crc kubenswrapper[4930]: E1012 07:00:33.140470 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:00:33 crc kubenswrapper[4930]: I1012 07:00:33.272288 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2tpkg"] Oct 12 07:00:33 crc kubenswrapper[4930]: E1012 07:00:33.272800 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4240ce6a-969c-43cb-bcbd-869f9f642535" containerName="collect-profiles" Oct 12 07:00:33 crc kubenswrapper[4930]: I1012 07:00:33.272821 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="4240ce6a-969c-43cb-bcbd-869f9f642535" containerName="collect-profiles" Oct 12 07:00:33 crc kubenswrapper[4930]: I1012 07:00:33.273119 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="4240ce6a-969c-43cb-bcbd-869f9f642535" containerName="collect-profiles" Oct 12 07:00:33 crc kubenswrapper[4930]: I1012 07:00:33.276068 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2tpkg" Oct 12 07:00:33 crc kubenswrapper[4930]: I1012 07:00:33.282939 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tpkg"] Oct 12 07:00:33 crc kubenswrapper[4930]: I1012 07:00:33.459036 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29c8d3f9-19a7-46ce-a0ba-71bec54e66e0-utilities\") pod \"redhat-marketplace-2tpkg\" (UID: \"29c8d3f9-19a7-46ce-a0ba-71bec54e66e0\") " pod="openshift-marketplace/redhat-marketplace-2tpkg" Oct 12 07:00:33 crc kubenswrapper[4930]: I1012 07:00:33.459364 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp8gv\" (UniqueName: \"kubernetes.io/projected/29c8d3f9-19a7-46ce-a0ba-71bec54e66e0-kube-api-access-tp8gv\") pod \"redhat-marketplace-2tpkg\" (UID: \"29c8d3f9-19a7-46ce-a0ba-71bec54e66e0\") " pod="openshift-marketplace/redhat-marketplace-2tpkg" Oct 12 07:00:33 crc kubenswrapper[4930]: I1012 07:00:33.459486 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29c8d3f9-19a7-46ce-a0ba-71bec54e66e0-catalog-content\") pod \"redhat-marketplace-2tpkg\" (UID: \"29c8d3f9-19a7-46ce-a0ba-71bec54e66e0\") " pod="openshift-marketplace/redhat-marketplace-2tpkg" Oct 12 07:00:33 crc kubenswrapper[4930]: I1012 07:00:33.561127 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29c8d3f9-19a7-46ce-a0ba-71bec54e66e0-utilities\") pod \"redhat-marketplace-2tpkg\" (UID: \"29c8d3f9-19a7-46ce-a0ba-71bec54e66e0\") " pod="openshift-marketplace/redhat-marketplace-2tpkg" Oct 12 07:00:33 crc kubenswrapper[4930]: I1012 07:00:33.561320 4930 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-tp8gv\" (UniqueName: \"kubernetes.io/projected/29c8d3f9-19a7-46ce-a0ba-71bec54e66e0-kube-api-access-tp8gv\") pod \"redhat-marketplace-2tpkg\" (UID: \"29c8d3f9-19a7-46ce-a0ba-71bec54e66e0\") " pod="openshift-marketplace/redhat-marketplace-2tpkg" Oct 12 07:00:33 crc kubenswrapper[4930]: I1012 07:00:33.561394 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29c8d3f9-19a7-46ce-a0ba-71bec54e66e0-catalog-content\") pod \"redhat-marketplace-2tpkg\" (UID: \"29c8d3f9-19a7-46ce-a0ba-71bec54e66e0\") " pod="openshift-marketplace/redhat-marketplace-2tpkg" Oct 12 07:00:33 crc kubenswrapper[4930]: I1012 07:00:33.562159 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29c8d3f9-19a7-46ce-a0ba-71bec54e66e0-catalog-content\") pod \"redhat-marketplace-2tpkg\" (UID: \"29c8d3f9-19a7-46ce-a0ba-71bec54e66e0\") " pod="openshift-marketplace/redhat-marketplace-2tpkg" Oct 12 07:00:33 crc kubenswrapper[4930]: I1012 07:00:33.562513 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29c8d3f9-19a7-46ce-a0ba-71bec54e66e0-utilities\") pod \"redhat-marketplace-2tpkg\" (UID: \"29c8d3f9-19a7-46ce-a0ba-71bec54e66e0\") " pod="openshift-marketplace/redhat-marketplace-2tpkg" Oct 12 07:00:33 crc kubenswrapper[4930]: I1012 07:00:33.592943 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp8gv\" (UniqueName: \"kubernetes.io/projected/29c8d3f9-19a7-46ce-a0ba-71bec54e66e0-kube-api-access-tp8gv\") pod \"redhat-marketplace-2tpkg\" (UID: \"29c8d3f9-19a7-46ce-a0ba-71bec54e66e0\") " pod="openshift-marketplace/redhat-marketplace-2tpkg" Oct 12 07:00:33 crc kubenswrapper[4930]: I1012 07:00:33.619224 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2tpkg" Oct 12 07:00:34 crc kubenswrapper[4930]: W1012 07:00:34.145266 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29c8d3f9_19a7_46ce_a0ba_71bec54e66e0.slice/crio-dfbb6d610e63dcb47cffcdb356baf36a4ec9a26d060fca30cfbc59183dab8a23 WatchSource:0}: Error finding container dfbb6d610e63dcb47cffcdb356baf36a4ec9a26d060fca30cfbc59183dab8a23: Status 404 returned error can't find the container with id dfbb6d610e63dcb47cffcdb356baf36a4ec9a26d060fca30cfbc59183dab8a23 Oct 12 07:00:34 crc kubenswrapper[4930]: I1012 07:00:34.148379 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tpkg"] Oct 12 07:00:34 crc kubenswrapper[4930]: I1012 07:00:34.902031 4930 generic.go:334] "Generic (PLEG): container finished" podID="29c8d3f9-19a7-46ce-a0ba-71bec54e66e0" containerID="076c76e97089f1cdead79659e146da8eb4992a0465ff256d81a0c88519c79503" exitCode=0 Oct 12 07:00:34 crc kubenswrapper[4930]: I1012 07:00:34.902158 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tpkg" event={"ID":"29c8d3f9-19a7-46ce-a0ba-71bec54e66e0","Type":"ContainerDied","Data":"076c76e97089f1cdead79659e146da8eb4992a0465ff256d81a0c88519c79503"} Oct 12 07:00:34 crc kubenswrapper[4930]: I1012 07:00:34.902317 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tpkg" event={"ID":"29c8d3f9-19a7-46ce-a0ba-71bec54e66e0","Type":"ContainerStarted","Data":"dfbb6d610e63dcb47cffcdb356baf36a4ec9a26d060fca30cfbc59183dab8a23"} Oct 12 07:00:34 crc kubenswrapper[4930]: I1012 07:00:34.907570 4930 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 07:00:36 crc kubenswrapper[4930]: I1012 07:00:36.926756 4930 generic.go:334] "Generic (PLEG): container finished" podID="29c8d3f9-19a7-46ce-a0ba-71bec54e66e0" containerID="9ad9265ff9997af9e053266631992b179d609a52b5e717236b3f963db231dc31" exitCode=0 Oct 12 07:00:36 crc kubenswrapper[4930]: I1012 07:00:36.926905 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tpkg" event={"ID":"29c8d3f9-19a7-46ce-a0ba-71bec54e66e0","Type":"ContainerDied","Data":"9ad9265ff9997af9e053266631992b179d609a52b5e717236b3f963db231dc31"} Oct 12 07:00:38 crc kubenswrapper[4930]: I1012 07:00:38.954548 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tpkg" event={"ID":"29c8d3f9-19a7-46ce-a0ba-71bec54e66e0","Type":"ContainerStarted","Data":"29915e426798e47c899f540d8b386c61ec1ea4dd0044356466bab037993411ce"} Oct 12 07:00:38 crc kubenswrapper[4930]: I1012 07:00:38.972415 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2tpkg" podStartSLOduration=3.206813245 podStartE2EDuration="5.97239927s" podCreationTimestamp="2025-10-12 07:00:33 +0000 UTC" firstStartedPulling="2025-10-12 07:00:34.90718485 +0000 UTC m=+4767.449286655" lastFinishedPulling="2025-10-12 07:00:37.672770885 +0000 UTC m=+4770.214872680" observedRunningTime="2025-10-12 07:00:38.970505224 +0000 UTC m=+4771.512606989" watchObservedRunningTime="2025-10-12 07:00:38.97239927 +0000 UTC m=+4771.514501025" Oct 12 07:00:43 crc kubenswrapper[4930]: I1012 07:00:43.619878 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-2tpkg" Oct 12 07:00:43 crc kubenswrapper[4930]: I1012 07:00:43.622188 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2tpkg" Oct 12 07:00:43 crc kubenswrapper[4930]: I1012 07:00:43.693997 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2tpkg" Oct 12 07:00:44 crc kubenswrapper[4930]: I1012 07:00:44.057039 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2tpkg" Oct 12 07:00:44 crc kubenswrapper[4930]: I1012 07:00:44.108136 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tpkg"] Oct 12 07:00:46 crc kubenswrapper[4930]: I1012 07:00:46.032590 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2tpkg" podUID="29c8d3f9-19a7-46ce-a0ba-71bec54e66e0" containerName="registry-server" containerID="cri-o://29915e426798e47c899f540d8b386c61ec1ea4dd0044356466bab037993411ce" gracePeriod=2 Oct 12 07:00:46 crc kubenswrapper[4930]: I1012 07:00:46.673214 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2tpkg" Oct 12 07:00:46 crc kubenswrapper[4930]: I1012 07:00:46.772944 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29c8d3f9-19a7-46ce-a0ba-71bec54e66e0-catalog-content\") pod \"29c8d3f9-19a7-46ce-a0ba-71bec54e66e0\" (UID: \"29c8d3f9-19a7-46ce-a0ba-71bec54e66e0\") " Oct 12 07:00:46 crc kubenswrapper[4930]: I1012 07:00:46.773044 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29c8d3f9-19a7-46ce-a0ba-71bec54e66e0-utilities\") pod \"29c8d3f9-19a7-46ce-a0ba-71bec54e66e0\" (UID: \"29c8d3f9-19a7-46ce-a0ba-71bec54e66e0\") " Oct 12 07:00:46 crc kubenswrapper[4930]: I1012 07:00:46.773171 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp8gv\" (UniqueName: \"kubernetes.io/projected/29c8d3f9-19a7-46ce-a0ba-71bec54e66e0-kube-api-access-tp8gv\") pod \"29c8d3f9-19a7-46ce-a0ba-71bec54e66e0\" (UID: \"29c8d3f9-19a7-46ce-a0ba-71bec54e66e0\") " Oct 12 07:00:46 crc kubenswrapper[4930]: I1012 07:00:46.774789 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29c8d3f9-19a7-46ce-a0ba-71bec54e66e0-utilities" (OuterVolumeSpecName: "utilities") pod "29c8d3f9-19a7-46ce-a0ba-71bec54e66e0" (UID: "29c8d3f9-19a7-46ce-a0ba-71bec54e66e0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:00:46 crc kubenswrapper[4930]: I1012 07:00:46.783037 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29c8d3f9-19a7-46ce-a0ba-71bec54e66e0-kube-api-access-tp8gv" (OuterVolumeSpecName: "kube-api-access-tp8gv") pod "29c8d3f9-19a7-46ce-a0ba-71bec54e66e0" (UID: "29c8d3f9-19a7-46ce-a0ba-71bec54e66e0"). InnerVolumeSpecName "kube-api-access-tp8gv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:00:46 crc kubenswrapper[4930]: I1012 07:00:46.805467 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29c8d3f9-19a7-46ce-a0ba-71bec54e66e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29c8d3f9-19a7-46ce-a0ba-71bec54e66e0" (UID: "29c8d3f9-19a7-46ce-a0ba-71bec54e66e0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:00:46 crc kubenswrapper[4930]: I1012 07:00:46.876943 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29c8d3f9-19a7-46ce-a0ba-71bec54e66e0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 07:00:46 crc kubenswrapper[4930]: I1012 07:00:46.877012 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29c8d3f9-19a7-46ce-a0ba-71bec54e66e0-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 07:00:46 crc kubenswrapper[4930]: I1012 07:00:46.877028 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp8gv\" (UniqueName: \"kubernetes.io/projected/29c8d3f9-19a7-46ce-a0ba-71bec54e66e0-kube-api-access-tp8gv\") on node \"crc\" DevicePath \"\"" Oct 12 07:00:47 crc kubenswrapper[4930]: I1012 07:00:47.048660 4930 generic.go:334] "Generic (PLEG): container finished" podID="29c8d3f9-19a7-46ce-a0ba-71bec54e66e0" containerID="29915e426798e47c899f540d8b386c61ec1ea4dd0044356466bab037993411ce" exitCode=0 Oct 12 07:00:47 crc kubenswrapper[4930]: I1012 07:00:47.048889 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tpkg" event={"ID":"29c8d3f9-19a7-46ce-a0ba-71bec54e66e0","Type":"ContainerDied","Data":"29915e426798e47c899f540d8b386c61ec1ea4dd0044356466bab037993411ce"} Oct 12 07:00:47 crc kubenswrapper[4930]: I1012 07:00:47.048994 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tpkg" event={"ID":"29c8d3f9-19a7-46ce-a0ba-71bec54e66e0","Type":"ContainerDied","Data":"dfbb6d610e63dcb47cffcdb356baf36a4ec9a26d060fca30cfbc59183dab8a23"} Oct 12 07:00:47 crc kubenswrapper[4930]: I1012 07:00:47.049002 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2tpkg" Oct 12 07:00:47 crc kubenswrapper[4930]: I1012 07:00:47.049045 4930 scope.go:117] "RemoveContainer" containerID="29915e426798e47c899f540d8b386c61ec1ea4dd0044356466bab037993411ce" Oct 12 07:00:47 crc kubenswrapper[4930]: I1012 07:00:47.084837 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tpkg"] Oct 12 07:00:47 crc kubenswrapper[4930]: I1012 07:00:47.094589 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tpkg"] Oct 12 07:00:47 crc kubenswrapper[4930]: I1012 07:00:47.102038 4930 scope.go:117] "RemoveContainer" containerID="9ad9265ff9997af9e053266631992b179d609a52b5e717236b3f963db231dc31" Oct 12 07:00:47 crc kubenswrapper[4930]: I1012 07:00:47.135341 4930 scope.go:117] "RemoveContainer" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0" Oct 12 07:00:47 crc kubenswrapper[4930]: I1012 07:00:47.140944 4930 scope.go:117] "RemoveContainer" containerID="076c76e97089f1cdead79659e146da8eb4992a0465ff256d81a0c88519c79503" Oct 12 07:00:47 crc kubenswrapper[4930]: I1012 07:00:47.193730 4930 scope.go:117] "RemoveContainer" containerID="29915e426798e47c899f540d8b386c61ec1ea4dd0044356466bab037993411ce" Oct 12 07:00:47 crc kubenswrapper[4930]: E1012 07:00:47.194256 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29915e426798e47c899f540d8b386c61ec1ea4dd0044356466bab037993411ce\": container with ID starting with 29915e426798e47c899f540d8b386c61ec1ea4dd0044356466bab037993411ce not found: ID does not exist" containerID="29915e426798e47c899f540d8b386c61ec1ea4dd0044356466bab037993411ce" Oct 12 07:00:47 crc kubenswrapper[4930]: I1012 07:00:47.194293 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29915e426798e47c899f540d8b386c61ec1ea4dd0044356466bab037993411ce"} err="failed to get container status \"29915e426798e47c899f540d8b386c61ec1ea4dd0044356466bab037993411ce\": rpc error: code = NotFound desc = could not find container \"29915e426798e47c899f540d8b386c61ec1ea4dd0044356466bab037993411ce\": container with ID starting with 29915e426798e47c899f540d8b386c61ec1ea4dd0044356466bab037993411ce not found: ID does not exist" Oct 12 07:00:47 crc kubenswrapper[4930]: I1012 07:00:47.194317 4930 scope.go:117] "RemoveContainer" containerID="9ad9265ff9997af9e053266631992b179d609a52b5e717236b3f963db231dc31" Oct 12 07:00:47 crc kubenswrapper[4930]: E1012 07:00:47.194619 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ad9265ff9997af9e053266631992b179d609a52b5e717236b3f963db231dc31\": container with ID starting with 9ad9265ff9997af9e053266631992b179d609a52b5e717236b3f963db231dc31 not found: ID does not exist" containerID="9ad9265ff9997af9e053266631992b179d609a52b5e717236b3f963db231dc31" Oct 12 07:00:47 crc kubenswrapper[4930]: I1012 07:00:47.194650 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ad9265ff9997af9e053266631992b179d609a52b5e717236b3f963db231dc31"} err="failed to get container status \"9ad9265ff9997af9e053266631992b179d609a52b5e717236b3f963db231dc31\": rpc error: code = NotFound desc = could not find container \"9ad9265ff9997af9e053266631992b179d609a52b5e717236b3f963db231dc31\": container with ID starting with 
9ad9265ff9997af9e053266631992b179d609a52b5e717236b3f963db231dc31 not found: ID does not exist" Oct 12 07:00:47 crc kubenswrapper[4930]: I1012 07:00:47.194667 4930 scope.go:117] "RemoveContainer" containerID="076c76e97089f1cdead79659e146da8eb4992a0465ff256d81a0c88519c79503" Oct 12 07:00:47 crc kubenswrapper[4930]: E1012 07:00:47.195055 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"076c76e97089f1cdead79659e146da8eb4992a0465ff256d81a0c88519c79503\": container with ID starting with 076c76e97089f1cdead79659e146da8eb4992a0465ff256d81a0c88519c79503 not found: ID does not exist" containerID="076c76e97089f1cdead79659e146da8eb4992a0465ff256d81a0c88519c79503" Oct 12 07:00:47 crc kubenswrapper[4930]: I1012 07:00:47.195084 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"076c76e97089f1cdead79659e146da8eb4992a0465ff256d81a0c88519c79503"} err="failed to get container status \"076c76e97089f1cdead79659e146da8eb4992a0465ff256d81a0c88519c79503\": rpc error: code = NotFound desc = could not find container \"076c76e97089f1cdead79659e146da8eb4992a0465ff256d81a0c88519c79503\": container with ID starting with 076c76e97089f1cdead79659e146da8eb4992a0465ff256d81a0c88519c79503 not found: ID does not exist" Oct 12 07:00:48 crc kubenswrapper[4930]: I1012 07:00:48.063992 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerStarted","Data":"9812bf7d8319d477322547486dc4d274e43fe7f8659ca19f2dccdabfb78ab503"} Oct 12 07:00:48 crc kubenswrapper[4930]: I1012 07:00:48.154144 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29c8d3f9-19a7-46ce-a0ba-71bec54e66e0" path="/var/lib/kubelet/pods/29c8d3f9-19a7-46ce-a0ba-71bec54e66e0/volumes" Oct 12 07:01:00 crc kubenswrapper[4930]: I1012 07:01:00.188109 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29337541-5db6s"] Oct 12 07:01:00 crc kubenswrapper[4930]: E1012 07:01:00.189907 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29c8d3f9-19a7-46ce-a0ba-71bec54e66e0" containerName="extract-utilities" Oct 12 07:01:00 crc kubenswrapper[4930]: I1012 07:01:00.189947 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="29c8d3f9-19a7-46ce-a0ba-71bec54e66e0" containerName="extract-utilities" Oct 12 07:01:00 crc kubenswrapper[4930]: E1012 07:01:00.190049 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29c8d3f9-19a7-46ce-a0ba-71bec54e66e0" containerName="extract-content" Oct 12 07:01:00 crc kubenswrapper[4930]: I1012 07:01:00.190068 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="29c8d3f9-19a7-46ce-a0ba-71bec54e66e0" containerName="extract-content" Oct 12 07:01:00 crc kubenswrapper[4930]: E1012 07:01:00.190107 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29c8d3f9-19a7-46ce-a0ba-71bec54e66e0" containerName="registry-server" Oct 12 07:01:00 crc kubenswrapper[4930]: I1012 07:01:00.190126 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="29c8d3f9-19a7-46ce-a0ba-71bec54e66e0" containerName="registry-server" Oct 12 07:01:00 crc kubenswrapper[4930]: I1012 07:01:00.190631 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="29c8d3f9-19a7-46ce-a0ba-71bec54e66e0" containerName="registry-server" Oct 12 07:01:00 crc kubenswrapper[4930]: I1012 07:01:00.192508 4930 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29337541-5db6s" Oct 12 07:01:00 crc kubenswrapper[4930]: I1012 07:01:00.215171 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29337541-5db6s"] Oct 12 07:01:00 crc kubenswrapper[4930]: I1012 07:01:00.220779 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgl86\" (UniqueName: \"kubernetes.io/projected/f9a44177-8b21-4b9b-8367-bdb45a60f379-kube-api-access-wgl86\") pod \"keystone-cron-29337541-5db6s\" (UID: \"f9a44177-8b21-4b9b-8367-bdb45a60f379\") " pod="openstack/keystone-cron-29337541-5db6s" Oct 12 07:01:00 crc kubenswrapper[4930]: I1012 07:01:00.220956 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9a44177-8b21-4b9b-8367-bdb45a60f379-combined-ca-bundle\") pod \"keystone-cron-29337541-5db6s\" (UID: \"f9a44177-8b21-4b9b-8367-bdb45a60f379\") " pod="openstack/keystone-cron-29337541-5db6s" Oct 12 07:01:00 crc kubenswrapper[4930]: I1012 07:01:00.221086 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9a44177-8b21-4b9b-8367-bdb45a60f379-config-data\") pod \"keystone-cron-29337541-5db6s\" (UID: \"f9a44177-8b21-4b9b-8367-bdb45a60f379\") " pod="openstack/keystone-cron-29337541-5db6s" Oct 12 07:01:00 crc kubenswrapper[4930]: I1012 07:01:00.221173 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f9a44177-8b21-4b9b-8367-bdb45a60f379-fernet-keys\") pod \"keystone-cron-29337541-5db6s\" (UID: \"f9a44177-8b21-4b9b-8367-bdb45a60f379\") " pod="openstack/keystone-cron-29337541-5db6s" Oct 12 07:01:00 crc kubenswrapper[4930]: I1012 07:01:00.322296 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9a44177-8b21-4b9b-8367-bdb45a60f379-config-data\") pod \"keystone-cron-29337541-5db6s\" (UID: \"f9a44177-8b21-4b9b-8367-bdb45a60f379\") " pod="openstack/keystone-cron-29337541-5db6s" Oct 12 07:01:00 crc kubenswrapper[4930]: I1012 07:01:00.322376 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f9a44177-8b21-4b9b-8367-bdb45a60f379-fernet-keys\") pod \"keystone-cron-29337541-5db6s\" (UID: \"f9a44177-8b21-4b9b-8367-bdb45a60f379\") " pod="openstack/keystone-cron-29337541-5db6s" Oct 12 07:01:00 crc kubenswrapper[4930]: I1012 07:01:00.322466 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgl86\" (UniqueName: \"kubernetes.io/projected/f9a44177-8b21-4b9b-8367-bdb45a60f379-kube-api-access-wgl86\") pod \"keystone-cron-29337541-5db6s\" (UID: \"f9a44177-8b21-4b9b-8367-bdb45a60f379\") " pod="openstack/keystone-cron-29337541-5db6s" Oct 12 07:01:00 crc kubenswrapper[4930]: I1012 07:01:00.322543 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9a44177-8b21-4b9b-8367-bdb45a60f379-combined-ca-bundle\") pod \"keystone-cron-29337541-5db6s\" (UID: \"f9a44177-8b21-4b9b-8367-bdb45a60f379\") " pod="openstack/keystone-cron-29337541-5db6s" Oct 12 07:01:00 crc kubenswrapper[4930]: I1012 07:01:00.331225 4930 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9a44177-8b21-4b9b-8367-bdb45a60f379-combined-ca-bundle\") pod \"keystone-cron-29337541-5db6s\" (UID: \"f9a44177-8b21-4b9b-8367-bdb45a60f379\") " pod="openstack/keystone-cron-29337541-5db6s" Oct 12 07:01:00 crc kubenswrapper[4930]: I1012 07:01:00.332671 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9a44177-8b21-4b9b-8367-bdb45a60f379-config-data\") pod \"keystone-cron-29337541-5db6s\" (UID: \"f9a44177-8b21-4b9b-8367-bdb45a60f379\") " pod="openstack/keystone-cron-29337541-5db6s" Oct 12 07:01:00 crc kubenswrapper[4930]: I1012 07:01:00.333425 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f9a44177-8b21-4b9b-8367-bdb45a60f379-fernet-keys\") pod \"keystone-cron-29337541-5db6s\" (UID: \"f9a44177-8b21-4b9b-8367-bdb45a60f379\") " pod="openstack/keystone-cron-29337541-5db6s" Oct 12 07:01:00 crc kubenswrapper[4930]: I1012 07:01:00.360553 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgl86\" (UniqueName: \"kubernetes.io/projected/f9a44177-8b21-4b9b-8367-bdb45a60f379-kube-api-access-wgl86\") pod \"keystone-cron-29337541-5db6s\" (UID: \"f9a44177-8b21-4b9b-8367-bdb45a60f379\") " pod="openstack/keystone-cron-29337541-5db6s" Oct 12 07:01:00 crc kubenswrapper[4930]: I1012 07:01:00.541551 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29337541-5db6s" Oct 12 07:01:01 crc kubenswrapper[4930]: I1012 07:01:01.040362 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29337541-5db6s"] Oct 12 07:01:01 crc kubenswrapper[4930]: I1012 07:01:01.256463 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29337541-5db6s" event={"ID":"f9a44177-8b21-4b9b-8367-bdb45a60f379","Type":"ContainerStarted","Data":"43d5cd27d6c96f91069c3e180a7221d137e5b7553895966914597f0c98f605e2"} Oct 12 07:01:02 crc kubenswrapper[4930]: I1012 07:01:02.276275 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29337541-5db6s" event={"ID":"f9a44177-8b21-4b9b-8367-bdb45a60f379","Type":"ContainerStarted","Data":"6cf27fc1a46cd16b7b1bf2fd9c3a47a35b1f0f94b4ee378e297f1380538c5096"} Oct 12 07:01:02 crc kubenswrapper[4930]: I1012 07:01:02.297233 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29337541-5db6s" podStartSLOduration=2.297211932 podStartE2EDuration="2.297211932s" podCreationTimestamp="2025-10-12 07:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:01:02.295784137 +0000 UTC m=+4794.837885982" watchObservedRunningTime="2025-10-12 07:01:02.297211932 +0000 UTC m=+4794.839313707" Oct 12 07:01:06 crc kubenswrapper[4930]: I1012 07:01:06.326591 4930 generic.go:334] "Generic (PLEG): container finished" podID="f9a44177-8b21-4b9b-8367-bdb45a60f379" containerID="6cf27fc1a46cd16b7b1bf2fd9c3a47a35b1f0f94b4ee378e297f1380538c5096" exitCode=0 Oct 12 07:01:06 crc kubenswrapper[4930]: I1012 07:01:06.326670 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29337541-5db6s" 
event={"ID":"f9a44177-8b21-4b9b-8367-bdb45a60f379","Type":"ContainerDied","Data":"6cf27fc1a46cd16b7b1bf2fd9c3a47a35b1f0f94b4ee378e297f1380538c5096"} Oct 12 07:01:08 crc kubenswrapper[4930]: I1012 07:01:08.273097 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29337541-5db6s" Oct 12 07:01:08 crc kubenswrapper[4930]: I1012 07:01:08.352963 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29337541-5db6s" event={"ID":"f9a44177-8b21-4b9b-8367-bdb45a60f379","Type":"ContainerDied","Data":"43d5cd27d6c96f91069c3e180a7221d137e5b7553895966914597f0c98f605e2"} Oct 12 07:01:08 crc kubenswrapper[4930]: I1012 07:01:08.353004 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43d5cd27d6c96f91069c3e180a7221d137e5b7553895966914597f0c98f605e2" Oct 12 07:01:08 crc kubenswrapper[4930]: I1012 07:01:08.353038 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29337541-5db6s" Oct 12 07:01:08 crc kubenswrapper[4930]: I1012 07:01:08.411835 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f9a44177-8b21-4b9b-8367-bdb45a60f379-fernet-keys\") pod \"f9a44177-8b21-4b9b-8367-bdb45a60f379\" (UID: \"f9a44177-8b21-4b9b-8367-bdb45a60f379\") " Oct 12 07:01:08 crc kubenswrapper[4930]: I1012 07:01:08.411950 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9a44177-8b21-4b9b-8367-bdb45a60f379-combined-ca-bundle\") pod \"f9a44177-8b21-4b9b-8367-bdb45a60f379\" (UID: \"f9a44177-8b21-4b9b-8367-bdb45a60f379\") " Oct 12 07:01:08 crc kubenswrapper[4930]: I1012 07:01:08.412081 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgl86\" (UniqueName: \"kubernetes.io/projected/f9a44177-8b21-4b9b-8367-bdb45a60f379-kube-api-access-wgl86\") pod \"f9a44177-8b21-4b9b-8367-bdb45a60f379\" (UID: \"f9a44177-8b21-4b9b-8367-bdb45a60f379\") " Oct 12 07:01:08 crc kubenswrapper[4930]: I1012 07:01:08.412209 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9a44177-8b21-4b9b-8367-bdb45a60f379-config-data\") pod \"f9a44177-8b21-4b9b-8367-bdb45a60f379\" (UID: \"f9a44177-8b21-4b9b-8367-bdb45a60f379\") " Oct 12 07:01:08 crc kubenswrapper[4930]: I1012 07:01:08.443491 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9a44177-8b21-4b9b-8367-bdb45a60f379-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f9a44177-8b21-4b9b-8367-bdb45a60f379" (UID: "f9a44177-8b21-4b9b-8367-bdb45a60f379"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:01:08 crc kubenswrapper[4930]: I1012 07:01:08.444192 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9a44177-8b21-4b9b-8367-bdb45a60f379-kube-api-access-wgl86" (OuterVolumeSpecName: "kube-api-access-wgl86") pod "f9a44177-8b21-4b9b-8367-bdb45a60f379" (UID: "f9a44177-8b21-4b9b-8367-bdb45a60f379"). InnerVolumeSpecName "kube-api-access-wgl86". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:01:08 crc kubenswrapper[4930]: I1012 07:01:08.451236 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9a44177-8b21-4b9b-8367-bdb45a60f379-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9a44177-8b21-4b9b-8367-bdb45a60f379" (UID: "f9a44177-8b21-4b9b-8367-bdb45a60f379"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:01:08 crc kubenswrapper[4930]: I1012 07:01:08.474121 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9a44177-8b21-4b9b-8367-bdb45a60f379-config-data" (OuterVolumeSpecName: "config-data") pod "f9a44177-8b21-4b9b-8367-bdb45a60f379" (UID: "f9a44177-8b21-4b9b-8367-bdb45a60f379"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:01:08 crc kubenswrapper[4930]: I1012 07:01:08.514791 4930 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9a44177-8b21-4b9b-8367-bdb45a60f379-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 12 07:01:08 crc kubenswrapper[4930]: I1012 07:01:08.514826 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgl86\" (UniqueName: \"kubernetes.io/projected/f9a44177-8b21-4b9b-8367-bdb45a60f379-kube-api-access-wgl86\") on node \"crc\" DevicePath \"\"" Oct 12 07:01:08 crc kubenswrapper[4930]: I1012 07:01:08.514836 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9a44177-8b21-4b9b-8367-bdb45a60f379-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:01:08 crc kubenswrapper[4930]: I1012 07:01:08.514845 4930 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f9a44177-8b21-4b9b-8367-bdb45a60f379-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 12 07:02:06 crc kubenswrapper[4930]: I1012 07:02:06.576293 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pd6gf"] Oct 12 07:02:06 crc kubenswrapper[4930]: E1012 07:02:06.577170 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a44177-8b21-4b9b-8367-bdb45a60f379" containerName="keystone-cron" Oct 12 07:02:06 crc kubenswrapper[4930]: I1012 07:02:06.577183 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a44177-8b21-4b9b-8367-bdb45a60f379" containerName="keystone-cron" Oct 12 07:02:06 crc kubenswrapper[4930]: I1012 07:02:06.577388 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9a44177-8b21-4b9b-8367-bdb45a60f379" containerName="keystone-cron" Oct 12 07:02:06 crc kubenswrapper[4930]: I1012 07:02:06.578766 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pd6gf" Oct 12 07:02:06 crc kubenswrapper[4930]: I1012 07:02:06.663465 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pd6gf"] Oct 12 07:02:06 crc kubenswrapper[4930]: I1012 07:02:06.678054 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d19823-bc67-45ff-851a-6bec1a19de1d-catalog-content\") pod \"community-operators-pd6gf\" (UID: \"44d19823-bc67-45ff-851a-6bec1a19de1d\") " pod="openshift-marketplace/community-operators-pd6gf" Oct 12 07:02:06 crc kubenswrapper[4930]: I1012 07:02:06.678117 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x6mp\" (UniqueName: \"kubernetes.io/projected/44d19823-bc67-45ff-851a-6bec1a19de1d-kube-api-access-2x6mp\") pod \"community-operators-pd6gf\" (UID: \"44d19823-bc67-45ff-851a-6bec1a19de1d\") " pod="openshift-marketplace/community-operators-pd6gf" Oct 12 07:02:06 crc kubenswrapper[4930]: I1012 07:02:06.678235 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44d19823-bc67-45ff-851a-6bec1a19de1d-utilities\") pod \"community-operators-pd6gf\" (UID: \"44d19823-bc67-45ff-851a-6bec1a19de1d\") " pod="openshift-marketplace/community-operators-pd6gf" Oct 12 07:02:06 crc kubenswrapper[4930]: I1012 07:02:06.779882 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44d19823-bc67-45ff-851a-6bec1a19de1d-utilities\") pod \"community-operators-pd6gf\" (UID: \"44d19823-bc67-45ff-851a-6bec1a19de1d\") " pod="openshift-marketplace/community-operators-pd6gf" Oct 12 07:02:06 crc kubenswrapper[4930]: I1012 07:02:06.779999 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d19823-bc67-45ff-851a-6bec1a19de1d-catalog-content\") pod \"community-operators-pd6gf\" (UID: \"44d19823-bc67-45ff-851a-6bec1a19de1d\") " pod="openshift-marketplace/community-operators-pd6gf" Oct 12 07:02:06 crc kubenswrapper[4930]: I1012 07:02:06.780052 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x6mp\" (UniqueName: \"kubernetes.io/projected/44d19823-bc67-45ff-851a-6bec1a19de1d-kube-api-access-2x6mp\") pod \"community-operators-pd6gf\" (UID: \"44d19823-bc67-45ff-851a-6bec1a19de1d\") " pod="openshift-marketplace/community-operators-pd6gf" Oct 12 07:02:06 crc kubenswrapper[4930]: I1012 07:02:06.780429 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44d19823-bc67-45ff-851a-6bec1a19de1d-utilities\") pod \"community-operators-pd6gf\" (UID: \"44d19823-bc67-45ff-851a-6bec1a19de1d\") " pod="openshift-marketplace/community-operators-pd6gf" Oct 12 07:02:06 crc kubenswrapper[4930]: I1012 07:02:06.780482 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d19823-bc67-45ff-851a-6bec1a19de1d-catalog-content\") pod \"community-operators-pd6gf\" (UID: \"44d19823-bc67-45ff-851a-6bec1a19de1d\") " pod="openshift-marketplace/community-operators-pd6gf" Oct 12 07:02:06 crc kubenswrapper[4930]: I1012 07:02:06.806580 4930 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2x6mp\" (UniqueName: \"kubernetes.io/projected/44d19823-bc67-45ff-851a-6bec1a19de1d-kube-api-access-2x6mp\") pod \"community-operators-pd6gf\" (UID: \"44d19823-bc67-45ff-851a-6bec1a19de1d\") " pod="openshift-marketplace/community-operators-pd6gf" Oct 12 07:02:06 crc kubenswrapper[4930]: I1012 07:02:06.903460 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pd6gf" Oct 12 07:02:07 crc kubenswrapper[4930]: I1012 07:02:07.451133 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pd6gf"] Oct 12 07:02:07 crc kubenswrapper[4930]: W1012 07:02:07.460937 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44d19823_bc67_45ff_851a_6bec1a19de1d.slice/crio-26b1ed6f7139b4ddae9878ceea158a2c0babc4e7e8f1cd5e76e01c18a69354c8 WatchSource:0}: Error finding container 26b1ed6f7139b4ddae9878ceea158a2c0babc4e7e8f1cd5e76e01c18a69354c8: Status 404 returned error can't find the container with id 26b1ed6f7139b4ddae9878ceea158a2c0babc4e7e8f1cd5e76e01c18a69354c8 Oct 12 07:02:08 crc kubenswrapper[4930]: I1012 07:02:08.144902 4930 generic.go:334] "Generic (PLEG): container finished" podID="44d19823-bc67-45ff-851a-6bec1a19de1d" containerID="34be9d56d9b2137be6ea1f62d4f00ba262a28d1669e3cb925d5d40bd86668e17" exitCode=0 Oct 12 07:02:08 crc kubenswrapper[4930]: I1012 07:02:08.158449 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pd6gf" event={"ID":"44d19823-bc67-45ff-851a-6bec1a19de1d","Type":"ContainerDied","Data":"34be9d56d9b2137be6ea1f62d4f00ba262a28d1669e3cb925d5d40bd86668e17"} Oct 12 07:02:08 crc kubenswrapper[4930]: I1012 07:02:08.158490 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pd6gf" event={"ID":"44d19823-bc67-45ff-851a-6bec1a19de1d","Type":"ContainerStarted","Data":"26b1ed6f7139b4ddae9878ceea158a2c0babc4e7e8f1cd5e76e01c18a69354c8"} Oct 12 07:02:09 crc kubenswrapper[4930]: I1012 07:02:09.164641 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pd6gf" event={"ID":"44d19823-bc67-45ff-851a-6bec1a19de1d","Type":"ContainerStarted","Data":"7ddafdeeaea4c492fe7a0a9b37e31631e61ca50690493b99a9dcc2bf4b8fcb16"} Oct 12 07:02:11 crc kubenswrapper[4930]: I1012 07:02:11.188729 4930 generic.go:334] "Generic (PLEG): container finished" podID="44d19823-bc67-45ff-851a-6bec1a19de1d" containerID="7ddafdeeaea4c492fe7a0a9b37e31631e61ca50690493b99a9dcc2bf4b8fcb16" exitCode=0 Oct 12 07:02:11 crc kubenswrapper[4930]: I1012 07:02:11.188775 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pd6gf" event={"ID":"44d19823-bc67-45ff-851a-6bec1a19de1d","Type":"ContainerDied","Data":"7ddafdeeaea4c492fe7a0a9b37e31631e61ca50690493b99a9dcc2bf4b8fcb16"} Oct 12 07:02:12 crc kubenswrapper[4930]: I1012 07:02:12.203541 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pd6gf" event={"ID":"44d19823-bc67-45ff-851a-6bec1a19de1d","Type":"ContainerStarted","Data":"72a4008c096d1577aa549234599eb4492315cfe87b76496b352394fdfaff5aa9"} Oct 12 07:02:12 crc kubenswrapper[4930]: I1012 07:02:12.232755 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pd6gf" 
Oct 12 07:02:16 crc kubenswrapper[4930]: I1012 07:02:16.904558 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pd6gf"
Oct 12 07:02:16 crc kubenswrapper[4930]: I1012 07:02:16.905996 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pd6gf"
Oct 12 07:02:17 crc kubenswrapper[4930]: I1012 07:02:17.613874 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pd6gf"
Oct 12 07:02:17 crc kubenswrapper[4930]: I1012 07:02:17.687117 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pd6gf"
Oct 12 07:02:17 crc kubenswrapper[4930]: I1012 07:02:17.866343 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pd6gf"]
Oct 12 07:02:19 crc kubenswrapper[4930]: I1012 07:02:19.291131 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pd6gf" podUID="44d19823-bc67-45ff-851a-6bec1a19de1d" containerName="registry-server" containerID="cri-o://72a4008c096d1577aa549234599eb4492315cfe87b76496b352394fdfaff5aa9" gracePeriod=2
Oct 12 07:02:19 crc kubenswrapper[4930]: I1012 07:02:19.836986 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pd6gf"
Oct 12 07:02:19 crc kubenswrapper[4930]: I1012 07:02:19.989454 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x6mp\" (UniqueName: \"kubernetes.io/projected/44d19823-bc67-45ff-851a-6bec1a19de1d-kube-api-access-2x6mp\") pod \"44d19823-bc67-45ff-851a-6bec1a19de1d\" (UID: \"44d19823-bc67-45ff-851a-6bec1a19de1d\") "
Oct 12 07:02:19 crc kubenswrapper[4930]: I1012 07:02:19.989496 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d19823-bc67-45ff-851a-6bec1a19de1d-catalog-content\") pod \"44d19823-bc67-45ff-851a-6bec1a19de1d\" (UID: \"44d19823-bc67-45ff-851a-6bec1a19de1d\") "
Oct 12 07:02:19 crc kubenswrapper[4930]: I1012 07:02:19.989671 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44d19823-bc67-45ff-851a-6bec1a19de1d-utilities\") pod \"44d19823-bc67-45ff-851a-6bec1a19de1d\" (UID: \"44d19823-bc67-45ff-851a-6bec1a19de1d\") "
Oct 12 07:02:19 crc kubenswrapper[4930]: I1012 07:02:19.990669 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44d19823-bc67-45ff-851a-6bec1a19de1d-utilities" (OuterVolumeSpecName: "utilities") pod "44d19823-bc67-45ff-851a-6bec1a19de1d" (UID: "44d19823-bc67-45ff-851a-6bec1a19de1d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
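gracePeriod=2 in the "Killing container with a grace period" record above is the pod's termination grace period: the runtime delivers SIGTERM, waits out the grace period, then SIGKILLs whatever is left. A toy process-level sketch of that sequence; the kubelet actually performs it through the CRI StopContainer call, not by signalling processes itself:

package main

import (
	"os/exec"
	"syscall"
	"time"
)

// stopWithGrace sends SIGTERM, gives the process the grace period to exit,
// then falls back to SIGKILL. (Illustrative only.)
func stopWithGrace(cmd *exec.Cmd, grace time.Duration) {
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()

	cmd.Process.Signal(syscall.SIGTERM)
	select {
	case <-done: // exited within the grace period
	case <-time.After(grace):
		cmd.Process.Kill() // grace expired: SIGKILL
		<-done
	}
}

func main() {
	cmd := exec.Command("sleep", "60")
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	stopWithGrace(cmd, 2*time.Second) // gracePeriod=2, as for registry-server above
}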
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:02:20 crc kubenswrapper[4930]: I1012 07:02:20.001536 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44d19823-bc67-45ff-851a-6bec1a19de1d-kube-api-access-2x6mp" (OuterVolumeSpecName: "kube-api-access-2x6mp") pod "44d19823-bc67-45ff-851a-6bec1a19de1d" (UID: "44d19823-bc67-45ff-851a-6bec1a19de1d"). InnerVolumeSpecName "kube-api-access-2x6mp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:02:20 crc kubenswrapper[4930]: I1012 07:02:20.038263 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44d19823-bc67-45ff-851a-6bec1a19de1d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44d19823-bc67-45ff-851a-6bec1a19de1d" (UID: "44d19823-bc67-45ff-851a-6bec1a19de1d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:02:20 crc kubenswrapper[4930]: I1012 07:02:20.091212 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44d19823-bc67-45ff-851a-6bec1a19de1d-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 07:02:20 crc kubenswrapper[4930]: I1012 07:02:20.091246 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x6mp\" (UniqueName: \"kubernetes.io/projected/44d19823-bc67-45ff-851a-6bec1a19de1d-kube-api-access-2x6mp\") on node \"crc\" DevicePath \"\"" Oct 12 07:02:20 crc kubenswrapper[4930]: I1012 07:02:20.091255 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d19823-bc67-45ff-851a-6bec1a19de1d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 07:02:20 crc kubenswrapper[4930]: I1012 07:02:20.310103 4930 generic.go:334] "Generic (PLEG): container finished" podID="44d19823-bc67-45ff-851a-6bec1a19de1d" containerID="72a4008c096d1577aa549234599eb4492315cfe87b76496b352394fdfaff5aa9" exitCode=0 Oct 12 07:02:20 crc kubenswrapper[4930]: I1012 07:02:20.310147 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pd6gf" event={"ID":"44d19823-bc67-45ff-851a-6bec1a19de1d","Type":"ContainerDied","Data":"72a4008c096d1577aa549234599eb4492315cfe87b76496b352394fdfaff5aa9"} Oct 12 07:02:20 crc kubenswrapper[4930]: I1012 07:02:20.310177 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pd6gf" event={"ID":"44d19823-bc67-45ff-851a-6bec1a19de1d","Type":"ContainerDied","Data":"26b1ed6f7139b4ddae9878ceea158a2c0babc4e7e8f1cd5e76e01c18a69354c8"} Oct 12 07:02:20 crc kubenswrapper[4930]: I1012 07:02:20.310199 4930 scope.go:117] "RemoveContainer" containerID="72a4008c096d1577aa549234599eb4492315cfe87b76496b352394fdfaff5aa9" Oct 12 07:02:20 crc kubenswrapper[4930]: I1012 07:02:20.310353 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pd6gf" Oct 12 07:02:20 crc kubenswrapper[4930]: I1012 07:02:20.345380 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pd6gf"] Oct 12 07:02:20 crc kubenswrapper[4930]: I1012 07:02:20.358173 4930 scope.go:117] "RemoveContainer" containerID="7ddafdeeaea4c492fe7a0a9b37e31631e61ca50690493b99a9dcc2bf4b8fcb16" Oct 12 07:02:20 crc kubenswrapper[4930]: I1012 07:02:20.361183 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pd6gf"] Oct 12 07:02:20 crc kubenswrapper[4930]: I1012 07:02:20.403548 4930 scope.go:117] "RemoveContainer" containerID="34be9d56d9b2137be6ea1f62d4f00ba262a28d1669e3cb925d5d40bd86668e17" Oct 12 07:02:20 crc kubenswrapper[4930]: I1012 07:02:20.457567 4930 scope.go:117] "RemoveContainer" containerID="72a4008c096d1577aa549234599eb4492315cfe87b76496b352394fdfaff5aa9" Oct 12 07:02:20 crc kubenswrapper[4930]: E1012 07:02:20.459173 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72a4008c096d1577aa549234599eb4492315cfe87b76496b352394fdfaff5aa9\": container with ID starting with 72a4008c096d1577aa549234599eb4492315cfe87b76496b352394fdfaff5aa9 not found: ID does not exist" containerID="72a4008c096d1577aa549234599eb4492315cfe87b76496b352394fdfaff5aa9" Oct 12 07:02:20 crc kubenswrapper[4930]: I1012 07:02:20.459211 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72a4008c096d1577aa549234599eb4492315cfe87b76496b352394fdfaff5aa9"} err="failed to get container status \"72a4008c096d1577aa549234599eb4492315cfe87b76496b352394fdfaff5aa9\": rpc error: code = NotFound desc = could not find container \"72a4008c096d1577aa549234599eb4492315cfe87b76496b352394fdfaff5aa9\": container with ID starting with 72a4008c096d1577aa549234599eb4492315cfe87b76496b352394fdfaff5aa9 not found: ID does not exist" Oct 12 07:02:20 crc kubenswrapper[4930]: I1012 07:02:20.459234 4930 scope.go:117] "RemoveContainer" containerID="7ddafdeeaea4c492fe7a0a9b37e31631e61ca50690493b99a9dcc2bf4b8fcb16" Oct 12 07:02:20 crc kubenswrapper[4930]: E1012 07:02:20.459918 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ddafdeeaea4c492fe7a0a9b37e31631e61ca50690493b99a9dcc2bf4b8fcb16\": container with ID starting with 7ddafdeeaea4c492fe7a0a9b37e31631e61ca50690493b99a9dcc2bf4b8fcb16 not found: ID does not exist" containerID="7ddafdeeaea4c492fe7a0a9b37e31631e61ca50690493b99a9dcc2bf4b8fcb16" Oct 12 07:02:20 crc kubenswrapper[4930]: I1012 07:02:20.459973 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ddafdeeaea4c492fe7a0a9b37e31631e61ca50690493b99a9dcc2bf4b8fcb16"} err="failed to get container status \"7ddafdeeaea4c492fe7a0a9b37e31631e61ca50690493b99a9dcc2bf4b8fcb16\": rpc error: code = NotFound desc = could not find container \"7ddafdeeaea4c492fe7a0a9b37e31631e61ca50690493b99a9dcc2bf4b8fcb16\": container with ID starting with 7ddafdeeaea4c492fe7a0a9b37e31631e61ca50690493b99a9dcc2bf4b8fcb16 not found: ID does not exist" Oct 12 07:02:20 crc kubenswrapper[4930]: I1012 07:02:20.460004 4930 scope.go:117] "RemoveContainer" containerID="34be9d56d9b2137be6ea1f62d4f00ba262a28d1669e3cb925d5d40bd86668e17" Oct 12 07:02:20 crc kubenswrapper[4930]: E1012 07:02:20.460327 4930 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"34be9d56d9b2137be6ea1f62d4f00ba262a28d1669e3cb925d5d40bd86668e17\": container with ID starting with 34be9d56d9b2137be6ea1f62d4f00ba262a28d1669e3cb925d5d40bd86668e17 not found: ID does not exist" containerID="34be9d56d9b2137be6ea1f62d4f00ba262a28d1669e3cb925d5d40bd86668e17" Oct 12 07:02:20 crc kubenswrapper[4930]: I1012 07:02:20.460354 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34be9d56d9b2137be6ea1f62d4f00ba262a28d1669e3cb925d5d40bd86668e17"} err="failed to get container status \"34be9d56d9b2137be6ea1f62d4f00ba262a28d1669e3cb925d5d40bd86668e17\": rpc error: code = NotFound desc = could not find container \"34be9d56d9b2137be6ea1f62d4f00ba262a28d1669e3cb925d5d40bd86668e17\": container with ID starting with 34be9d56d9b2137be6ea1f62d4f00ba262a28d1669e3cb925d5d40bd86668e17 not found: ID does not exist" Oct 12 07:02:22 crc kubenswrapper[4930]: I1012 07:02:22.146039 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44d19823-bc67-45ff-851a-6bec1a19de1d" path="/var/lib/kubelet/pods/44d19823-bc67-45ff-851a-6bec1a19de1d/volumes" Oct 12 07:02:33 crc kubenswrapper[4930]: I1012 07:02:33.592775 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qfw8k"] Oct 12 07:02:33 crc kubenswrapper[4930]: E1012 07:02:33.593640 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d19823-bc67-45ff-851a-6bec1a19de1d" containerName="extract-content" Oct 12 07:02:33 crc kubenswrapper[4930]: I1012 07:02:33.593653 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d19823-bc67-45ff-851a-6bec1a19de1d" containerName="extract-content" Oct 12 07:02:33 crc kubenswrapper[4930]: E1012 07:02:33.593666 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d19823-bc67-45ff-851a-6bec1a19de1d" containerName="registry-server" Oct 12 07:02:33 crc kubenswrapper[4930]: I1012 07:02:33.593674 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d19823-bc67-45ff-851a-6bec1a19de1d" containerName="registry-server" Oct 12 07:02:33 crc kubenswrapper[4930]: E1012 07:02:33.593688 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d19823-bc67-45ff-851a-6bec1a19de1d" containerName="extract-utilities" Oct 12 07:02:33 crc kubenswrapper[4930]: I1012 07:02:33.593695 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d19823-bc67-45ff-851a-6bec1a19de1d" containerName="extract-utilities" Oct 12 07:02:33 crc kubenswrapper[4930]: I1012 07:02:33.593940 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d19823-bc67-45ff-851a-6bec1a19de1d" containerName="registry-server" Oct 12 07:02:33 crc kubenswrapper[4930]: I1012 07:02:33.595395 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qfw8k" Oct 12 07:02:33 crc kubenswrapper[4930]: I1012 07:02:33.616008 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qfw8k"] Oct 12 07:02:33 crc kubenswrapper[4930]: I1012 07:02:33.726442 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9krzh\" (UniqueName: \"kubernetes.io/projected/2a733cc2-49dd-46af-ad96-e3dce973bedf-kube-api-access-9krzh\") pod \"redhat-operators-qfw8k\" (UID: \"2a733cc2-49dd-46af-ad96-e3dce973bedf\") " pod="openshift-marketplace/redhat-operators-qfw8k" Oct 12 07:02:33 crc kubenswrapper[4930]: I1012 07:02:33.726714 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a733cc2-49dd-46af-ad96-e3dce973bedf-utilities\") pod \"redhat-operators-qfw8k\" (UID: \"2a733cc2-49dd-46af-ad96-e3dce973bedf\") " pod="openshift-marketplace/redhat-operators-qfw8k" Oct 12 07:02:33 crc kubenswrapper[4930]: I1012 07:02:33.726771 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a733cc2-49dd-46af-ad96-e3dce973bedf-catalog-content\") pod \"redhat-operators-qfw8k\" (UID: \"2a733cc2-49dd-46af-ad96-e3dce973bedf\") " pod="openshift-marketplace/redhat-operators-qfw8k" Oct 12 07:02:33 crc kubenswrapper[4930]: I1012 07:02:33.829422 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a733cc2-49dd-46af-ad96-e3dce973bedf-utilities\") pod \"redhat-operators-qfw8k\" (UID: \"2a733cc2-49dd-46af-ad96-e3dce973bedf\") " pod="openshift-marketplace/redhat-operators-qfw8k" Oct 12 07:02:33 crc kubenswrapper[4930]: I1012 07:02:33.829767 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a733cc2-49dd-46af-ad96-e3dce973bedf-catalog-content\") pod \"redhat-operators-qfw8k\" (UID: \"2a733cc2-49dd-46af-ad96-e3dce973bedf\") " pod="openshift-marketplace/redhat-operators-qfw8k" Oct 12 07:02:33 crc kubenswrapper[4930]: I1012 07:02:33.829837 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9krzh\" (UniqueName: \"kubernetes.io/projected/2a733cc2-49dd-46af-ad96-e3dce973bedf-kube-api-access-9krzh\") pod \"redhat-operators-qfw8k\" (UID: \"2a733cc2-49dd-46af-ad96-e3dce973bedf\") " pod="openshift-marketplace/redhat-operators-qfw8k" Oct 12 07:02:33 crc kubenswrapper[4930]: I1012 07:02:33.830935 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a733cc2-49dd-46af-ad96-e3dce973bedf-utilities\") pod \"redhat-operators-qfw8k\" (UID: \"2a733cc2-49dd-46af-ad96-e3dce973bedf\") " pod="openshift-marketplace/redhat-operators-qfw8k" Oct 12 07:02:33 crc kubenswrapper[4930]: I1012 07:02:33.830961 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a733cc2-49dd-46af-ad96-e3dce973bedf-catalog-content\") pod \"redhat-operators-qfw8k\" (UID: \"2a733cc2-49dd-46af-ad96-e3dce973bedf\") " pod="openshift-marketplace/redhat-operators-qfw8k" Oct 12 07:02:33 crc kubenswrapper[4930]: I1012 07:02:33.862635 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
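All three marketplace catalog pods in this log follow the same shape: extract-utilities and extract-content init containers populate two shared emptyDir volumes, then registry-server serves the catalog over gRPC on port 50051 (the port visible in the probe output further down). Roughly what that spec looks like as Go API types; image names and other field values are illustrative, not taken from the log, and the sketch assumes the k8s.io/api module:

package main

import corev1 "k8s.io/api/core/v1"

// catalogPod reconstructs the shape of an OLM catalog pod from the events
// in this log: two init containers, a gRPC registry server, two emptyDirs.
func catalogPod() *corev1.Pod {
	emptyDir := corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}
	return &corev1.Pod{
		Spec: corev1.PodSpec{
			Volumes: []corev1.Volume{
				{Name: "utilities", VolumeSource: emptyDir},
				{Name: "catalog-content", VolumeSource: emptyDir},
			},
			InitContainers: []corev1.Container{
				{Name: "extract-utilities", Image: "example/utilities:latest"},
				{Name: "extract-content", Image: "example/catalog:latest"},
			},
			Containers: []corev1.Container{
				{
					Name:  "registry-server",
					Image: "example/catalog:latest",
					Ports: []corev1.ContainerPort{{Name: "grpc", ContainerPort: 50051}},
				},
			},
		},
	}
}

func main() { _ = catalogPod() }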
\"kube-api-access-9krzh\" (UniqueName: \"kubernetes.io/projected/2a733cc2-49dd-46af-ad96-e3dce973bedf-kube-api-access-9krzh\") pod \"redhat-operators-qfw8k\" (UID: \"2a733cc2-49dd-46af-ad96-e3dce973bedf\") " pod="openshift-marketplace/redhat-operators-qfw8k" Oct 12 07:02:33 crc kubenswrapper[4930]: I1012 07:02:33.924451 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qfw8k" Oct 12 07:02:34 crc kubenswrapper[4930]: I1012 07:02:34.426398 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qfw8k"] Oct 12 07:02:34 crc kubenswrapper[4930]: I1012 07:02:34.482554 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qfw8k" event={"ID":"2a733cc2-49dd-46af-ad96-e3dce973bedf","Type":"ContainerStarted","Data":"d0e4f0806d3b64bd5fc9ce09366ae8e50c6eeffa319d8c2462fcd0ae89997429"} Oct 12 07:02:35 crc kubenswrapper[4930]: I1012 07:02:35.494544 4930 generic.go:334] "Generic (PLEG): container finished" podID="2a733cc2-49dd-46af-ad96-e3dce973bedf" containerID="d3ee61c8d492aae7fc66e389b370ad36e95bbd2bb54fdf83c6bc8421f1f8f76c" exitCode=0 Oct 12 07:02:35 crc kubenswrapper[4930]: I1012 07:02:35.494647 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qfw8k" event={"ID":"2a733cc2-49dd-46af-ad96-e3dce973bedf","Type":"ContainerDied","Data":"d3ee61c8d492aae7fc66e389b370ad36e95bbd2bb54fdf83c6bc8421f1f8f76c"} Oct 12 07:02:37 crc kubenswrapper[4930]: I1012 07:02:37.524234 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qfw8k" event={"ID":"2a733cc2-49dd-46af-ad96-e3dce973bedf","Type":"ContainerStarted","Data":"39bf51d24891434484623e2dd6efd9d295c8d8e22c8dbc3248ac47928cac47df"} Oct 12 07:02:40 crc kubenswrapper[4930]: I1012 07:02:40.560262 4930 generic.go:334] "Generic (PLEG): container finished" podID="2a733cc2-49dd-46af-ad96-e3dce973bedf" containerID="39bf51d24891434484623e2dd6efd9d295c8d8e22c8dbc3248ac47928cac47df" exitCode=0 Oct 12 07:02:40 crc kubenswrapper[4930]: I1012 07:02:40.560308 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qfw8k" event={"ID":"2a733cc2-49dd-46af-ad96-e3dce973bedf","Type":"ContainerDied","Data":"39bf51d24891434484623e2dd6efd9d295c8d8e22c8dbc3248ac47928cac47df"} Oct 12 07:02:41 crc kubenswrapper[4930]: I1012 07:02:41.575877 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qfw8k" event={"ID":"2a733cc2-49dd-46af-ad96-e3dce973bedf","Type":"ContainerStarted","Data":"4070fb7c9f1ae3479d41bf94a35db62cdb963649096290ec9fbabb43da5ee06a"} Oct 12 07:02:41 crc kubenswrapper[4930]: I1012 07:02:41.601680 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qfw8k" podStartSLOduration=3.140194905 podStartE2EDuration="8.601656412s" podCreationTimestamp="2025-10-12 07:02:33 +0000 UTC" firstStartedPulling="2025-10-12 07:02:35.497361449 +0000 UTC m=+4888.039463254" lastFinishedPulling="2025-10-12 07:02:40.958822986 +0000 UTC m=+4893.500924761" observedRunningTime="2025-10-12 07:02:41.596053015 +0000 UTC m=+4894.138154840" watchObservedRunningTime="2025-10-12 07:02:41.601656412 +0000 UTC m=+4894.143758197" Oct 12 07:02:43 crc kubenswrapper[4930]: I1012 07:02:43.924513 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qfw8k" Oct 12 
Oct 12 07:02:43 crc kubenswrapper[4930]: I1012 07:02:43.924826 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qfw8k"
Oct 12 07:02:44 crc kubenswrapper[4930]: I1012 07:02:44.975520 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qfw8k" podUID="2a733cc2-49dd-46af-ad96-e3dce973bedf" containerName="registry-server" probeResult="failure" output=<
Oct 12 07:02:44 crc kubenswrapper[4930]: 	timeout: failed to connect service ":50051" within 1s
Oct 12 07:02:44 crc kubenswrapper[4930]: >
Oct 12 07:02:54 crc kubenswrapper[4930]: I1012 07:02:54.156417 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qfw8k"
Oct 12 07:02:54 crc kubenswrapper[4930]: I1012 07:02:54.224047 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qfw8k"
Oct 12 07:02:54 crc kubenswrapper[4930]: I1012 07:02:54.407055 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qfw8k"]
Oct 12 07:02:55 crc kubenswrapper[4930]: I1012 07:02:55.737945 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qfw8k" podUID="2a733cc2-49dd-46af-ad96-e3dce973bedf" containerName="registry-server" containerID="cri-o://4070fb7c9f1ae3479d41bf94a35db62cdb963649096290ec9fbabb43da5ee06a" gracePeriod=2
Oct 12 07:02:56 crc kubenswrapper[4930]: I1012 07:02:56.374156 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qfw8k"
Oct 12 07:02:56 crc kubenswrapper[4930]: I1012 07:02:56.542245 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9krzh\" (UniqueName: \"kubernetes.io/projected/2a733cc2-49dd-46af-ad96-e3dce973bedf-kube-api-access-9krzh\") pod \"2a733cc2-49dd-46af-ad96-e3dce973bedf\" (UID: \"2a733cc2-49dd-46af-ad96-e3dce973bedf\") "
Oct 12 07:02:56 crc kubenswrapper[4930]: I1012 07:02:56.542408 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a733cc2-49dd-46af-ad96-e3dce973bedf-utilities\") pod \"2a733cc2-49dd-46af-ad96-e3dce973bedf\" (UID: \"2a733cc2-49dd-46af-ad96-e3dce973bedf\") "
Oct 12 07:02:56 crc kubenswrapper[4930]: I1012 07:02:56.542468 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a733cc2-49dd-46af-ad96-e3dce973bedf-catalog-content\") pod \"2a733cc2-49dd-46af-ad96-e3dce973bedf\" (UID: \"2a733cc2-49dd-46af-ad96-e3dce973bedf\") "
Oct 12 07:02:56 crc kubenswrapper[4930]: I1012 07:02:56.545415 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a733cc2-49dd-46af-ad96-e3dce973bedf-utilities" (OuterVolumeSpecName: "utilities") pod "2a733cc2-49dd-46af-ad96-e3dce973bedf" (UID: "2a733cc2-49dd-46af-ad96-e3dce973bedf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
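The probe output above shows a check against the registry-server's gRPC port with a 1s timeout; the pod was still unpacking the catalog, so the first attempt timed out and a later one succeeded. A minimal stand-in that mirrors only the connect-with-timeout behavior (the real probe is a gRPC health check, not a bare TCP dial):

package main

import (
	"fmt"
	"net"
	"time"
)

// probe attempts a TCP connection to the registry-server's port with the
// same 1s deadline the log output shows.
func probe(addr string, timeout time.Duration) error {
	conn, err := net.DialTimeout("tcp", addr, timeout)
	if err != nil {
		return fmt.Errorf("timeout: failed to connect service %q within %s", addr, timeout)
	}
	conn.Close()
	return nil
}

func main() {
	if err := probe(":50051", time.Second); err != nil {
		fmt.Println(err) // same failure shape as the probe output above
	}
}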
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:02:56 crc kubenswrapper[4930]: I1012 07:02:56.551888 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a733cc2-49dd-46af-ad96-e3dce973bedf-kube-api-access-9krzh" (OuterVolumeSpecName: "kube-api-access-9krzh") pod "2a733cc2-49dd-46af-ad96-e3dce973bedf" (UID: "2a733cc2-49dd-46af-ad96-e3dce973bedf"). InnerVolumeSpecName "kube-api-access-9krzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:02:56 crc kubenswrapper[4930]: I1012 07:02:56.645986 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a733cc2-49dd-46af-ad96-e3dce973bedf-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 07:02:56 crc kubenswrapper[4930]: I1012 07:02:56.646279 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9krzh\" (UniqueName: \"kubernetes.io/projected/2a733cc2-49dd-46af-ad96-e3dce973bedf-kube-api-access-9krzh\") on node \"crc\" DevicePath \"\"" Oct 12 07:02:56 crc kubenswrapper[4930]: I1012 07:02:56.652710 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a733cc2-49dd-46af-ad96-e3dce973bedf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a733cc2-49dd-46af-ad96-e3dce973bedf" (UID: "2a733cc2-49dd-46af-ad96-e3dce973bedf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:02:56 crc kubenswrapper[4930]: I1012 07:02:56.748102 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a733cc2-49dd-46af-ad96-e3dce973bedf-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 07:02:56 crc kubenswrapper[4930]: I1012 07:02:56.751432 4930 generic.go:334] "Generic (PLEG): container finished" podID="2a733cc2-49dd-46af-ad96-e3dce973bedf" containerID="4070fb7c9f1ae3479d41bf94a35db62cdb963649096290ec9fbabb43da5ee06a" exitCode=0 Oct 12 07:02:56 crc kubenswrapper[4930]: I1012 07:02:56.751487 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qfw8k" event={"ID":"2a733cc2-49dd-46af-ad96-e3dce973bedf","Type":"ContainerDied","Data":"4070fb7c9f1ae3479d41bf94a35db62cdb963649096290ec9fbabb43da5ee06a"} Oct 12 07:02:56 crc kubenswrapper[4930]: I1012 07:02:56.751498 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qfw8k" Oct 12 07:02:56 crc kubenswrapper[4930]: I1012 07:02:56.751527 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qfw8k" event={"ID":"2a733cc2-49dd-46af-ad96-e3dce973bedf","Type":"ContainerDied","Data":"d0e4f0806d3b64bd5fc9ce09366ae8e50c6eeffa319d8c2462fcd0ae89997429"} Oct 12 07:02:56 crc kubenswrapper[4930]: I1012 07:02:56.751556 4930 scope.go:117] "RemoveContainer" containerID="4070fb7c9f1ae3479d41bf94a35db62cdb963649096290ec9fbabb43da5ee06a" Oct 12 07:02:56 crc kubenswrapper[4930]: I1012 07:02:56.792452 4930 scope.go:117] "RemoveContainer" containerID="39bf51d24891434484623e2dd6efd9d295c8d8e22c8dbc3248ac47928cac47df" Oct 12 07:02:56 crc kubenswrapper[4930]: I1012 07:02:56.805517 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qfw8k"] Oct 12 07:02:56 crc kubenswrapper[4930]: I1012 07:02:56.817817 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qfw8k"] Oct 12 07:02:56 crc kubenswrapper[4930]: I1012 07:02:56.820615 4930 scope.go:117] "RemoveContainer" containerID="d3ee61c8d492aae7fc66e389b370ad36e95bbd2bb54fdf83c6bc8421f1f8f76c" Oct 12 07:02:56 crc kubenswrapper[4930]: I1012 07:02:56.863523 4930 scope.go:117] "RemoveContainer" containerID="4070fb7c9f1ae3479d41bf94a35db62cdb963649096290ec9fbabb43da5ee06a" Oct 12 07:02:56 crc kubenswrapper[4930]: E1012 07:02:56.863991 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4070fb7c9f1ae3479d41bf94a35db62cdb963649096290ec9fbabb43da5ee06a\": container with ID starting with 4070fb7c9f1ae3479d41bf94a35db62cdb963649096290ec9fbabb43da5ee06a not found: ID does not exist" containerID="4070fb7c9f1ae3479d41bf94a35db62cdb963649096290ec9fbabb43da5ee06a" Oct 12 07:02:56 crc kubenswrapper[4930]: I1012 07:02:56.864052 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4070fb7c9f1ae3479d41bf94a35db62cdb963649096290ec9fbabb43da5ee06a"} err="failed to get container status \"4070fb7c9f1ae3479d41bf94a35db62cdb963649096290ec9fbabb43da5ee06a\": rpc error: code = NotFound desc = could not find container \"4070fb7c9f1ae3479d41bf94a35db62cdb963649096290ec9fbabb43da5ee06a\": container with ID starting with 4070fb7c9f1ae3479d41bf94a35db62cdb963649096290ec9fbabb43da5ee06a not found: ID does not exist" Oct 12 07:02:56 crc kubenswrapper[4930]: I1012 07:02:56.864082 4930 scope.go:117] "RemoveContainer" containerID="39bf51d24891434484623e2dd6efd9d295c8d8e22c8dbc3248ac47928cac47df" Oct 12 07:02:56 crc kubenswrapper[4930]: E1012 07:02:56.864443 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39bf51d24891434484623e2dd6efd9d295c8d8e22c8dbc3248ac47928cac47df\": container with ID starting with 39bf51d24891434484623e2dd6efd9d295c8d8e22c8dbc3248ac47928cac47df not found: ID does not exist" containerID="39bf51d24891434484623e2dd6efd9d295c8d8e22c8dbc3248ac47928cac47df" Oct 12 07:02:56 crc kubenswrapper[4930]: I1012 07:02:56.864477 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39bf51d24891434484623e2dd6efd9d295c8d8e22c8dbc3248ac47928cac47df"} err="failed to get container status \"39bf51d24891434484623e2dd6efd9d295c8d8e22c8dbc3248ac47928cac47df\": rpc error: code = NotFound desc = could not find container 
\"39bf51d24891434484623e2dd6efd9d295c8d8e22c8dbc3248ac47928cac47df\": container with ID starting with 39bf51d24891434484623e2dd6efd9d295c8d8e22c8dbc3248ac47928cac47df not found: ID does not exist" Oct 12 07:02:56 crc kubenswrapper[4930]: I1012 07:02:56.864498 4930 scope.go:117] "RemoveContainer" containerID="d3ee61c8d492aae7fc66e389b370ad36e95bbd2bb54fdf83c6bc8421f1f8f76c" Oct 12 07:02:56 crc kubenswrapper[4930]: E1012 07:02:56.864805 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3ee61c8d492aae7fc66e389b370ad36e95bbd2bb54fdf83c6bc8421f1f8f76c\": container with ID starting with d3ee61c8d492aae7fc66e389b370ad36e95bbd2bb54fdf83c6bc8421f1f8f76c not found: ID does not exist" containerID="d3ee61c8d492aae7fc66e389b370ad36e95bbd2bb54fdf83c6bc8421f1f8f76c" Oct 12 07:02:56 crc kubenswrapper[4930]: I1012 07:02:56.864837 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3ee61c8d492aae7fc66e389b370ad36e95bbd2bb54fdf83c6bc8421f1f8f76c"} err="failed to get container status \"d3ee61c8d492aae7fc66e389b370ad36e95bbd2bb54fdf83c6bc8421f1f8f76c\": rpc error: code = NotFound desc = could not find container \"d3ee61c8d492aae7fc66e389b370ad36e95bbd2bb54fdf83c6bc8421f1f8f76c\": container with ID starting with d3ee61c8d492aae7fc66e389b370ad36e95bbd2bb54fdf83c6bc8421f1f8f76c not found: ID does not exist" Oct 12 07:02:58 crc kubenswrapper[4930]: I1012 07:02:58.154235 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a733cc2-49dd-46af-ad96-e3dce973bedf" path="/var/lib/kubelet/pods/2a733cc2-49dd-46af-ad96-e3dce973bedf/volumes" Oct 12 07:03:03 crc kubenswrapper[4930]: I1012 07:03:03.669671 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:03:03 crc kubenswrapper[4930]: I1012 07:03:03.670110 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:03:33 crc kubenswrapper[4930]: I1012 07:03:33.669798 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:03:33 crc kubenswrapper[4930]: I1012 07:03:33.670345 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:04:03 crc kubenswrapper[4930]: I1012 07:04:03.669588 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:04:03 crc 
Oct 12 07:04:03 crc kubenswrapper[4930]: I1012 07:04:03.670216 4930 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf"
Oct 12 07:04:03 crc kubenswrapper[4930]: I1012 07:04:03.670940 4930 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9812bf7d8319d477322547486dc4d274e43fe7f8659ca19f2dccdabfb78ab503"} pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 12 07:04:03 crc kubenswrapper[4930]: I1012 07:04:03.670988 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" containerID="cri-o://9812bf7d8319d477322547486dc4d274e43fe7f8659ca19f2dccdabfb78ab503" gracePeriod=600
Oct 12 07:04:04 crc kubenswrapper[4930]: I1012 07:04:04.603747 4930 generic.go:334] "Generic (PLEG): container finished" podID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerID="9812bf7d8319d477322547486dc4d274e43fe7f8659ca19f2dccdabfb78ab503" exitCode=0
Oct 12 07:04:04 crc kubenswrapper[4930]: I1012 07:04:04.603844 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerDied","Data":"9812bf7d8319d477322547486dc4d274e43fe7f8659ca19f2dccdabfb78ab503"}
Oct 12 07:04:04 crc kubenswrapper[4930]: I1012 07:04:04.604121 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerStarted","Data":"3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413"}
Oct 12 07:04:04 crc kubenswrapper[4930]: I1012 07:04:04.604146 4930 scope.go:117] "RemoveContainer" containerID="da1b40f1545e0bfb69caf312917b67747a376eb7d694076fa7b4ee28958c55b0"
Oct 12 07:04:40 crc kubenswrapper[4930]: I1012 07:04:40.279338 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6sxts"]
Oct 12 07:04:40 crc kubenswrapper[4930]: E1012 07:04:40.280604 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a733cc2-49dd-46af-ad96-e3dce973bedf" containerName="extract-content"
Oct 12 07:04:40 crc kubenswrapper[4930]: I1012 07:04:40.280628 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a733cc2-49dd-46af-ad96-e3dce973bedf" containerName="extract-content"
Oct 12 07:04:40 crc kubenswrapper[4930]: E1012 07:04:40.280665 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a733cc2-49dd-46af-ad96-e3dce973bedf" containerName="extract-utilities"
Oct 12 07:04:40 crc kubenswrapper[4930]: I1012 07:04:40.280679 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a733cc2-49dd-46af-ad96-e3dce973bedf" containerName="extract-utilities"
Oct 12 07:04:40 crc kubenswrapper[4930]: E1012 07:04:40.280728 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a733cc2-49dd-46af-ad96-e3dce973bedf" containerName="registry-server"
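The liveness failures above at 07:03:03, 07:03:33 and 07:04:03 are thirty seconds apart, and the restart decision lands after the third, consistent with periodSeconds=30 and failureThreshold=3 (assumed; the pod spec itself is not in this log). A sketch of that probe-worker loop, with the same HTTP health check the probe output shows:

package main

import (
	"fmt"
	"net/http"
	"time"
)

// A probe worker in miniature: GET the health endpoint every period,
// count consecutive failures, and trigger a restart at the threshold.
func main() {
	client := &http.Client{Timeout: time.Second}
	failures := 0
	for {
		resp, err := client.Get("http://127.0.0.1:8798/health")
		healthy := err == nil && resp.StatusCode < 400
		if resp != nil {
			resp.Body.Close()
		}
		if healthy {
			failures = 0
		} else {
			failures++
			fmt.Println("Probe failed:", err)
		}
		if failures >= 3 { // assumed failureThreshold
			fmt.Println("Container failed liveness probe, will be restarted")
			return
		}
		time.Sleep(30 * time.Second) // assumed periodSeconds
	}
}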
Oct 12 07:04:40 crc kubenswrapper[4930]: I1012 07:04:40.280775 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a733cc2-49dd-46af-ad96-e3dce973bedf" containerName="registry-server"
Oct 12 07:04:40 crc kubenswrapper[4930]: I1012 07:04:40.281892 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a733cc2-49dd-46af-ad96-e3dce973bedf" containerName="registry-server"
Oct 12 07:04:40 crc kubenswrapper[4930]: I1012 07:04:40.284788 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6sxts"
Oct 12 07:04:40 crc kubenswrapper[4930]: I1012 07:04:40.321809 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6sxts"]
Oct 12 07:04:40 crc kubenswrapper[4930]: I1012 07:04:40.336105 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f2d2e1c-42c1-4751-a418-28dff0a3a079-utilities\") pod \"certified-operators-6sxts\" (UID: \"5f2d2e1c-42c1-4751-a418-28dff0a3a079\") " pod="openshift-marketplace/certified-operators-6sxts"
Oct 12 07:04:40 crc kubenswrapper[4930]: I1012 07:04:40.336306 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4w8r\" (UniqueName: \"kubernetes.io/projected/5f2d2e1c-42c1-4751-a418-28dff0a3a079-kube-api-access-m4w8r\") pod \"certified-operators-6sxts\" (UID: \"5f2d2e1c-42c1-4751-a418-28dff0a3a079\") " pod="openshift-marketplace/certified-operators-6sxts"
Oct 12 07:04:40 crc kubenswrapper[4930]: I1012 07:04:40.336473 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f2d2e1c-42c1-4751-a418-28dff0a3a079-catalog-content\") pod \"certified-operators-6sxts\" (UID: \"5f2d2e1c-42c1-4751-a418-28dff0a3a079\") " pod="openshift-marketplace/certified-operators-6sxts"
Oct 12 07:04:40 crc kubenswrapper[4930]: I1012 07:04:40.438120 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4w8r\" (UniqueName: \"kubernetes.io/projected/5f2d2e1c-42c1-4751-a418-28dff0a3a079-kube-api-access-m4w8r\") pod \"certified-operators-6sxts\" (UID: \"5f2d2e1c-42c1-4751-a418-28dff0a3a079\") " pod="openshift-marketplace/certified-operators-6sxts"
Oct 12 07:04:40 crc kubenswrapper[4930]: I1012 07:04:40.438215 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f2d2e1c-42c1-4751-a418-28dff0a3a079-catalog-content\") pod \"certified-operators-6sxts\" (UID: \"5f2d2e1c-42c1-4751-a418-28dff0a3a079\") " pod="openshift-marketplace/certified-operators-6sxts"
Oct 12 07:04:40 crc kubenswrapper[4930]: I1012 07:04:40.438357 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f2d2e1c-42c1-4751-a418-28dff0a3a079-utilities\") pod \"certified-operators-6sxts\" (UID: \"5f2d2e1c-42c1-4751-a418-28dff0a3a079\") " pod="openshift-marketplace/certified-operators-6sxts"
Oct 12 07:04:40 crc kubenswrapper[4930]: I1012 07:04:40.439142 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f2d2e1c-42c1-4751-a418-28dff0a3a079-utilities\") pod \"certified-operators-6sxts\" (UID: \"5f2d2e1c-42c1-4751-a418-28dff0a3a079\") " pod="openshift-marketplace/certified-operators-6sxts"
\"kubernetes.io/empty-dir/5f2d2e1c-42c1-4751-a418-28dff0a3a079-utilities\") pod \"certified-operators-6sxts\" (UID: \"5f2d2e1c-42c1-4751-a418-28dff0a3a079\") " pod="openshift-marketplace/certified-operators-6sxts" Oct 12 07:04:40 crc kubenswrapper[4930]: I1012 07:04:40.439190 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f2d2e1c-42c1-4751-a418-28dff0a3a079-catalog-content\") pod \"certified-operators-6sxts\" (UID: \"5f2d2e1c-42c1-4751-a418-28dff0a3a079\") " pod="openshift-marketplace/certified-operators-6sxts" Oct 12 07:04:40 crc kubenswrapper[4930]: I1012 07:04:40.470797 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4w8r\" (UniqueName: \"kubernetes.io/projected/5f2d2e1c-42c1-4751-a418-28dff0a3a079-kube-api-access-m4w8r\") pod \"certified-operators-6sxts\" (UID: \"5f2d2e1c-42c1-4751-a418-28dff0a3a079\") " pod="openshift-marketplace/certified-operators-6sxts" Oct 12 07:04:40 crc kubenswrapper[4930]: I1012 07:04:40.622511 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6sxts" Oct 12 07:04:41 crc kubenswrapper[4930]: I1012 07:04:41.821260 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6sxts"] Oct 12 07:04:41 crc kubenswrapper[4930]: W1012 07:04:41.831357 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f2d2e1c_42c1_4751_a418_28dff0a3a079.slice/crio-fb0f9cd9ab1f3963eac2d890f8002a6ab5d38a02a729f218c84e7402c78e3aed WatchSource:0}: Error finding container fb0f9cd9ab1f3963eac2d890f8002a6ab5d38a02a729f218c84e7402c78e3aed: Status 404 returned error can't find the container with id fb0f9cd9ab1f3963eac2d890f8002a6ab5d38a02a729f218c84e7402c78e3aed Oct 12 07:04:42 crc kubenswrapper[4930]: I1012 07:04:42.056650 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6sxts" event={"ID":"5f2d2e1c-42c1-4751-a418-28dff0a3a079","Type":"ContainerStarted","Data":"fb0f9cd9ab1f3963eac2d890f8002a6ab5d38a02a729f218c84e7402c78e3aed"} Oct 12 07:04:43 crc kubenswrapper[4930]: I1012 07:04:43.073236 4930 generic.go:334] "Generic (PLEG): container finished" podID="5f2d2e1c-42c1-4751-a418-28dff0a3a079" containerID="60546c64efdb2eba02a1c0cd0efb817d6ee988b0436a085416dcdd159b3f93a0" exitCode=0 Oct 12 07:04:43 crc kubenswrapper[4930]: I1012 07:04:43.073313 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6sxts" event={"ID":"5f2d2e1c-42c1-4751-a418-28dff0a3a079","Type":"ContainerDied","Data":"60546c64efdb2eba02a1c0cd0efb817d6ee988b0436a085416dcdd159b3f93a0"} Oct 12 07:04:44 crc kubenswrapper[4930]: I1012 07:04:44.089941 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6sxts" event={"ID":"5f2d2e1c-42c1-4751-a418-28dff0a3a079","Type":"ContainerStarted","Data":"0a7d9b2e50b518ff38ae109caec6a039e6693a347fe0ad795ed7d3934e2793b5"} Oct 12 07:04:45 crc kubenswrapper[4930]: I1012 07:04:45.122836 4930 generic.go:334] "Generic (PLEG): container finished" podID="5f2d2e1c-42c1-4751-a418-28dff0a3a079" containerID="0a7d9b2e50b518ff38ae109caec6a039e6693a347fe0ad795ed7d3934e2793b5" exitCode=0 Oct 12 07:04:45 crc kubenswrapper[4930]: I1012 07:04:45.122983 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-6sxts" event={"ID":"5f2d2e1c-42c1-4751-a418-28dff0a3a079","Type":"ContainerDied","Data":"0a7d9b2e50b518ff38ae109caec6a039e6693a347fe0ad795ed7d3934e2793b5"} Oct 12 07:04:46 crc kubenswrapper[4930]: I1012 07:04:46.150850 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6sxts" event={"ID":"5f2d2e1c-42c1-4751-a418-28dff0a3a079","Type":"ContainerStarted","Data":"b4c34edb855c0931e6d48d81267ff09133ae8beffe51e9d642aef315b544112f"} Oct 12 07:04:50 crc kubenswrapper[4930]: I1012 07:04:50.622810 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6sxts" Oct 12 07:04:50 crc kubenswrapper[4930]: I1012 07:04:50.623471 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6sxts" Oct 12 07:04:50 crc kubenswrapper[4930]: I1012 07:04:50.730283 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6sxts" Oct 12 07:04:50 crc kubenswrapper[4930]: I1012 07:04:50.768181 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6sxts" podStartSLOduration=8.090417346 podStartE2EDuration="10.768162563s" podCreationTimestamp="2025-10-12 07:04:40 +0000 UTC" firstStartedPulling="2025-10-12 07:04:43.077983858 +0000 UTC m=+5015.620085663" lastFinishedPulling="2025-10-12 07:04:45.755729085 +0000 UTC m=+5018.297830880" observedRunningTime="2025-10-12 07:04:46.170391894 +0000 UTC m=+5018.712493689" watchObservedRunningTime="2025-10-12 07:04:50.768162563 +0000 UTC m=+5023.310264338" Oct 12 07:04:51 crc kubenswrapper[4930]: I1012 07:04:51.276139 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6sxts" Oct 12 07:04:51 crc kubenswrapper[4930]: I1012 07:04:51.320577 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6sxts"] Oct 12 07:04:53 crc kubenswrapper[4930]: I1012 07:04:53.250631 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6sxts" podUID="5f2d2e1c-42c1-4751-a418-28dff0a3a079" containerName="registry-server" containerID="cri-o://b4c34edb855c0931e6d48d81267ff09133ae8beffe51e9d642aef315b544112f" gracePeriod=2 Oct 12 07:04:53 crc kubenswrapper[4930]: I1012 07:04:53.749689 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6sxts" Oct 12 07:04:53 crc kubenswrapper[4930]: I1012 07:04:53.869795 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f2d2e1c-42c1-4751-a418-28dff0a3a079-catalog-content\") pod \"5f2d2e1c-42c1-4751-a418-28dff0a3a079\" (UID: \"5f2d2e1c-42c1-4751-a418-28dff0a3a079\") " Oct 12 07:04:53 crc kubenswrapper[4930]: I1012 07:04:53.874006 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f2d2e1c-42c1-4751-a418-28dff0a3a079-utilities\") pod \"5f2d2e1c-42c1-4751-a418-28dff0a3a079\" (UID: \"5f2d2e1c-42c1-4751-a418-28dff0a3a079\") " Oct 12 07:04:53 crc kubenswrapper[4930]: I1012 07:04:53.874082 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4w8r\" (UniqueName: \"kubernetes.io/projected/5f2d2e1c-42c1-4751-a418-28dff0a3a079-kube-api-access-m4w8r\") pod \"5f2d2e1c-42c1-4751-a418-28dff0a3a079\" (UID: \"5f2d2e1c-42c1-4751-a418-28dff0a3a079\") " Oct 12 07:04:53 crc kubenswrapper[4930]: I1012 07:04:53.874895 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f2d2e1c-42c1-4751-a418-28dff0a3a079-utilities" (OuterVolumeSpecName: "utilities") pod "5f2d2e1c-42c1-4751-a418-28dff0a3a079" (UID: "5f2d2e1c-42c1-4751-a418-28dff0a3a079"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:04:53 crc kubenswrapper[4930]: I1012 07:04:53.875161 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f2d2e1c-42c1-4751-a418-28dff0a3a079-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 07:04:53 crc kubenswrapper[4930]: I1012 07:04:53.883769 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f2d2e1c-42c1-4751-a418-28dff0a3a079-kube-api-access-m4w8r" (OuterVolumeSpecName: "kube-api-access-m4w8r") pod "5f2d2e1c-42c1-4751-a418-28dff0a3a079" (UID: "5f2d2e1c-42c1-4751-a418-28dff0a3a079"). InnerVolumeSpecName "kube-api-access-m4w8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:04:53 crc kubenswrapper[4930]: I1012 07:04:53.946524 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f2d2e1c-42c1-4751-a418-28dff0a3a079-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f2d2e1c-42c1-4751-a418-28dff0a3a079" (UID: "5f2d2e1c-42c1-4751-a418-28dff0a3a079"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:04:53 crc kubenswrapper[4930]: I1012 07:04:53.979464 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f2d2e1c-42c1-4751-a418-28dff0a3a079-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 07:04:53 crc kubenswrapper[4930]: I1012 07:04:53.979526 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4w8r\" (UniqueName: \"kubernetes.io/projected/5f2d2e1c-42c1-4751-a418-28dff0a3a079-kube-api-access-m4w8r\") on node \"crc\" DevicePath \"\"" Oct 12 07:04:54 crc kubenswrapper[4930]: I1012 07:04:54.263529 4930 generic.go:334] "Generic (PLEG): container finished" podID="5f2d2e1c-42c1-4751-a418-28dff0a3a079" containerID="b4c34edb855c0931e6d48d81267ff09133ae8beffe51e9d642aef315b544112f" exitCode=0 Oct 12 07:04:54 crc kubenswrapper[4930]: I1012 07:04:54.263567 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6sxts" event={"ID":"5f2d2e1c-42c1-4751-a418-28dff0a3a079","Type":"ContainerDied","Data":"b4c34edb855c0931e6d48d81267ff09133ae8beffe51e9d642aef315b544112f"} Oct 12 07:04:54 crc kubenswrapper[4930]: I1012 07:04:54.263594 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6sxts" event={"ID":"5f2d2e1c-42c1-4751-a418-28dff0a3a079","Type":"ContainerDied","Data":"fb0f9cd9ab1f3963eac2d890f8002a6ab5d38a02a729f218c84e7402c78e3aed"} Oct 12 07:04:54 crc kubenswrapper[4930]: I1012 07:04:54.263612 4930 scope.go:117] "RemoveContainer" containerID="b4c34edb855c0931e6d48d81267ff09133ae8beffe51e9d642aef315b544112f" Oct 12 07:04:54 crc kubenswrapper[4930]: I1012 07:04:54.263673 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6sxts" Oct 12 07:04:54 crc kubenswrapper[4930]: I1012 07:04:54.288980 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6sxts"] Oct 12 07:04:54 crc kubenswrapper[4930]: I1012 07:04:54.297968 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6sxts"] Oct 12 07:04:54 crc kubenswrapper[4930]: I1012 07:04:54.298055 4930 scope.go:117] "RemoveContainer" containerID="0a7d9b2e50b518ff38ae109caec6a039e6693a347fe0ad795ed7d3934e2793b5" Oct 12 07:04:54 crc kubenswrapper[4930]: I1012 07:04:54.319622 4930 scope.go:117] "RemoveContainer" containerID="60546c64efdb2eba02a1c0cd0efb817d6ee988b0436a085416dcdd159b3f93a0" Oct 12 07:04:54 crc kubenswrapper[4930]: I1012 07:04:54.397051 4930 scope.go:117] "RemoveContainer" containerID="b4c34edb855c0931e6d48d81267ff09133ae8beffe51e9d642aef315b544112f" Oct 12 07:04:54 crc kubenswrapper[4930]: E1012 07:04:54.397546 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4c34edb855c0931e6d48d81267ff09133ae8beffe51e9d642aef315b544112f\": container with ID starting with b4c34edb855c0931e6d48d81267ff09133ae8beffe51e9d642aef315b544112f not found: ID does not exist" containerID="b4c34edb855c0931e6d48d81267ff09133ae8beffe51e9d642aef315b544112f" Oct 12 07:04:54 crc kubenswrapper[4930]: I1012 07:04:54.397589 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4c34edb855c0931e6d48d81267ff09133ae8beffe51e9d642aef315b544112f"} err="failed to get container status \"b4c34edb855c0931e6d48d81267ff09133ae8beffe51e9d642aef315b544112f\": rpc error: code = NotFound desc = could not find container \"b4c34edb855c0931e6d48d81267ff09133ae8beffe51e9d642aef315b544112f\": container with ID starting with b4c34edb855c0931e6d48d81267ff09133ae8beffe51e9d642aef315b544112f not found: ID does not exist" Oct 12 07:04:54 crc kubenswrapper[4930]: I1012 07:04:54.397621 4930 scope.go:117] "RemoveContainer" containerID="0a7d9b2e50b518ff38ae109caec6a039e6693a347fe0ad795ed7d3934e2793b5" Oct 12 07:04:54 crc kubenswrapper[4930]: E1012 07:04:54.398213 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a7d9b2e50b518ff38ae109caec6a039e6693a347fe0ad795ed7d3934e2793b5\": container with ID starting with 0a7d9b2e50b518ff38ae109caec6a039e6693a347fe0ad795ed7d3934e2793b5 not found: ID does not exist" containerID="0a7d9b2e50b518ff38ae109caec6a039e6693a347fe0ad795ed7d3934e2793b5" Oct 12 07:04:54 crc kubenswrapper[4930]: I1012 07:04:54.398250 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a7d9b2e50b518ff38ae109caec6a039e6693a347fe0ad795ed7d3934e2793b5"} err="failed to get container status \"0a7d9b2e50b518ff38ae109caec6a039e6693a347fe0ad795ed7d3934e2793b5\": rpc error: code = NotFound desc = could not find container \"0a7d9b2e50b518ff38ae109caec6a039e6693a347fe0ad795ed7d3934e2793b5\": container with ID starting with 0a7d9b2e50b518ff38ae109caec6a039e6693a347fe0ad795ed7d3934e2793b5 not found: ID does not exist" Oct 12 07:04:54 crc kubenswrapper[4930]: I1012 07:04:54.398279 4930 scope.go:117] "RemoveContainer" containerID="60546c64efdb2eba02a1c0cd0efb817d6ee988b0436a085416dcdd159b3f93a0" Oct 12 07:04:54 crc kubenswrapper[4930]: E1012 07:04:54.398718 4930 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"60546c64efdb2eba02a1c0cd0efb817d6ee988b0436a085416dcdd159b3f93a0\": container with ID starting with 60546c64efdb2eba02a1c0cd0efb817d6ee988b0436a085416dcdd159b3f93a0 not found: ID does not exist" containerID="60546c64efdb2eba02a1c0cd0efb817d6ee988b0436a085416dcdd159b3f93a0" Oct 12 07:04:54 crc kubenswrapper[4930]: I1012 07:04:54.398760 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60546c64efdb2eba02a1c0cd0efb817d6ee988b0436a085416dcdd159b3f93a0"} err="failed to get container status \"60546c64efdb2eba02a1c0cd0efb817d6ee988b0436a085416dcdd159b3f93a0\": rpc error: code = NotFound desc = could not find container \"60546c64efdb2eba02a1c0cd0efb817d6ee988b0436a085416dcdd159b3f93a0\": container with ID starting with 60546c64efdb2eba02a1c0cd0efb817d6ee988b0436a085416dcdd159b3f93a0 not found: ID does not exist" Oct 12 07:04:56 crc kubenswrapper[4930]: I1012 07:04:56.150642 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f2d2e1c-42c1-4751-a418-28dff0a3a079" path="/var/lib/kubelet/pods/5f2d2e1c-42c1-4751-a418-28dff0a3a079/volumes" Oct 12 07:06:33 crc kubenswrapper[4930]: I1012 07:06:33.669872 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:06:33 crc kubenswrapper[4930]: I1012 07:06:33.670640 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:07:03 crc kubenswrapper[4930]: I1012 07:07:03.669784 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:07:03 crc kubenswrapper[4930]: I1012 07:07:03.670774 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:07:33 crc kubenswrapper[4930]: I1012 07:07:33.669610 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:07:33 crc kubenswrapper[4930]: I1012 07:07:33.670432 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:07:33 crc kubenswrapper[4930]: I1012 07:07:33.670517 4930 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 07:07:33 crc kubenswrapper[4930]: I1012 07:07:33.671662 4930 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413"} pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 07:07:33 crc kubenswrapper[4930]: I1012 07:07:33.671787 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" containerID="cri-o://3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" gracePeriod=600 Oct 12 07:07:33 crc kubenswrapper[4930]: E1012 07:07:33.803825 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:07:34 crc kubenswrapper[4930]: I1012 07:07:34.309657 4930 generic.go:334] "Generic (PLEG): container finished" podID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" exitCode=0 Oct 12 07:07:34 crc kubenswrapper[4930]: I1012 07:07:34.309703 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerDied","Data":"3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413"} Oct 12 07:07:34 crc kubenswrapper[4930]: I1012 07:07:34.309764 4930 scope.go:117] "RemoveContainer" containerID="9812bf7d8319d477322547486dc4d274e43fe7f8659ca19f2dccdabfb78ab503" Oct 12 07:07:34 crc kubenswrapper[4930]: I1012 07:07:34.310611 4930 scope.go:117] "RemoveContainer" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:07:34 crc kubenswrapper[4930]: E1012 07:07:34.311190 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:07:47 crc kubenswrapper[4930]: I1012 07:07:47.136639 4930 scope.go:117] "RemoveContainer" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:07:47 crc kubenswrapper[4930]: E1012 07:07:47.137314 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:08:00 crc 
Oct 12 07:08:00 crc kubenswrapper[4930]: I1012 07:08:00.136281 4930 scope.go:117] "RemoveContainer" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:08:00 crc kubenswrapper[4930]: E1012 07:08:00.137049 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:08:13 crc kubenswrapper[4930]: I1012 07:08:13.135841 4930 scope.go:117] "RemoveContainer" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:08:13 crc kubenswrapper[4930]: E1012 07:08:13.136574 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:08:28 crc kubenswrapper[4930]: I1012 07:08:28.143929 4930 scope.go:117] "RemoveContainer" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:08:28 crc kubenswrapper[4930]: E1012 07:08:28.144922 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:08:40 crc kubenswrapper[4930]: I1012 07:08:40.136796 4930 scope.go:117] "RemoveContainer" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:08:40 crc kubenswrapper[4930]: E1012 07:08:40.137888 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:08:51 crc kubenswrapper[4930]: I1012 07:08:51.135855 4930 scope.go:117] "RemoveContainer" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:08:51 crc kubenswrapper[4930]: E1012 07:08:51.136667 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:09:03 crc kubenswrapper[4930]: I1012 07:09:03.136448 4930 scope.go:117] "RemoveContainer" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:09:03 crc 
kubenswrapper[4930]: E1012 07:09:03.137598 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:09:16 crc kubenswrapper[4930]: I1012 07:09:16.135436 4930 scope.go:117] "RemoveContainer" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:09:16 crc kubenswrapper[4930]: E1012 07:09:16.136630 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:09:29 crc kubenswrapper[4930]: I1012 07:09:29.135626 4930 scope.go:117] "RemoveContainer" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:09:29 crc kubenswrapper[4930]: E1012 07:09:29.136886 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:09:42 crc kubenswrapper[4930]: I1012 07:09:42.136067 4930 scope.go:117] "RemoveContainer" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:09:42 crc kubenswrapper[4930]: E1012 07:09:42.136842 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:09:56 crc kubenswrapper[4930]: I1012 07:09:56.136017 4930 scope.go:117] "RemoveContainer" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:09:56 crc kubenswrapper[4930]: E1012 07:09:56.136855 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:10:10 crc kubenswrapper[4930]: I1012 07:10:10.135937 4930 scope.go:117] "RemoveContainer" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:10:10 crc kubenswrapper[4930]: E1012 07:10:10.136903 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:10:19 crc kubenswrapper[4930]: I1012 07:10:19.384636 4930 generic.go:334] "Generic (PLEG): container finished" podID="acf9e824-abb4-4b1c-9925-c7794fafaad4" containerID="8a7b273e05d7dc912375aecb7500a13d3f402987b770443fe657374bc03c24b5" exitCode=0 Oct 12 07:10:19 crc kubenswrapper[4930]: I1012 07:10:19.384991 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"acf9e824-abb4-4b1c-9925-c7794fafaad4","Type":"ContainerDied","Data":"8a7b273e05d7dc912375aecb7500a13d3f402987b770443fe657374bc03c24b5"} Oct 12 07:10:20 crc kubenswrapper[4930]: I1012 07:10:20.929027 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.064929 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/acf9e824-abb4-4b1c-9925-c7794fafaad4-test-operator-ephemeral-workdir\") pod \"acf9e824-abb4-4b1c-9925-c7794fafaad4\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.065035 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acf9e824-abb4-4b1c-9925-c7794fafaad4-ssh-key\") pod \"acf9e824-abb4-4b1c-9925-c7794fafaad4\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.065140 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/acf9e824-abb4-4b1c-9925-c7794fafaad4-test-operator-ephemeral-temporary\") pod \"acf9e824-abb4-4b1c-9925-c7794fafaad4\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.065195 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/acf9e824-abb4-4b1c-9925-c7794fafaad4-openstack-config\") pod \"acf9e824-abb4-4b1c-9925-c7794fafaad4\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.065226 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm8hk\" (UniqueName: \"kubernetes.io/projected/acf9e824-abb4-4b1c-9925-c7794fafaad4-kube-api-access-pm8hk\") pod \"acf9e824-abb4-4b1c-9925-c7794fafaad4\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.065274 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/acf9e824-abb4-4b1c-9925-c7794fafaad4-ca-certs\") pod \"acf9e824-abb4-4b1c-9925-c7794fafaad4\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.065354 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/acf9e824-abb4-4b1c-9925-c7794fafaad4-openstack-config-secret\") pod \"acf9e824-abb4-4b1c-9925-c7794fafaad4\" (UID: 
\"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.065395 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/acf9e824-abb4-4b1c-9925-c7794fafaad4-config-data\") pod \"acf9e824-abb4-4b1c-9925-c7794fafaad4\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.065467 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"acf9e824-abb4-4b1c-9925-c7794fafaad4\" (UID: \"acf9e824-abb4-4b1c-9925-c7794fafaad4\") " Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.065722 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acf9e824-abb4-4b1c-9925-c7794fafaad4-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "acf9e824-abb4-4b1c-9925-c7794fafaad4" (UID: "acf9e824-abb4-4b1c-9925-c7794fafaad4"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.066338 4930 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/acf9e824-abb4-4b1c-9925-c7794fafaad4-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.066564 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acf9e824-abb4-4b1c-9925-c7794fafaad4-config-data" (OuterVolumeSpecName: "config-data") pod "acf9e824-abb4-4b1c-9925-c7794fafaad4" (UID: "acf9e824-abb4-4b1c-9925-c7794fafaad4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.072366 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acf9e824-abb4-4b1c-9925-c7794fafaad4-kube-api-access-pm8hk" (OuterVolumeSpecName: "kube-api-access-pm8hk") pod "acf9e824-abb4-4b1c-9925-c7794fafaad4" (UID: "acf9e824-abb4-4b1c-9925-c7794fafaad4"). InnerVolumeSpecName "kube-api-access-pm8hk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.072602 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acf9e824-abb4-4b1c-9925-c7794fafaad4-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "acf9e824-abb4-4b1c-9925-c7794fafaad4" (UID: "acf9e824-abb4-4b1c-9925-c7794fafaad4"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.084879 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "test-operator-logs") pod "acf9e824-abb4-4b1c-9925-c7794fafaad4" (UID: "acf9e824-abb4-4b1c-9925-c7794fafaad4"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.096594 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acf9e824-abb4-4b1c-9925-c7794fafaad4-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "acf9e824-abb4-4b1c-9925-c7794fafaad4" (UID: "acf9e824-abb4-4b1c-9925-c7794fafaad4"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.110987 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acf9e824-abb4-4b1c-9925-c7794fafaad4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "acf9e824-abb4-4b1c-9925-c7794fafaad4" (UID: "acf9e824-abb4-4b1c-9925-c7794fafaad4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.114038 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acf9e824-abb4-4b1c-9925-c7794fafaad4-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "acf9e824-abb4-4b1c-9925-c7794fafaad4" (UID: "acf9e824-abb4-4b1c-9925-c7794fafaad4"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.119328 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acf9e824-abb4-4b1c-9925-c7794fafaad4-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "acf9e824-abb4-4b1c-9925-c7794fafaad4" (UID: "acf9e824-abb4-4b1c-9925-c7794fafaad4"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.168204 4930 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/acf9e824-abb4-4b1c-9925-c7794fafaad4-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.168238 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm8hk\" (UniqueName: \"kubernetes.io/projected/acf9e824-abb4-4b1c-9925-c7794fafaad4-kube-api-access-pm8hk\") on node \"crc\" DevicePath \"\"" Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.168249 4930 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/acf9e824-abb4-4b1c-9925-c7794fafaad4-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.168258 4930 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/acf9e824-abb4-4b1c-9925-c7794fafaad4-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.168267 4930 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/acf9e824-abb4-4b1c-9925-c7794fafaad4-config-data\") on node \"crc\" DevicePath \"\"" Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.168296 4930 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.168307 4930 reconciler_common.go:293] "Volume detached for volume 
\"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/acf9e824-abb4-4b1c-9925-c7794fafaad4-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.168317 4930 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acf9e824-abb4-4b1c-9925-c7794fafaad4-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.193748 4930 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.270683 4930 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.413732 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"acf9e824-abb4-4b1c-9925-c7794fafaad4","Type":"ContainerDied","Data":"173b73af5735ed9a4f5b24ea6fe93e888f5a17249fcff85fb84dec639310a3da"} Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.413833 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="173b73af5735ed9a4f5b24ea6fe93e888f5a17249fcff85fb84dec639310a3da" Oct 12 07:10:21 crc kubenswrapper[4930]: I1012 07:10:21.414007 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 12 07:10:25 crc kubenswrapper[4930]: I1012 07:10:25.136146 4930 scope.go:117] "RemoveContainer" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:10:25 crc kubenswrapper[4930]: E1012 07:10:25.137324 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:10:27 crc kubenswrapper[4930]: I1012 07:10:27.135557 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 12 07:10:27 crc kubenswrapper[4930]: E1012 07:10:27.136600 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f2d2e1c-42c1-4751-a418-28dff0a3a079" containerName="extract-content" Oct 12 07:10:27 crc kubenswrapper[4930]: I1012 07:10:27.136623 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f2d2e1c-42c1-4751-a418-28dff0a3a079" containerName="extract-content" Oct 12 07:10:27 crc kubenswrapper[4930]: E1012 07:10:27.136660 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acf9e824-abb4-4b1c-9925-c7794fafaad4" containerName="tempest-tests-tempest-tests-runner" Oct 12 07:10:27 crc kubenswrapper[4930]: I1012 07:10:27.136673 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="acf9e824-abb4-4b1c-9925-c7794fafaad4" containerName="tempest-tests-tempest-tests-runner" Oct 12 07:10:27 crc kubenswrapper[4930]: E1012 07:10:27.136715 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f2d2e1c-42c1-4751-a418-28dff0a3a079" containerName="extract-utilities" Oct 12 07:10:27 crc kubenswrapper[4930]: I1012 
07:10:27.136731 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f2d2e1c-42c1-4751-a418-28dff0a3a079" containerName="extract-utilities" Oct 12 07:10:27 crc kubenswrapper[4930]: E1012 07:10:27.136774 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f2d2e1c-42c1-4751-a418-28dff0a3a079" containerName="registry-server" Oct 12 07:10:27 crc kubenswrapper[4930]: I1012 07:10:27.136788 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f2d2e1c-42c1-4751-a418-28dff0a3a079" containerName="registry-server" Oct 12 07:10:27 crc kubenswrapper[4930]: I1012 07:10:27.137177 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f2d2e1c-42c1-4751-a418-28dff0a3a079" containerName="registry-server" Oct 12 07:10:27 crc kubenswrapper[4930]: I1012 07:10:27.137213 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="acf9e824-abb4-4b1c-9925-c7794fafaad4" containerName="tempest-tests-tempest-tests-runner" Oct 12 07:10:27 crc kubenswrapper[4930]: I1012 07:10:27.138571 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 07:10:27 crc kubenswrapper[4930]: I1012 07:10:27.141725 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-mlg5l" Oct 12 07:10:27 crc kubenswrapper[4930]: I1012 07:10:27.151317 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 12 07:10:27 crc kubenswrapper[4930]: I1012 07:10:27.229330 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r49x8\" (UniqueName: \"kubernetes.io/projected/da6596b0-b6eb-4af2-97dd-7bda46883284-kube-api-access-r49x8\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"da6596b0-b6eb-4af2-97dd-7bda46883284\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 07:10:27 crc kubenswrapper[4930]: I1012 07:10:27.229660 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"da6596b0-b6eb-4af2-97dd-7bda46883284\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 07:10:27 crc kubenswrapper[4930]: I1012 07:10:27.332147 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r49x8\" (UniqueName: \"kubernetes.io/projected/da6596b0-b6eb-4af2-97dd-7bda46883284-kube-api-access-r49x8\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"da6596b0-b6eb-4af2-97dd-7bda46883284\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 07:10:27 crc kubenswrapper[4930]: I1012 07:10:27.332346 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"da6596b0-b6eb-4af2-97dd-7bda46883284\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 07:10:27 crc kubenswrapper[4930]: I1012 07:10:27.332991 4930 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"da6596b0-b6eb-4af2-97dd-7bda46883284\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 07:10:27 crc kubenswrapper[4930]: I1012 07:10:27.954293 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r49x8\" (UniqueName: \"kubernetes.io/projected/da6596b0-b6eb-4af2-97dd-7bda46883284-kube-api-access-r49x8\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"da6596b0-b6eb-4af2-97dd-7bda46883284\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 07:10:28 crc kubenswrapper[4930]: I1012 07:10:28.039848 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"da6596b0-b6eb-4af2-97dd-7bda46883284\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 07:10:28 crc kubenswrapper[4930]: I1012 07:10:28.078221 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 12 07:10:28 crc kubenswrapper[4930]: I1012 07:10:28.574503 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 12 07:10:28 crc kubenswrapper[4930]: W1012 07:10:28.589108 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda6596b0_b6eb_4af2_97dd_7bda46883284.slice/crio-d66fdc727d148b1ff3123aca0d0ed48337236de6aaaa707aad0e6fda2a8c3f74 WatchSource:0}: Error finding container d66fdc727d148b1ff3123aca0d0ed48337236de6aaaa707aad0e6fda2a8c3f74: Status 404 returned error can't find the container with id d66fdc727d148b1ff3123aca0d0ed48337236de6aaaa707aad0e6fda2a8c3f74 Oct 12 07:10:28 crc kubenswrapper[4930]: I1012 07:10:28.592933 4930 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 07:10:29 crc kubenswrapper[4930]: I1012 07:10:29.517155 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"da6596b0-b6eb-4af2-97dd-7bda46883284","Type":"ContainerStarted","Data":"d66fdc727d148b1ff3123aca0d0ed48337236de6aaaa707aad0e6fda2a8c3f74"} Oct 12 07:10:30 crc kubenswrapper[4930]: I1012 07:10:30.537516 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"da6596b0-b6eb-4af2-97dd-7bda46883284","Type":"ContainerStarted","Data":"d365cbdaab4a6f75b2e7946a43edb5cf0fc1d94e84b1315916ecc5274d6243d9"} Oct 12 07:10:30 crc kubenswrapper[4930]: I1012 07:10:30.566279 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.695115023 podStartE2EDuration="3.566251272s" podCreationTimestamp="2025-10-12 07:10:27 +0000 UTC" firstStartedPulling="2025-10-12 07:10:28.592502596 +0000 UTC m=+5361.134604401" lastFinishedPulling="2025-10-12 07:10:29.463638845 +0000 UTC m=+5362.005740650" observedRunningTime="2025-10-12 07:10:30.555030468 +0000 UTC m=+5363.097132283" watchObservedRunningTime="2025-10-12 07:10:30.566251272 +0000 UTC m=+5363.108353067" Oct 12 07:10:37 crc kubenswrapper[4930]: I1012 
Oct 12 07:10:37 crc kubenswrapper[4930]: I1012 07:10:37.136663 4930 scope.go:117] "RemoveContainer" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:10:37 crc kubenswrapper[4930]: E1012 07:10:37.137804 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:10:47 crc kubenswrapper[4930]: I1012 07:10:47.308585 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-smj9w/must-gather-q65p2"] Oct 12 07:10:47 crc kubenswrapper[4930]: I1012 07:10:47.313375 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-smj9w/must-gather-q65p2" Oct 12 07:10:47 crc kubenswrapper[4930]: I1012 07:10:47.342986 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-smj9w"/"openshift-service-ca.crt" Oct 12 07:10:47 crc kubenswrapper[4930]: I1012 07:10:47.343244 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-smj9w"/"kube-root-ca.crt" Oct 12 07:10:47 crc kubenswrapper[4930]: I1012 07:10:47.348778 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-smj9w/must-gather-q65p2"] Oct 12 07:10:47 crc kubenswrapper[4930]: I1012 07:10:47.367019 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skqpt\" (UniqueName: \"kubernetes.io/projected/2243a5d2-1751-4335-8e6f-1c500a51b226-kube-api-access-skqpt\") pod \"must-gather-q65p2\" (UID: \"2243a5d2-1751-4335-8e6f-1c500a51b226\") " pod="openshift-must-gather-smj9w/must-gather-q65p2" Oct 12 07:10:47 crc kubenswrapper[4930]: I1012 07:10:47.367101 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2243a5d2-1751-4335-8e6f-1c500a51b226-must-gather-output\") pod \"must-gather-q65p2\" (UID: \"2243a5d2-1751-4335-8e6f-1c500a51b226\") " pod="openshift-must-gather-smj9w/must-gather-q65p2" Oct 12 07:10:47 crc kubenswrapper[4930]: I1012 07:10:47.468793 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skqpt\" (UniqueName: \"kubernetes.io/projected/2243a5d2-1751-4335-8e6f-1c500a51b226-kube-api-access-skqpt\") pod \"must-gather-q65p2\" (UID: \"2243a5d2-1751-4335-8e6f-1c500a51b226\") " pod="openshift-must-gather-smj9w/must-gather-q65p2" Oct 12 07:10:47 crc kubenswrapper[4930]: I1012 07:10:47.469313 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2243a5d2-1751-4335-8e6f-1c500a51b226-must-gather-output\") pod \"must-gather-q65p2\" (UID: \"2243a5d2-1751-4335-8e6f-1c500a51b226\") " pod="openshift-must-gather-smj9w/must-gather-q65p2" Oct 12 07:10:47 crc kubenswrapper[4930]: I1012 07:10:47.469678 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2243a5d2-1751-4335-8e6f-1c500a51b226-must-gather-output\") pod \"must-gather-q65p2\" (UID: \"2243a5d2-1751-4335-8e6f-1c500a51b226\") " pod="openshift-must-gather-smj9w/must-gather-q65p2" Oct 12 07:10:47 crc 
kubenswrapper[4930]: I1012 07:10:47.486701 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skqpt\" (UniqueName: \"kubernetes.io/projected/2243a5d2-1751-4335-8e6f-1c500a51b226-kube-api-access-skqpt\") pod \"must-gather-q65p2\" (UID: \"2243a5d2-1751-4335-8e6f-1c500a51b226\") " pod="openshift-must-gather-smj9w/must-gather-q65p2" Oct 12 07:10:47 crc kubenswrapper[4930]: I1012 07:10:47.683867 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-smj9w/must-gather-q65p2" Oct 12 07:10:48 crc kubenswrapper[4930]: I1012 07:10:48.162521 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-smj9w/must-gather-q65p2"] Oct 12 07:10:48 crc kubenswrapper[4930]: W1012 07:10:48.162884 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2243a5d2_1751_4335_8e6f_1c500a51b226.slice/crio-28ae9b69e76f6bc2e182d919da6d9c1a53d408594ce82ac4220b66e7b563c1e8 WatchSource:0}: Error finding container 28ae9b69e76f6bc2e182d919da6d9c1a53d408594ce82ac4220b66e7b563c1e8: Status 404 returned error can't find the container with id 28ae9b69e76f6bc2e182d919da6d9c1a53d408594ce82ac4220b66e7b563c1e8 Oct 12 07:10:48 crc kubenswrapper[4930]: I1012 07:10:48.771950 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-smj9w/must-gather-q65p2" event={"ID":"2243a5d2-1751-4335-8e6f-1c500a51b226","Type":"ContainerStarted","Data":"28ae9b69e76f6bc2e182d919da6d9c1a53d408594ce82ac4220b66e7b563c1e8"} Oct 12 07:10:48 crc kubenswrapper[4930]: I1012 07:10:48.854606 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fgbx4"]
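The W1012 "Failed to process watch event ... Status 404" entries here and earlier are another startup race: cAdvisor notices the new crio-<id> cgroup before CRI-O has finished registering the container, so the lookup 404s; the ContainerStarted PLEG event that follows shows the container came up normally. The cgroup path itself encodes the pod: with the systemd cgroup driver, "-" separates slice hierarchy levels, so the pod UID's dashes are escaped as underscores in the unit name. A hypothetical log-analysis helper (not part of any Kubernetes library) that recovers the UID and runtime ID from such a path:

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// Matches .../kubepods-<qos>-pod<uid_with_underscores>.slice/crio-<64-hex id>
var cgroupRe = regexp.MustCompile(`pod([0-9a-f_]{36})\.slice/crio-([0-9a-f]{64})`)

func parseCgroup(path string) (uid, containerID string, ok bool) {
	m := cgroupRe.FindStringSubmatch(path)
	if m == nil {
		return "", "", false
	}
	// Restore the dashes that systemd unit naming replaced with underscores.
	return strings.ReplaceAll(m[1], "_", "-"), m[2], true
}

func main() {
	uid, id, _ := parseCgroup("/kubepods.slice/kubepods-besteffort.slice/" +
		"kubepods-besteffort-pod2243a5d2_1751_4335_8e6f_1c500a51b226.slice/" +
		"crio-28ae9b69e76f6bc2e182d919da6d9c1a53d408594ce82ac4220b66e7b563c1e8")
	fmt.Println(uid) // 2243a5d2-1751-4335-8e6f-1c500a51b226 (must-gather-q65p2's pod UID)
	fmt.Println(id)  // the sandbox ID from the warning above
}
```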
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgbx4" Oct 12 07:10:48 crc kubenswrapper[4930]: I1012 07:10:48.876491 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgbx4"] Oct 12 07:10:48 crc kubenswrapper[4930]: I1012 07:10:48.998357 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcngd\" (UniqueName: \"kubernetes.io/projected/fc093a59-ff99-4c7c-97da-adc89ec15b5b-kube-api-access-bcngd\") pod \"redhat-marketplace-fgbx4\" (UID: \"fc093a59-ff99-4c7c-97da-adc89ec15b5b\") " pod="openshift-marketplace/redhat-marketplace-fgbx4" Oct 12 07:10:48 crc kubenswrapper[4930]: I1012 07:10:48.998433 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc093a59-ff99-4c7c-97da-adc89ec15b5b-utilities\") pod \"redhat-marketplace-fgbx4\" (UID: \"fc093a59-ff99-4c7c-97da-adc89ec15b5b\") " pod="openshift-marketplace/redhat-marketplace-fgbx4" Oct 12 07:10:48 crc kubenswrapper[4930]: I1012 07:10:48.998449 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc093a59-ff99-4c7c-97da-adc89ec15b5b-catalog-content\") pod \"redhat-marketplace-fgbx4\" (UID: \"fc093a59-ff99-4c7c-97da-adc89ec15b5b\") " pod="openshift-marketplace/redhat-marketplace-fgbx4" Oct 12 07:10:49 crc kubenswrapper[4930]: I1012 07:10:49.100022 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcngd\" (UniqueName: \"kubernetes.io/projected/fc093a59-ff99-4c7c-97da-adc89ec15b5b-kube-api-access-bcngd\") pod \"redhat-marketplace-fgbx4\" (UID: \"fc093a59-ff99-4c7c-97da-adc89ec15b5b\") " pod="openshift-marketplace/redhat-marketplace-fgbx4" Oct 12 07:10:49 crc kubenswrapper[4930]: I1012 07:10:49.100112 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc093a59-ff99-4c7c-97da-adc89ec15b5b-utilities\") pod \"redhat-marketplace-fgbx4\" (UID: \"fc093a59-ff99-4c7c-97da-adc89ec15b5b\") " pod="openshift-marketplace/redhat-marketplace-fgbx4" Oct 12 07:10:49 crc kubenswrapper[4930]: I1012 07:10:49.100128 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc093a59-ff99-4c7c-97da-adc89ec15b5b-catalog-content\") pod \"redhat-marketplace-fgbx4\" (UID: \"fc093a59-ff99-4c7c-97da-adc89ec15b5b\") " pod="openshift-marketplace/redhat-marketplace-fgbx4" Oct 12 07:10:49 crc kubenswrapper[4930]: I1012 07:10:49.100594 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc093a59-ff99-4c7c-97da-adc89ec15b5b-catalog-content\") pod \"redhat-marketplace-fgbx4\" (UID: \"fc093a59-ff99-4c7c-97da-adc89ec15b5b\") " pod="openshift-marketplace/redhat-marketplace-fgbx4" Oct 12 07:10:49 crc kubenswrapper[4930]: I1012 07:10:49.101053 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc093a59-ff99-4c7c-97da-adc89ec15b5b-utilities\") pod \"redhat-marketplace-fgbx4\" (UID: \"fc093a59-ff99-4c7c-97da-adc89ec15b5b\") " pod="openshift-marketplace/redhat-marketplace-fgbx4" Oct 12 07:10:49 crc kubenswrapper[4930]: I1012 07:10:49.123664 4930 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-bcngd\" (UniqueName: \"kubernetes.io/projected/fc093a59-ff99-4c7c-97da-adc89ec15b5b-kube-api-access-bcngd\") pod \"redhat-marketplace-fgbx4\" (UID: \"fc093a59-ff99-4c7c-97da-adc89ec15b5b\") " pod="openshift-marketplace/redhat-marketplace-fgbx4" Oct 12 07:10:49 crc kubenswrapper[4930]: I1012 07:10:49.194984 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgbx4" Oct 12 07:10:49 crc kubenswrapper[4930]: I1012 07:10:49.631349 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgbx4"] Oct 12 07:10:49 crc kubenswrapper[4930]: W1012 07:10:49.637422 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc093a59_ff99_4c7c_97da_adc89ec15b5b.slice/crio-1a69cdfc7255e765f3442eeaf0f5d6486cc93f086393796c75b2cccd91954451 WatchSource:0}: Error finding container 1a69cdfc7255e765f3442eeaf0f5d6486cc93f086393796c75b2cccd91954451: Status 404 returned error can't find the container with id 1a69cdfc7255e765f3442eeaf0f5d6486cc93f086393796c75b2cccd91954451 Oct 12 07:10:49 crc kubenswrapper[4930]: I1012 07:10:49.786887 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgbx4" event={"ID":"fc093a59-ff99-4c7c-97da-adc89ec15b5b","Type":"ContainerStarted","Data":"1a69cdfc7255e765f3442eeaf0f5d6486cc93f086393796c75b2cccd91954451"} Oct 12 07:10:50 crc kubenswrapper[4930]: I1012 07:10:50.135895 4930 scope.go:117] "RemoveContainer" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:10:50 crc kubenswrapper[4930]: E1012 07:10:50.136360 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:10:50 crc kubenswrapper[4930]: I1012 07:10:50.802494 4930 generic.go:334] "Generic (PLEG): container finished" podID="fc093a59-ff99-4c7c-97da-adc89ec15b5b" containerID="55452ad4f033ae02e4e266e5099a8c8b656dfa3f18d679ea39b04ab1a39de8fa" exitCode=0 Oct 12 07:10:50 crc kubenswrapper[4930]: I1012 07:10:50.802564 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgbx4" event={"ID":"fc093a59-ff99-4c7c-97da-adc89ec15b5b","Type":"ContainerDied","Data":"55452ad4f033ae02e4e266e5099a8c8b656dfa3f18d679ea39b04ab1a39de8fa"} Oct 12 07:10:55 crc kubenswrapper[4930]: I1012 07:10:55.868389 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgbx4" event={"ID":"fc093a59-ff99-4c7c-97da-adc89ec15b5b","Type":"ContainerStarted","Data":"0b03f0c5c97d36301d8ff100e75dcb843f9c21f9f46ab2dcff0948c8025ea194"} Oct 12 07:10:55 crc kubenswrapper[4930]: I1012 07:10:55.871225 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-smj9w/must-gather-q65p2" event={"ID":"2243a5d2-1751-4335-8e6f-1c500a51b226","Type":"ContainerStarted","Data":"5d180da7adfa332e35940aef536aaea1ef82d5116cbb40d52849797afea65b14"} Oct 12 07:10:55 crc kubenswrapper[4930]: I1012 07:10:55.871275 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-smj9w/must-gather-q65p2" event={"ID":"2243a5d2-1751-4335-8e6f-1c500a51b226","Type":"ContainerStarted","Data":"fcff62ab0ae0934eb7c77fb05a98256fc523f92c8c58b5a6e4fe29af5d7c32e4"} Oct 12 07:10:55 crc kubenswrapper[4930]: I1012 07:10:55.926380 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-smj9w/must-gather-q65p2" podStartSLOduration=1.875676774 podStartE2EDuration="8.926355944s" podCreationTimestamp="2025-10-12 07:10:47 +0000 UTC" firstStartedPulling="2025-10-12 07:10:48.166189878 +0000 UTC m=+5380.708291643" lastFinishedPulling="2025-10-12 07:10:55.216869038 +0000 UTC m=+5387.758970813" observedRunningTime="2025-10-12 07:10:55.912653399 +0000 UTC m=+5388.454755174" watchObservedRunningTime="2025-10-12 07:10:55.926355944 +0000 UTC m=+5388.468457709" Oct 12 07:10:56 crc kubenswrapper[4930]: I1012 07:10:56.884905 4930 generic.go:334] "Generic (PLEG): container finished" podID="fc093a59-ff99-4c7c-97da-adc89ec15b5b" containerID="0b03f0c5c97d36301d8ff100e75dcb843f9c21f9f46ab2dcff0948c8025ea194" exitCode=0 Oct 12 07:10:56 crc kubenswrapper[4930]: I1012 07:10:56.888348 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgbx4" event={"ID":"fc093a59-ff99-4c7c-97da-adc89ec15b5b","Type":"ContainerDied","Data":"0b03f0c5c97d36301d8ff100e75dcb843f9c21f9f46ab2dcff0948c8025ea194"} Oct 12 07:10:58 crc kubenswrapper[4930]: E1012 07:10:58.983267 4930 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.111:36338->38.102.83.111:46517: read tcp 38.102.83.111:36338->38.102.83.111:46517: read: connection reset by peer Oct 12 07:10:59 crc kubenswrapper[4930]: I1012 07:10:59.790715 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-smj9w/crc-debug-jfjxw"] Oct 12 07:10:59 crc kubenswrapper[4930]: I1012 07:10:59.791936 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-smj9w/crc-debug-jfjxw" Oct 12 07:10:59 crc kubenswrapper[4930]: I1012 07:10:59.794232 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-smj9w"/"default-dockercfg-pxzgv" Oct 12 07:10:59 crc kubenswrapper[4930]: I1012 07:10:59.860729 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dbcf7b20-8a2c-4fe2-aac5-9fe839c25590-host\") pod \"crc-debug-jfjxw\" (UID: \"dbcf7b20-8a2c-4fe2-aac5-9fe839c25590\") " pod="openshift-must-gather-smj9w/crc-debug-jfjxw" Oct 12 07:10:59 crc kubenswrapper[4930]: I1012 07:10:59.861037 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66xkm\" (UniqueName: \"kubernetes.io/projected/dbcf7b20-8a2c-4fe2-aac5-9fe839c25590-kube-api-access-66xkm\") pod \"crc-debug-jfjxw\" (UID: \"dbcf7b20-8a2c-4fe2-aac5-9fe839c25590\") " pod="openshift-must-gather-smj9w/crc-debug-jfjxw" Oct 12 07:10:59 crc kubenswrapper[4930]: I1012 07:10:59.927075 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgbx4" event={"ID":"fc093a59-ff99-4c7c-97da-adc89ec15b5b","Type":"ContainerStarted","Data":"f31912211de59ef040d31947c9b53106a4792b028ffa02e84ff754af9b0cc9a7"} Oct 12 07:10:59 crc kubenswrapper[4930]: I1012 07:10:59.951806 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fgbx4" podStartSLOduration=7.415382041 podStartE2EDuration="11.95178843s" podCreationTimestamp="2025-10-12 07:10:48 +0000 UTC" firstStartedPulling="2025-10-12 07:10:52.767176946 +0000 UTC m=+5385.309278761" lastFinishedPulling="2025-10-12 07:10:57.303583375 +0000 UTC m=+5389.845685150" observedRunningTime="2025-10-12 07:10:59.942511974 +0000 UTC m=+5392.484613739" watchObservedRunningTime="2025-10-12 07:10:59.95178843 +0000 UTC m=+5392.493890195" Oct 12 07:10:59 crc kubenswrapper[4930]: I1012 07:10:59.962947 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66xkm\" (UniqueName: \"kubernetes.io/projected/dbcf7b20-8a2c-4fe2-aac5-9fe839c25590-kube-api-access-66xkm\") pod \"crc-debug-jfjxw\" (UID: \"dbcf7b20-8a2c-4fe2-aac5-9fe839c25590\") " pod="openshift-must-gather-smj9w/crc-debug-jfjxw" Oct 12 07:10:59 crc kubenswrapper[4930]: I1012 07:10:59.963087 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dbcf7b20-8a2c-4fe2-aac5-9fe839c25590-host\") pod \"crc-debug-jfjxw\" (UID: \"dbcf7b20-8a2c-4fe2-aac5-9fe839c25590\") " pod="openshift-must-gather-smj9w/crc-debug-jfjxw" Oct 12 07:10:59 crc kubenswrapper[4930]: I1012 07:10:59.963183 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dbcf7b20-8a2c-4fe2-aac5-9fe839c25590-host\") pod \"crc-debug-jfjxw\" (UID: \"dbcf7b20-8a2c-4fe2-aac5-9fe839c25590\") " pod="openshift-must-gather-smj9w/crc-debug-jfjxw" Oct 12 07:11:00 crc kubenswrapper[4930]: I1012 07:11:00.001499 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66xkm\" (UniqueName: \"kubernetes.io/projected/dbcf7b20-8a2c-4fe2-aac5-9fe839c25590-kube-api-access-66xkm\") pod \"crc-debug-jfjxw\" (UID: \"dbcf7b20-8a2c-4fe2-aac5-9fe839c25590\") " pod="openshift-must-gather-smj9w/crc-debug-jfjxw" Oct 12 07:11:00 crc kubenswrapper[4930]: I1012 
Oct 12 07:11:00 crc kubenswrapper[4930]: I1012 07:11:00.106713 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-smj9w/crc-debug-jfjxw" Oct 12 07:11:00 crc kubenswrapper[4930]: W1012 07:11:00.152932 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbcf7b20_8a2c_4fe2_aac5_9fe839c25590.slice/crio-618e2f3696704b7ba493e26223912f615f3fe630e2706ea0b442ff513199e6ba WatchSource:0}: Error finding container 618e2f3696704b7ba493e26223912f615f3fe630e2706ea0b442ff513199e6ba: Status 404 returned error can't find the container with id 618e2f3696704b7ba493e26223912f615f3fe630e2706ea0b442ff513199e6ba Oct 12 07:11:00 crc kubenswrapper[4930]: I1012 07:11:00.942694 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-smj9w/crc-debug-jfjxw" event={"ID":"dbcf7b20-8a2c-4fe2-aac5-9fe839c25590","Type":"ContainerStarted","Data":"618e2f3696704b7ba493e26223912f615f3fe630e2706ea0b442ff513199e6ba"} Oct 12 07:11:04 crc kubenswrapper[4930]: I1012 07:11:04.136098 4930 scope.go:117] "RemoveContainer" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:11:04 crc kubenswrapper[4930]: E1012 07:11:04.136675 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:11:09 crc kubenswrapper[4930]: I1012 07:11:09.195403 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fgbx4" Oct 12 07:11:09 crc kubenswrapper[4930]: I1012 07:11:09.195917 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fgbx4" Oct 12 07:11:09 crc kubenswrapper[4930]: I1012 07:11:09.250118 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fgbx4" Oct 12 07:11:10 crc kubenswrapper[4930]: I1012 07:11:10.111595 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fgbx4" Oct 12 07:11:10 crc kubenswrapper[4930]: I1012 07:11:10.166456 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgbx4"] Oct 12 07:11:11 crc kubenswrapper[4930]: I1012 07:11:11.056972 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-smj9w/crc-debug-jfjxw" event={"ID":"dbcf7b20-8a2c-4fe2-aac5-9fe839c25590","Type":"ContainerStarted","Data":"ebd3b52568c4b7e9e0540dfbadd08d73f1feca1de1abd375c31cab54e4afea32"} Oct 12 07:11:11 crc kubenswrapper[4930]: I1012 07:11:11.076404 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-smj9w/crc-debug-jfjxw" podStartSLOduration=1.759041026 podStartE2EDuration="12.076387053s" podCreationTimestamp="2025-10-12 07:10:59 +0000 UTC" firstStartedPulling="2025-10-12 07:11:00.154747012 +0000 UTC m=+5392.696848777" lastFinishedPulling="2025-10-12 07:11:10.472093039 +0000 UTC m=+5403.014194804" observedRunningTime="2025-10-12 07:11:11.075340427 +0000 UTC m=+5403.617442192" watchObservedRunningTime="2025-10-12 07:11:11.076387053 +0000 UTC 
m=+5403.618488838" Oct 12 07:11:12 crc kubenswrapper[4930]: I1012 07:11:12.066180 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fgbx4" podUID="fc093a59-ff99-4c7c-97da-adc89ec15b5b" containerName="registry-server" containerID="cri-o://f31912211de59ef040d31947c9b53106a4792b028ffa02e84ff754af9b0cc9a7" gracePeriod=2 Oct 12 07:11:12 crc kubenswrapper[4930]: I1012 07:11:12.537286 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgbx4" Oct 12 07:11:12 crc kubenswrapper[4930]: I1012 07:11:12.632830 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc093a59-ff99-4c7c-97da-adc89ec15b5b-utilities\") pod \"fc093a59-ff99-4c7c-97da-adc89ec15b5b\" (UID: \"fc093a59-ff99-4c7c-97da-adc89ec15b5b\") " Oct 12 07:11:12 crc kubenswrapper[4930]: I1012 07:11:12.632962 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcngd\" (UniqueName: \"kubernetes.io/projected/fc093a59-ff99-4c7c-97da-adc89ec15b5b-kube-api-access-bcngd\") pod \"fc093a59-ff99-4c7c-97da-adc89ec15b5b\" (UID: \"fc093a59-ff99-4c7c-97da-adc89ec15b5b\") " Oct 12 07:11:12 crc kubenswrapper[4930]: I1012 07:11:12.633064 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc093a59-ff99-4c7c-97da-adc89ec15b5b-catalog-content\") pod \"fc093a59-ff99-4c7c-97da-adc89ec15b5b\" (UID: \"fc093a59-ff99-4c7c-97da-adc89ec15b5b\") " Oct 12 07:11:12 crc kubenswrapper[4930]: I1012 07:11:12.633688 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc093a59-ff99-4c7c-97da-adc89ec15b5b-utilities" (OuterVolumeSpecName: "utilities") pod "fc093a59-ff99-4c7c-97da-adc89ec15b5b" (UID: "fc093a59-ff99-4c7c-97da-adc89ec15b5b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:11:12 crc kubenswrapper[4930]: I1012 07:11:12.634051 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc093a59-ff99-4c7c-97da-adc89ec15b5b-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 07:11:12 crc kubenswrapper[4930]: I1012 07:11:12.638876 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc093a59-ff99-4c7c-97da-adc89ec15b5b-kube-api-access-bcngd" (OuterVolumeSpecName: "kube-api-access-bcngd") pod "fc093a59-ff99-4c7c-97da-adc89ec15b5b" (UID: "fc093a59-ff99-4c7c-97da-adc89ec15b5b"). InnerVolumeSpecName "kube-api-access-bcngd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:11:12 crc kubenswrapper[4930]: I1012 07:11:12.651105 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc093a59-ff99-4c7c-97da-adc89ec15b5b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc093a59-ff99-4c7c-97da-adc89ec15b5b" (UID: "fc093a59-ff99-4c7c-97da-adc89ec15b5b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:11:12 crc kubenswrapper[4930]: I1012 07:11:12.736041 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcngd\" (UniqueName: \"kubernetes.io/projected/fc093a59-ff99-4c7c-97da-adc89ec15b5b-kube-api-access-bcngd\") on node \"crc\" DevicePath \"\"" Oct 12 07:11:12 crc kubenswrapper[4930]: I1012 07:11:12.736077 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc093a59-ff99-4c7c-97da-adc89ec15b5b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 07:11:13 crc kubenswrapper[4930]: I1012 07:11:13.077543 4930 generic.go:334] "Generic (PLEG): container finished" podID="fc093a59-ff99-4c7c-97da-adc89ec15b5b" containerID="f31912211de59ef040d31947c9b53106a4792b028ffa02e84ff754af9b0cc9a7" exitCode=0 Oct 12 07:11:13 crc kubenswrapper[4930]: I1012 07:11:13.077698 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgbx4" event={"ID":"fc093a59-ff99-4c7c-97da-adc89ec15b5b","Type":"ContainerDied","Data":"f31912211de59ef040d31947c9b53106a4792b028ffa02e84ff754af9b0cc9a7"} Oct 12 07:11:13 crc kubenswrapper[4930]: I1012 07:11:13.077821 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgbx4" event={"ID":"fc093a59-ff99-4c7c-97da-adc89ec15b5b","Type":"ContainerDied","Data":"1a69cdfc7255e765f3442eeaf0f5d6486cc93f086393796c75b2cccd91954451"} Oct 12 07:11:13 crc kubenswrapper[4930]: I1012 07:11:13.077836 4930 scope.go:117] "RemoveContainer" containerID="f31912211de59ef040d31947c9b53106a4792b028ffa02e84ff754af9b0cc9a7" Oct 12 07:11:13 crc kubenswrapper[4930]: I1012 07:11:13.077775 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgbx4" Oct 12 07:11:13 crc kubenswrapper[4930]: I1012 07:11:13.115054 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgbx4"] Oct 12 07:11:13 crc kubenswrapper[4930]: I1012 07:11:13.124206 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgbx4"] Oct 12 07:11:13 crc kubenswrapper[4930]: I1012 07:11:13.141849 4930 scope.go:117] "RemoveContainer" containerID="0b03f0c5c97d36301d8ff100e75dcb843f9c21f9f46ab2dcff0948c8025ea194" Oct 12 07:11:13 crc kubenswrapper[4930]: I1012 07:11:13.168310 4930 scope.go:117] "RemoveContainer" containerID="55452ad4f033ae02e4e266e5099a8c8b656dfa3f18d679ea39b04ab1a39de8fa" Oct 12 07:11:13 crc kubenswrapper[4930]: I1012 07:11:13.227828 4930 scope.go:117] "RemoveContainer" containerID="f31912211de59ef040d31947c9b53106a4792b028ffa02e84ff754af9b0cc9a7" Oct 12 07:11:13 crc kubenswrapper[4930]: E1012 07:11:13.228236 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f31912211de59ef040d31947c9b53106a4792b028ffa02e84ff754af9b0cc9a7\": container with ID starting with f31912211de59ef040d31947c9b53106a4792b028ffa02e84ff754af9b0cc9a7 not found: ID does not exist" containerID="f31912211de59ef040d31947c9b53106a4792b028ffa02e84ff754af9b0cc9a7" Oct 12 07:11:13 crc kubenswrapper[4930]: I1012 07:11:13.228260 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f31912211de59ef040d31947c9b53106a4792b028ffa02e84ff754af9b0cc9a7"} err="failed to get container status \"f31912211de59ef040d31947c9b53106a4792b028ffa02e84ff754af9b0cc9a7\": rpc error: code = NotFound desc = could not find container \"f31912211de59ef040d31947c9b53106a4792b028ffa02e84ff754af9b0cc9a7\": container with ID starting with f31912211de59ef040d31947c9b53106a4792b028ffa02e84ff754af9b0cc9a7 not found: ID does not exist" Oct 12 07:11:13 crc kubenswrapper[4930]: I1012 07:11:13.228278 4930 scope.go:117] "RemoveContainer" containerID="0b03f0c5c97d36301d8ff100e75dcb843f9c21f9f46ab2dcff0948c8025ea194" Oct 12 07:11:13 crc kubenswrapper[4930]: E1012 07:11:13.228582 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b03f0c5c97d36301d8ff100e75dcb843f9c21f9f46ab2dcff0948c8025ea194\": container with ID starting with 0b03f0c5c97d36301d8ff100e75dcb843f9c21f9f46ab2dcff0948c8025ea194 not found: ID does not exist" containerID="0b03f0c5c97d36301d8ff100e75dcb843f9c21f9f46ab2dcff0948c8025ea194" Oct 12 07:11:13 crc kubenswrapper[4930]: I1012 07:11:13.228691 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b03f0c5c97d36301d8ff100e75dcb843f9c21f9f46ab2dcff0948c8025ea194"} err="failed to get container status \"0b03f0c5c97d36301d8ff100e75dcb843f9c21f9f46ab2dcff0948c8025ea194\": rpc error: code = NotFound desc = could not find container \"0b03f0c5c97d36301d8ff100e75dcb843f9c21f9f46ab2dcff0948c8025ea194\": container with ID starting with 0b03f0c5c97d36301d8ff100e75dcb843f9c21f9f46ab2dcff0948c8025ea194 not found: ID does not exist" Oct 12 07:11:13 crc kubenswrapper[4930]: I1012 07:11:13.228815 4930 scope.go:117] "RemoveContainer" containerID="55452ad4f033ae02e4e266e5099a8c8b656dfa3f18d679ea39b04ab1a39de8fa" Oct 12 07:11:13 crc kubenswrapper[4930]: E1012 07:11:13.229118 4930 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"55452ad4f033ae02e4e266e5099a8c8b656dfa3f18d679ea39b04ab1a39de8fa\": container with ID starting with 55452ad4f033ae02e4e266e5099a8c8b656dfa3f18d679ea39b04ab1a39de8fa not found: ID does not exist" containerID="55452ad4f033ae02e4e266e5099a8c8b656dfa3f18d679ea39b04ab1a39de8fa" Oct 12 07:11:13 crc kubenswrapper[4930]: I1012 07:11:13.229135 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55452ad4f033ae02e4e266e5099a8c8b656dfa3f18d679ea39b04ab1a39de8fa"} err="failed to get container status \"55452ad4f033ae02e4e266e5099a8c8b656dfa3f18d679ea39b04ab1a39de8fa\": rpc error: code = NotFound desc = could not find container \"55452ad4f033ae02e4e266e5099a8c8b656dfa3f18d679ea39b04ab1a39de8fa\": container with ID starting with 55452ad4f033ae02e4e266e5099a8c8b656dfa3f18d679ea39b04ab1a39de8fa not found: ID does not exist" Oct 12 07:11:14 crc kubenswrapper[4930]: I1012 07:11:14.146846 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc093a59-ff99-4c7c-97da-adc89ec15b5b" path="/var/lib/kubelet/pods/fc093a59-ff99-4c7c-97da-adc89ec15b5b/volumes" Oct 12 07:11:16 crc kubenswrapper[4930]: I1012 07:11:16.135996 4930 scope.go:117] "RemoveContainer" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:11:16 crc kubenswrapper[4930]: E1012 07:11:16.136727 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:11:28 crc kubenswrapper[4930]: I1012 07:11:28.137862 4930 scope.go:117] "RemoveContainer" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:11:28 crc kubenswrapper[4930]: E1012 07:11:28.139087 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:11:43 crc kubenswrapper[4930]: I1012 07:11:43.137774 4930 scope.go:117] "RemoveContainer" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:11:43 crc kubenswrapper[4930]: E1012 07:11:43.138725 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:11:54 crc kubenswrapper[4930]: I1012 07:11:54.500748 4930 generic.go:334] "Generic (PLEG): container finished" podID="dbcf7b20-8a2c-4fe2-aac5-9fe839c25590" containerID="ebd3b52568c4b7e9e0540dfbadd08d73f1feca1de1abd375c31cab54e4afea32" exitCode=0 Oct 12 07:11:54 crc kubenswrapper[4930]: I1012 07:11:54.500775 4930 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-must-gather-smj9w/crc-debug-jfjxw" event={"ID":"dbcf7b20-8a2c-4fe2-aac5-9fe839c25590","Type":"ContainerDied","Data":"ebd3b52568c4b7e9e0540dfbadd08d73f1feca1de1abd375c31cab54e4afea32"} Oct 12 07:11:55 crc kubenswrapper[4930]: I1012 07:11:55.614199 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-smj9w/crc-debug-jfjxw" Oct 12 07:11:55 crc kubenswrapper[4930]: I1012 07:11:55.656581 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-smj9w/crc-debug-jfjxw"] Oct 12 07:11:55 crc kubenswrapper[4930]: I1012 07:11:55.665447 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-smj9w/crc-debug-jfjxw"] Oct 12 07:11:55 crc kubenswrapper[4930]: I1012 07:11:55.707798 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66xkm\" (UniqueName: \"kubernetes.io/projected/dbcf7b20-8a2c-4fe2-aac5-9fe839c25590-kube-api-access-66xkm\") pod \"dbcf7b20-8a2c-4fe2-aac5-9fe839c25590\" (UID: \"dbcf7b20-8a2c-4fe2-aac5-9fe839c25590\") " Oct 12 07:11:55 crc kubenswrapper[4930]: I1012 07:11:55.707907 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dbcf7b20-8a2c-4fe2-aac5-9fe839c25590-host\") pod \"dbcf7b20-8a2c-4fe2-aac5-9fe839c25590\" (UID: \"dbcf7b20-8a2c-4fe2-aac5-9fe839c25590\") " Oct 12 07:11:55 crc kubenswrapper[4930]: I1012 07:11:55.707976 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dbcf7b20-8a2c-4fe2-aac5-9fe839c25590-host" (OuterVolumeSpecName: "host") pod "dbcf7b20-8a2c-4fe2-aac5-9fe839c25590" (UID: "dbcf7b20-8a2c-4fe2-aac5-9fe839c25590"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:11:55 crc kubenswrapper[4930]: I1012 07:11:55.708680 4930 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dbcf7b20-8a2c-4fe2-aac5-9fe839c25590-host\") on node \"crc\" DevicePath \"\"" Oct 12 07:11:55 crc kubenswrapper[4930]: I1012 07:11:55.713983 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbcf7b20-8a2c-4fe2-aac5-9fe839c25590-kube-api-access-66xkm" (OuterVolumeSpecName: "kube-api-access-66xkm") pod "dbcf7b20-8a2c-4fe2-aac5-9fe839c25590" (UID: "dbcf7b20-8a2c-4fe2-aac5-9fe839c25590"). InnerVolumeSpecName "kube-api-access-66xkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:11:55 crc kubenswrapper[4930]: I1012 07:11:55.810653 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66xkm\" (UniqueName: \"kubernetes.io/projected/dbcf7b20-8a2c-4fe2-aac5-9fe839c25590-kube-api-access-66xkm\") on node \"crc\" DevicePath \"\"" Oct 12 07:11:56 crc kubenswrapper[4930]: I1012 07:11:56.157356 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbcf7b20-8a2c-4fe2-aac5-9fe839c25590" path="/var/lib/kubelet/pods/dbcf7b20-8a2c-4fe2-aac5-9fe839c25590/volumes" Oct 12 07:11:56 crc kubenswrapper[4930]: I1012 07:11:56.528980 4930 scope.go:117] "RemoveContainer" containerID="ebd3b52568c4b7e9e0540dfbadd08d73f1feca1de1abd375c31cab54e4afea32" Oct 12 07:11:56 crc kubenswrapper[4930]: I1012 07:11:56.529064 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-smj9w/crc-debug-jfjxw" Oct 12 07:11:56 crc kubenswrapper[4930]: I1012 07:11:56.834512 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-smj9w/crc-debug-vvcbp"] Oct 12 07:11:56 crc kubenswrapper[4930]: E1012 07:11:56.835765 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc093a59-ff99-4c7c-97da-adc89ec15b5b" containerName="extract-content" Oct 12 07:11:56 crc kubenswrapper[4930]: I1012 07:11:56.835857 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc093a59-ff99-4c7c-97da-adc89ec15b5b" containerName="extract-content" Oct 12 07:11:56 crc kubenswrapper[4930]: E1012 07:11:56.835929 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbcf7b20-8a2c-4fe2-aac5-9fe839c25590" containerName="container-00" Oct 12 07:11:56 crc kubenswrapper[4930]: I1012 07:11:56.835982 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbcf7b20-8a2c-4fe2-aac5-9fe839c25590" containerName="container-00" Oct 12 07:11:56 crc kubenswrapper[4930]: E1012 07:11:56.836044 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc093a59-ff99-4c7c-97da-adc89ec15b5b" containerName="extract-utilities" Oct 12 07:11:56 crc kubenswrapper[4930]: I1012 07:11:56.836100 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc093a59-ff99-4c7c-97da-adc89ec15b5b" containerName="extract-utilities" Oct 12 07:11:56 crc kubenswrapper[4930]: E1012 07:11:56.836166 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc093a59-ff99-4c7c-97da-adc89ec15b5b" containerName="registry-server" Oct 12 07:11:56 crc kubenswrapper[4930]: I1012 07:11:56.836219 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc093a59-ff99-4c7c-97da-adc89ec15b5b" containerName="registry-server" Oct 12 07:11:56 crc kubenswrapper[4930]: I1012 07:11:56.836442 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc093a59-ff99-4c7c-97da-adc89ec15b5b" containerName="registry-server" Oct 12 07:11:56 crc kubenswrapper[4930]: I1012 07:11:56.836507 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbcf7b20-8a2c-4fe2-aac5-9fe839c25590" containerName="container-00" Oct 12 07:11:56 crc kubenswrapper[4930]: I1012 07:11:56.837187 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-smj9w/crc-debug-vvcbp" Oct 12 07:11:56 crc kubenswrapper[4930]: I1012 07:11:56.840010 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-smj9w"/"default-dockercfg-pxzgv" Oct 12 07:11:56 crc kubenswrapper[4930]: I1012 07:11:56.935943 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zb8l\" (UniqueName: \"kubernetes.io/projected/2177f6a1-7af1-4f48-91d9-900603a134cb-kube-api-access-5zb8l\") pod \"crc-debug-vvcbp\" (UID: \"2177f6a1-7af1-4f48-91d9-900603a134cb\") " pod="openshift-must-gather-smj9w/crc-debug-vvcbp" Oct 12 07:11:56 crc kubenswrapper[4930]: I1012 07:11:56.936273 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2177f6a1-7af1-4f48-91d9-900603a134cb-host\") pod \"crc-debug-vvcbp\" (UID: \"2177f6a1-7af1-4f48-91d9-900603a134cb\") " pod="openshift-must-gather-smj9w/crc-debug-vvcbp" Oct 12 07:11:57 crc kubenswrapper[4930]: I1012 07:11:57.038304 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2177f6a1-7af1-4f48-91d9-900603a134cb-host\") pod \"crc-debug-vvcbp\" (UID: \"2177f6a1-7af1-4f48-91d9-900603a134cb\") " pod="openshift-must-gather-smj9w/crc-debug-vvcbp" Oct 12 07:11:57 crc kubenswrapper[4930]: I1012 07:11:57.038409 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zb8l\" (UniqueName: \"kubernetes.io/projected/2177f6a1-7af1-4f48-91d9-900603a134cb-kube-api-access-5zb8l\") pod \"crc-debug-vvcbp\" (UID: \"2177f6a1-7af1-4f48-91d9-900603a134cb\") " pod="openshift-must-gather-smj9w/crc-debug-vvcbp" Oct 12 07:11:57 crc kubenswrapper[4930]: I1012 07:11:57.038489 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2177f6a1-7af1-4f48-91d9-900603a134cb-host\") pod \"crc-debug-vvcbp\" (UID: \"2177f6a1-7af1-4f48-91d9-900603a134cb\") " pod="openshift-must-gather-smj9w/crc-debug-vvcbp" Oct 12 07:11:57 crc kubenswrapper[4930]: I1012 07:11:57.060975 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zb8l\" (UniqueName: \"kubernetes.io/projected/2177f6a1-7af1-4f48-91d9-900603a134cb-kube-api-access-5zb8l\") pod \"crc-debug-vvcbp\" (UID: \"2177f6a1-7af1-4f48-91d9-900603a134cb\") " pod="openshift-must-gather-smj9w/crc-debug-vvcbp" Oct 12 07:11:57 crc kubenswrapper[4930]: I1012 07:11:57.135404 4930 scope.go:117] "RemoveContainer" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:11:57 crc kubenswrapper[4930]: E1012 07:11:57.135718 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:11:57 crc kubenswrapper[4930]: I1012 07:11:57.153390 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-smj9w/crc-debug-vvcbp" Oct 12 07:11:57 crc kubenswrapper[4930]: I1012 07:11:57.539214 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-smj9w/crc-debug-vvcbp" event={"ID":"2177f6a1-7af1-4f48-91d9-900603a134cb","Type":"ContainerStarted","Data":"d088c9733a8ebc3327ee509757ec27a0f7f126f7343ddf20f273a77c0623b6b5"} Oct 12 07:11:57 crc kubenswrapper[4930]: I1012 07:11:57.539506 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-smj9w/crc-debug-vvcbp" event={"ID":"2177f6a1-7af1-4f48-91d9-900603a134cb","Type":"ContainerStarted","Data":"9920c5e600556b08824cc1e03c379363cc31828d18f833052bdcd3543a674384"} Oct 12 07:11:57 crc kubenswrapper[4930]: I1012 07:11:57.562901 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-smj9w/crc-debug-vvcbp" podStartSLOduration=1.5628815390000002 podStartE2EDuration="1.562881539s" podCreationTimestamp="2025-10-12 07:11:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:11:57.552838944 +0000 UTC m=+5450.094940719" watchObservedRunningTime="2025-10-12 07:11:57.562881539 +0000 UTC m=+5450.104983314" Oct 12 07:11:58 crc kubenswrapper[4930]: I1012 07:11:58.548784 4930 generic.go:334] "Generic (PLEG): container finished" podID="2177f6a1-7af1-4f48-91d9-900603a134cb" containerID="d088c9733a8ebc3327ee509757ec27a0f7f126f7343ddf20f273a77c0623b6b5" exitCode=0 Oct 12 07:11:58 crc kubenswrapper[4930]: I1012 07:11:58.548832 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-smj9w/crc-debug-vvcbp" event={"ID":"2177f6a1-7af1-4f48-91d9-900603a134cb","Type":"ContainerDied","Data":"d088c9733a8ebc3327ee509757ec27a0f7f126f7343ddf20f273a77c0623b6b5"} Oct 12 07:11:59 crc kubenswrapper[4930]: I1012 07:11:59.668908 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-smj9w/crc-debug-vvcbp" Oct 12 07:11:59 crc kubenswrapper[4930]: I1012 07:11:59.702918 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-smj9w/crc-debug-vvcbp"] Oct 12 07:11:59 crc kubenswrapper[4930]: I1012 07:11:59.711543 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-smj9w/crc-debug-vvcbp"] Oct 12 07:11:59 crc kubenswrapper[4930]: I1012 07:11:59.786695 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2177f6a1-7af1-4f48-91d9-900603a134cb-host\") pod \"2177f6a1-7af1-4f48-91d9-900603a134cb\" (UID: \"2177f6a1-7af1-4f48-91d9-900603a134cb\") " Oct 12 07:11:59 crc kubenswrapper[4930]: I1012 07:11:59.786823 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2177f6a1-7af1-4f48-91d9-900603a134cb-host" (OuterVolumeSpecName: "host") pod "2177f6a1-7af1-4f48-91d9-900603a134cb" (UID: "2177f6a1-7af1-4f48-91d9-900603a134cb"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:11:59 crc kubenswrapper[4930]: I1012 07:11:59.786969 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zb8l\" (UniqueName: \"kubernetes.io/projected/2177f6a1-7af1-4f48-91d9-900603a134cb-kube-api-access-5zb8l\") pod \"2177f6a1-7af1-4f48-91d9-900603a134cb\" (UID: \"2177f6a1-7af1-4f48-91d9-900603a134cb\") " Oct 12 07:11:59 crc kubenswrapper[4930]: I1012 07:11:59.787455 4930 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2177f6a1-7af1-4f48-91d9-900603a134cb-host\") on node \"crc\" DevicePath \"\"" Oct 12 07:11:59 crc kubenswrapper[4930]: I1012 07:11:59.796918 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2177f6a1-7af1-4f48-91d9-900603a134cb-kube-api-access-5zb8l" (OuterVolumeSpecName: "kube-api-access-5zb8l") pod "2177f6a1-7af1-4f48-91d9-900603a134cb" (UID: "2177f6a1-7af1-4f48-91d9-900603a134cb"). InnerVolumeSpecName "kube-api-access-5zb8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:11:59 crc kubenswrapper[4930]: I1012 07:11:59.889645 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zb8l\" (UniqueName: \"kubernetes.io/projected/2177f6a1-7af1-4f48-91d9-900603a134cb-kube-api-access-5zb8l\") on node \"crc\" DevicePath \"\"" Oct 12 07:12:00 crc kubenswrapper[4930]: I1012 07:12:00.148969 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2177f6a1-7af1-4f48-91d9-900603a134cb" path="/var/lib/kubelet/pods/2177f6a1-7af1-4f48-91d9-900603a134cb/volumes" Oct 12 07:12:00 crc kubenswrapper[4930]: I1012 07:12:00.568543 4930 scope.go:117] "RemoveContainer" containerID="d088c9733a8ebc3327ee509757ec27a0f7f126f7343ddf20f273a77c0623b6b5" Oct 12 07:12:00 crc kubenswrapper[4930]: I1012 07:12:00.568902 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-smj9w/crc-debug-vvcbp" Oct 12 07:12:00 crc kubenswrapper[4930]: I1012 07:12:00.897683 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-smj9w/crc-debug-bl5j7"] Oct 12 07:12:00 crc kubenswrapper[4930]: E1012 07:12:00.898319 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2177f6a1-7af1-4f48-91d9-900603a134cb" containerName="container-00" Oct 12 07:12:00 crc kubenswrapper[4930]: I1012 07:12:00.898332 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="2177f6a1-7af1-4f48-91d9-900603a134cb" containerName="container-00" Oct 12 07:12:00 crc kubenswrapper[4930]: I1012 07:12:00.898546 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="2177f6a1-7af1-4f48-91d9-900603a134cb" containerName="container-00" Oct 12 07:12:00 crc kubenswrapper[4930]: I1012 07:12:00.899182 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-smj9w/crc-debug-bl5j7" Oct 12 07:12:00 crc kubenswrapper[4930]: I1012 07:12:00.900896 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-smj9w"/"default-dockercfg-pxzgv" Oct 12 07:12:01 crc kubenswrapper[4930]: I1012 07:12:01.012713 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l94qj\" (UniqueName: \"kubernetes.io/projected/46317ce4-695e-4024-ab3b-fba5817fdb40-kube-api-access-l94qj\") pod \"crc-debug-bl5j7\" (UID: \"46317ce4-695e-4024-ab3b-fba5817fdb40\") " pod="openshift-must-gather-smj9w/crc-debug-bl5j7" Oct 12 07:12:01 crc kubenswrapper[4930]: I1012 07:12:01.012777 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46317ce4-695e-4024-ab3b-fba5817fdb40-host\") pod \"crc-debug-bl5j7\" (UID: \"46317ce4-695e-4024-ab3b-fba5817fdb40\") " pod="openshift-must-gather-smj9w/crc-debug-bl5j7" Oct 12 07:12:01 crc kubenswrapper[4930]: I1012 07:12:01.114599 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l94qj\" (UniqueName: \"kubernetes.io/projected/46317ce4-695e-4024-ab3b-fba5817fdb40-kube-api-access-l94qj\") pod \"crc-debug-bl5j7\" (UID: \"46317ce4-695e-4024-ab3b-fba5817fdb40\") " pod="openshift-must-gather-smj9w/crc-debug-bl5j7" Oct 12 07:12:01 crc kubenswrapper[4930]: I1012 07:12:01.114657 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46317ce4-695e-4024-ab3b-fba5817fdb40-host\") pod \"crc-debug-bl5j7\" (UID: \"46317ce4-695e-4024-ab3b-fba5817fdb40\") " pod="openshift-must-gather-smj9w/crc-debug-bl5j7" Oct 12 07:12:01 crc kubenswrapper[4930]: I1012 07:12:01.114875 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46317ce4-695e-4024-ab3b-fba5817fdb40-host\") pod \"crc-debug-bl5j7\" (UID: \"46317ce4-695e-4024-ab3b-fba5817fdb40\") " pod="openshift-must-gather-smj9w/crc-debug-bl5j7" Oct 12 07:12:01 crc kubenswrapper[4930]: I1012 07:12:01.133164 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l94qj\" (UniqueName: \"kubernetes.io/projected/46317ce4-695e-4024-ab3b-fba5817fdb40-kube-api-access-l94qj\") pod \"crc-debug-bl5j7\" (UID: \"46317ce4-695e-4024-ab3b-fba5817fdb40\") " pod="openshift-must-gather-smj9w/crc-debug-bl5j7" Oct 12 07:12:01 crc kubenswrapper[4930]: I1012 07:12:01.220410 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-smj9w/crc-debug-bl5j7" Oct 12 07:12:01 crc kubenswrapper[4930]: W1012 07:12:01.258270 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46317ce4_695e_4024_ab3b_fba5817fdb40.slice/crio-85067c96578eb2a13a1fae6faffb89441ee6d55cd612439cbd15fe5d8dba8fb0 WatchSource:0}: Error finding container 85067c96578eb2a13a1fae6faffb89441ee6d55cd612439cbd15fe5d8dba8fb0: Status 404 returned error can't find the container with id 85067c96578eb2a13a1fae6faffb89441ee6d55cd612439cbd15fe5d8dba8fb0 Oct 12 07:12:01 crc kubenswrapper[4930]: I1012 07:12:01.578158 4930 generic.go:334] "Generic (PLEG): container finished" podID="46317ce4-695e-4024-ab3b-fba5817fdb40" containerID="043b967643a56b1a3c135f9b8ac283a0dee5a72562344438895e16f14ffa339f" exitCode=0 Oct 12 07:12:01 crc kubenswrapper[4930]: I1012 07:12:01.578239 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-smj9w/crc-debug-bl5j7" event={"ID":"46317ce4-695e-4024-ab3b-fba5817fdb40","Type":"ContainerDied","Data":"043b967643a56b1a3c135f9b8ac283a0dee5a72562344438895e16f14ffa339f"} Oct 12 07:12:01 crc kubenswrapper[4930]: I1012 07:12:01.578533 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-smj9w/crc-debug-bl5j7" event={"ID":"46317ce4-695e-4024-ab3b-fba5817fdb40","Type":"ContainerStarted","Data":"85067c96578eb2a13a1fae6faffb89441ee6d55cd612439cbd15fe5d8dba8fb0"} Oct 12 07:12:01 crc kubenswrapper[4930]: I1012 07:12:01.616825 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-smj9w/crc-debug-bl5j7"] Oct 12 07:12:01 crc kubenswrapper[4930]: I1012 07:12:01.626391 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-smj9w/crc-debug-bl5j7"] Oct 12 07:12:02 crc kubenswrapper[4930]: I1012 07:12:02.697855 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-smj9w/crc-debug-bl5j7" Oct 12 07:12:02 crc kubenswrapper[4930]: I1012 07:12:02.843930 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46317ce4-695e-4024-ab3b-fba5817fdb40-host\") pod \"46317ce4-695e-4024-ab3b-fba5817fdb40\" (UID: \"46317ce4-695e-4024-ab3b-fba5817fdb40\") " Oct 12 07:12:02 crc kubenswrapper[4930]: I1012 07:12:02.844025 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l94qj\" (UniqueName: \"kubernetes.io/projected/46317ce4-695e-4024-ab3b-fba5817fdb40-kube-api-access-l94qj\") pod \"46317ce4-695e-4024-ab3b-fba5817fdb40\" (UID: \"46317ce4-695e-4024-ab3b-fba5817fdb40\") " Oct 12 07:12:02 crc kubenswrapper[4930]: I1012 07:12:02.845563 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46317ce4-695e-4024-ab3b-fba5817fdb40-host" (OuterVolumeSpecName: "host") pod "46317ce4-695e-4024-ab3b-fba5817fdb40" (UID: "46317ce4-695e-4024-ab3b-fba5817fdb40"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:12:02 crc kubenswrapper[4930]: I1012 07:12:02.850445 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46317ce4-695e-4024-ab3b-fba5817fdb40-kube-api-access-l94qj" (OuterVolumeSpecName: "kube-api-access-l94qj") pod "46317ce4-695e-4024-ab3b-fba5817fdb40" (UID: "46317ce4-695e-4024-ab3b-fba5817fdb40"). 
InnerVolumeSpecName "kube-api-access-l94qj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:12:02 crc kubenswrapper[4930]: I1012 07:12:02.947786 4930 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46317ce4-695e-4024-ab3b-fba5817fdb40-host\") on node \"crc\" DevicePath \"\"" Oct 12 07:12:02 crc kubenswrapper[4930]: I1012 07:12:02.947829 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l94qj\" (UniqueName: \"kubernetes.io/projected/46317ce4-695e-4024-ab3b-fba5817fdb40-kube-api-access-l94qj\") on node \"crc\" DevicePath \"\"" Oct 12 07:12:03 crc kubenswrapper[4930]: I1012 07:12:03.601651 4930 scope.go:117] "RemoveContainer" containerID="043b967643a56b1a3c135f9b8ac283a0dee5a72562344438895e16f14ffa339f" Oct 12 07:12:03 crc kubenswrapper[4930]: I1012 07:12:03.601676 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-smj9w/crc-debug-bl5j7" Oct 12 07:12:04 crc kubenswrapper[4930]: I1012 07:12:04.144766 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46317ce4-695e-4024-ab3b-fba5817fdb40" path="/var/lib/kubelet/pods/46317ce4-695e-4024-ab3b-fba5817fdb40/volumes" Oct 12 07:12:10 crc kubenswrapper[4930]: I1012 07:12:10.138630 4930 scope.go:117] "RemoveContainer" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:12:10 crc kubenswrapper[4930]: E1012 07:12:10.140683 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:12:11 crc kubenswrapper[4930]: I1012 07:12:11.538307 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nkddp"] Oct 12 07:12:11 crc kubenswrapper[4930]: E1012 07:12:11.539142 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46317ce4-695e-4024-ab3b-fba5817fdb40" containerName="container-00" Oct 12 07:12:11 crc kubenswrapper[4930]: I1012 07:12:11.539167 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="46317ce4-695e-4024-ab3b-fba5817fdb40" containerName="container-00" Oct 12 07:12:11 crc kubenswrapper[4930]: I1012 07:12:11.539796 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="46317ce4-695e-4024-ab3b-fba5817fdb40" containerName="container-00" Oct 12 07:12:11 crc kubenswrapper[4930]: I1012 07:12:11.546440 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nkddp" Oct 12 07:12:11 crc kubenswrapper[4930]: I1012 07:12:11.563218 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nkddp"] Oct 12 07:12:11 crc kubenswrapper[4930]: I1012 07:12:11.625145 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b68sc\" (UniqueName: \"kubernetes.io/projected/9b5ca05b-19f7-4863-b08d-76b4a29307bf-kube-api-access-b68sc\") pod \"community-operators-nkddp\" (UID: \"9b5ca05b-19f7-4863-b08d-76b4a29307bf\") " pod="openshift-marketplace/community-operators-nkddp" Oct 12 07:12:11 crc kubenswrapper[4930]: I1012 07:12:11.625441 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b5ca05b-19f7-4863-b08d-76b4a29307bf-catalog-content\") pod \"community-operators-nkddp\" (UID: \"9b5ca05b-19f7-4863-b08d-76b4a29307bf\") " pod="openshift-marketplace/community-operators-nkddp" Oct 12 07:12:11 crc kubenswrapper[4930]: I1012 07:12:11.625517 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b5ca05b-19f7-4863-b08d-76b4a29307bf-utilities\") pod \"community-operators-nkddp\" (UID: \"9b5ca05b-19f7-4863-b08d-76b4a29307bf\") " pod="openshift-marketplace/community-operators-nkddp" Oct 12 07:12:11 crc kubenswrapper[4930]: I1012 07:12:11.729795 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b5ca05b-19f7-4863-b08d-76b4a29307bf-catalog-content\") pod \"community-operators-nkddp\" (UID: \"9b5ca05b-19f7-4863-b08d-76b4a29307bf\") " pod="openshift-marketplace/community-operators-nkddp" Oct 12 07:12:11 crc kubenswrapper[4930]: I1012 07:12:11.731774 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b5ca05b-19f7-4863-b08d-76b4a29307bf-catalog-content\") pod \"community-operators-nkddp\" (UID: \"9b5ca05b-19f7-4863-b08d-76b4a29307bf\") " pod="openshift-marketplace/community-operators-nkddp" Oct 12 07:12:11 crc kubenswrapper[4930]: I1012 07:12:11.732824 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b5ca05b-19f7-4863-b08d-76b4a29307bf-utilities\") pod \"community-operators-nkddp\" (UID: \"9b5ca05b-19f7-4863-b08d-76b4a29307bf\") " pod="openshift-marketplace/community-operators-nkddp" Oct 12 07:12:11 crc kubenswrapper[4930]: I1012 07:12:11.733317 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b68sc\" (UniqueName: \"kubernetes.io/projected/9b5ca05b-19f7-4863-b08d-76b4a29307bf-kube-api-access-b68sc\") pod \"community-operators-nkddp\" (UID: \"9b5ca05b-19f7-4863-b08d-76b4a29307bf\") " pod="openshift-marketplace/community-operators-nkddp" Oct 12 07:12:11 crc kubenswrapper[4930]: I1012 07:12:11.733508 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b5ca05b-19f7-4863-b08d-76b4a29307bf-utilities\") pod \"community-operators-nkddp\" (UID: \"9b5ca05b-19f7-4863-b08d-76b4a29307bf\") " pod="openshift-marketplace/community-operators-nkddp" Oct 12 07:12:11 crc kubenswrapper[4930]: I1012 07:12:11.759547 4930 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-b68sc\" (UniqueName: \"kubernetes.io/projected/9b5ca05b-19f7-4863-b08d-76b4a29307bf-kube-api-access-b68sc\") pod \"community-operators-nkddp\" (UID: \"9b5ca05b-19f7-4863-b08d-76b4a29307bf\") " pod="openshift-marketplace/community-operators-nkddp" Oct 12 07:12:11 crc kubenswrapper[4930]: I1012 07:12:11.889068 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nkddp" Oct 12 07:12:12 crc kubenswrapper[4930]: I1012 07:12:12.411131 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nkddp"] Oct 12 07:12:12 crc kubenswrapper[4930]: I1012 07:12:12.712847 4930 generic.go:334] "Generic (PLEG): container finished" podID="9b5ca05b-19f7-4863-b08d-76b4a29307bf" containerID="620b5172b2bc1db13bc7e6bb6b7a4e3dd8041981c9542c5796186cec41286b6b" exitCode=0 Oct 12 07:12:12 crc kubenswrapper[4930]: I1012 07:12:12.712914 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkddp" event={"ID":"9b5ca05b-19f7-4863-b08d-76b4a29307bf","Type":"ContainerDied","Data":"620b5172b2bc1db13bc7e6bb6b7a4e3dd8041981c9542c5796186cec41286b6b"} Oct 12 07:12:12 crc kubenswrapper[4930]: I1012 07:12:12.712984 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkddp" event={"ID":"9b5ca05b-19f7-4863-b08d-76b4a29307bf","Type":"ContainerStarted","Data":"619cc4c16beff85046b0dd630a490e92a82ca2b9e966003432215ae947ecc320"} Oct 12 07:12:13 crc kubenswrapper[4930]: I1012 07:12:13.724839 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkddp" event={"ID":"9b5ca05b-19f7-4863-b08d-76b4a29307bf","Type":"ContainerStarted","Data":"e34a3395962e7958650d92e1f3f912393566c101d82b8b5a37a0413598601c49"} Oct 12 07:12:15 crc kubenswrapper[4930]: I1012 07:12:15.752631 4930 generic.go:334] "Generic (PLEG): container finished" podID="9b5ca05b-19f7-4863-b08d-76b4a29307bf" containerID="e34a3395962e7958650d92e1f3f912393566c101d82b8b5a37a0413598601c49" exitCode=0 Oct 12 07:12:15 crc kubenswrapper[4930]: I1012 07:12:15.752733 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkddp" event={"ID":"9b5ca05b-19f7-4863-b08d-76b4a29307bf","Type":"ContainerDied","Data":"e34a3395962e7958650d92e1f3f912393566c101d82b8b5a37a0413598601c49"} Oct 12 07:12:16 crc kubenswrapper[4930]: I1012 07:12:16.770755 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkddp" event={"ID":"9b5ca05b-19f7-4863-b08d-76b4a29307bf","Type":"ContainerStarted","Data":"ecbaea0ece6079150706881b5e9483db553584abe3bb7e44f58ec584292e4e93"} Oct 12 07:12:16 crc kubenswrapper[4930]: I1012 07:12:16.808659 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nkddp" podStartSLOduration=2.216826139 podStartE2EDuration="5.808640514s" podCreationTimestamp="2025-10-12 07:12:11 +0000 UTC" firstStartedPulling="2025-10-12 07:12:12.715333758 +0000 UTC m=+5465.257435563" lastFinishedPulling="2025-10-12 07:12:16.307148143 +0000 UTC m=+5468.849249938" observedRunningTime="2025-10-12 07:12:16.797930733 +0000 UTC m=+5469.340032498" watchObservedRunningTime="2025-10-12 07:12:16.808640514 +0000 UTC m=+5469.350742289" Oct 12 07:12:21 crc kubenswrapper[4930]: I1012 07:12:21.135573 4930 scope.go:117] "RemoveContainer" 
containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:12:21 crc kubenswrapper[4930]: E1012 07:12:21.136261 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:12:21 crc kubenswrapper[4930]: I1012 07:12:21.679786 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-696d7778c8-zcb9x_b054ea5a-466c-432d-aa75-7af68a134c5e/barbican-api/0.log" Oct 12 07:12:21 crc kubenswrapper[4930]: I1012 07:12:21.861928 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-696d7778c8-zcb9x_b054ea5a-466c-432d-aa75-7af68a134c5e/barbican-api-log/0.log" Oct 12 07:12:21 crc kubenswrapper[4930]: I1012 07:12:21.881393 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-587996ddf4-fcrwq_f37233c9-4b67-4e63-949a-24fd340b334b/barbican-keystone-listener/0.log" Oct 12 07:12:21 crc kubenswrapper[4930]: I1012 07:12:21.889849 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nkddp" Oct 12 07:12:21 crc kubenswrapper[4930]: I1012 07:12:21.889968 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nkddp" Oct 12 07:12:21 crc kubenswrapper[4930]: I1012 07:12:21.941346 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nkddp" Oct 12 07:12:21 crc kubenswrapper[4930]: I1012 07:12:21.949654 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-587996ddf4-fcrwq_f37233c9-4b67-4e63-949a-24fd340b334b/barbican-keystone-listener-log/0.log" Oct 12 07:12:22 crc kubenswrapper[4930]: I1012 07:12:22.069070 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-76d484677c-ptrh6_e26aa90e-071d-46ff-8fa1-b86f43a70e01/barbican-worker-log/0.log" Oct 12 07:12:22 crc kubenswrapper[4930]: I1012 07:12:22.105521 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-76d484677c-ptrh6_e26aa90e-071d-46ff-8fa1-b86f43a70e01/barbican-worker/0.log" Oct 12 07:12:22 crc kubenswrapper[4930]: I1012 07:12:22.278994 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn_fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:12:22 crc kubenswrapper[4930]: I1012 07:12:22.409299 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1d036de3-1a99-408d-8677-978880f41705/ceilometer-central-agent/0.log" Oct 12 07:12:22 crc kubenswrapper[4930]: I1012 07:12:22.443564 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1d036de3-1a99-408d-8677-978880f41705/ceilometer-notification-agent/0.log" Oct 12 07:12:22 crc kubenswrapper[4930]: I1012 07:12:22.477410 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1d036de3-1a99-408d-8677-978880f41705/proxy-httpd/0.log" Oct 12 07:12:22 crc kubenswrapper[4930]: I1012 
07:12:22.522705 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1d036de3-1a99-408d-8677-978880f41705/sg-core/0.log" Oct 12 07:12:22 crc kubenswrapper[4930]: I1012 07:12:22.752552 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_eceafd59-b491-4468-b6b6-78fe1c689e6b/cinder-api-log/0.log" Oct 12 07:12:22 crc kubenswrapper[4930]: I1012 07:12:22.805362 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_eceafd59-b491-4468-b6b6-78fe1c689e6b/cinder-api/0.log" Oct 12 07:12:22 crc kubenswrapper[4930]: I1012 07:12:22.851907 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_6c911528-2136-4abe-a716-c75437784628/cinder-scheduler/0.log" Oct 12 07:12:22 crc kubenswrapper[4930]: I1012 07:12:22.882574 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nkddp" Oct 12 07:12:22 crc kubenswrapper[4930]: I1012 07:12:22.931986 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nkddp"] Oct 12 07:12:23 crc kubenswrapper[4930]: I1012 07:12:23.027557 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm_d8b9bf67-f82f-421d-98df-4f8e95911d5a/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:12:23 crc kubenswrapper[4930]: I1012 07:12:23.037251 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_6c911528-2136-4abe-a716-c75437784628/probe/0.log" Oct 12 07:12:23 crc kubenswrapper[4930]: I1012 07:12:23.223200 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm_8096b9b1-e514-4928-8e48-88e6519dc35e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:12:23 crc kubenswrapper[4930]: I1012 07:12:23.519629 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt_eebc8efc-b160-4a75-a213-74fcf9c2595e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:12:23 crc kubenswrapper[4930]: I1012 07:12:23.594905 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-77b58f4b85-jll5z_829039a6-ad10-4532-b406-f497e661fd8d/init/0.log" Oct 12 07:12:23 crc kubenswrapper[4930]: I1012 07:12:23.777199 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-77b58f4b85-jll5z_829039a6-ad10-4532-b406-f497e661fd8d/init/0.log" Oct 12 07:12:23 crc kubenswrapper[4930]: I1012 07:12:23.803839 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t_9727cf23-8270-491f-ba18-218bd73cd0c8/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:12:23 crc kubenswrapper[4930]: I1012 07:12:23.860491 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-77b58f4b85-jll5z_829039a6-ad10-4532-b406-f497e661fd8d/dnsmasq-dns/0.log" Oct 12 07:12:23 crc kubenswrapper[4930]: I1012 07:12:23.996055 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_00546299-d7c8-4536-9059-85a75dc5824e/glance-log/0.log" Oct 12 07:12:24 crc kubenswrapper[4930]: I1012 07:12:24.026327 4930 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_00546299-d7c8-4536-9059-85a75dc5824e/glance-httpd/0.log" Oct 12 07:12:24 crc kubenswrapper[4930]: I1012 07:12:24.175594 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6ceabefb-4c59-49ab-9ec7-cc011d6aa659/glance-log/0.log" Oct 12 07:12:24 crc kubenswrapper[4930]: I1012 07:12:24.216800 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6ceabefb-4c59-49ab-9ec7-cc011d6aa659/glance-httpd/0.log" Oct 12 07:12:24 crc kubenswrapper[4930]: I1012 07:12:24.256493 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6d76466876-jf9t8_a97771f5-bcbe-42d8-bdd8-41b43f8899a0/horizon/0.log" Oct 12 07:12:24 crc kubenswrapper[4930]: I1012 07:12:24.403876 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9_91938dad-6cac-4246-94d7-d93214ae2a5d/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:12:24 crc kubenswrapper[4930]: I1012 07:12:24.505689 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-pckm6_f64044f7-a939-48e9-a986-a3a39f4d1a4a/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:12:24 crc kubenswrapper[4930]: I1012 07:12:24.686558 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6d76466876-jf9t8_a97771f5-bcbe-42d8-bdd8-41b43f8899a0/horizon-log/0.log" Oct 12 07:12:24 crc kubenswrapper[4930]: I1012 07:12:24.747384 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29337481-hzx9c_f8e371c3-204c-4a74-8c6f-49c25f6b7e90/keystone-cron/0.log" Oct 12 07:12:24 crc kubenswrapper[4930]: I1012 07:12:24.853142 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nkddp" podUID="9b5ca05b-19f7-4863-b08d-76b4a29307bf" containerName="registry-server" containerID="cri-o://ecbaea0ece6079150706881b5e9483db553584abe3bb7e44f58ec584292e4e93" gracePeriod=2 Oct 12 07:12:24 crc kubenswrapper[4930]: I1012 07:12:24.911649 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29337541-5db6s_f9a44177-8b21-4b9b-8367-bdb45a60f379/keystone-cron/0.log" Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.043828 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_8a5cf183-2e6a-408d-9baa-2f43f7b7b354/kube-state-metrics/0.log" Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.119361 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-546b85cb56-ln9lt_456742ca-6f3a-485a-81ee-2a4d84df38c8/keystone-api/0.log" Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.271559 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr_1c3c7557-a115-43bb-9147-7faf17337317/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.328032 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nkddp" Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.441249 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b5ca05b-19f7-4863-b08d-76b4a29307bf-utilities\") pod \"9b5ca05b-19f7-4863-b08d-76b4a29307bf\" (UID: \"9b5ca05b-19f7-4863-b08d-76b4a29307bf\") " Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.441378 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b68sc\" (UniqueName: \"kubernetes.io/projected/9b5ca05b-19f7-4863-b08d-76b4a29307bf-kube-api-access-b68sc\") pod \"9b5ca05b-19f7-4863-b08d-76b4a29307bf\" (UID: \"9b5ca05b-19f7-4863-b08d-76b4a29307bf\") " Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.441426 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b5ca05b-19f7-4863-b08d-76b4a29307bf-catalog-content\") pod \"9b5ca05b-19f7-4863-b08d-76b4a29307bf\" (UID: \"9b5ca05b-19f7-4863-b08d-76b4a29307bf\") " Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.442317 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b5ca05b-19f7-4863-b08d-76b4a29307bf-utilities" (OuterVolumeSpecName: "utilities") pod "9b5ca05b-19f7-4863-b08d-76b4a29307bf" (UID: "9b5ca05b-19f7-4863-b08d-76b4a29307bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.447363 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b5ca05b-19f7-4863-b08d-76b4a29307bf-kube-api-access-b68sc" (OuterVolumeSpecName: "kube-api-access-b68sc") pod "9b5ca05b-19f7-4863-b08d-76b4a29307bf" (UID: "9b5ca05b-19f7-4863-b08d-76b4a29307bf"). InnerVolumeSpecName "kube-api-access-b68sc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.496126 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b5ca05b-19f7-4863-b08d-76b4a29307bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b5ca05b-19f7-4863-b08d-76b4a29307bf" (UID: "9b5ca05b-19f7-4863-b08d-76b4a29307bf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.543963 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b68sc\" (UniqueName: \"kubernetes.io/projected/9b5ca05b-19f7-4863-b08d-76b4a29307bf-kube-api-access-b68sc\") on node \"crc\" DevicePath \"\"" Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.543992 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b5ca05b-19f7-4863-b08d-76b4a29307bf-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.544001 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b5ca05b-19f7-4863-b08d-76b4a29307bf-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.624331 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8_31615b46-4290-46db-993e-3e5afa29c3f6/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.727426 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6844c9655c-rvdcz_532bae95-f9fc-4633-b53c-2f398cbb8bd2/neutron-httpd/0.log" Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.834639 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6844c9655c-rvdcz_532bae95-f9fc-4633-b53c-2f398cbb8bd2/neutron-api/0.log" Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.865188 4930 generic.go:334] "Generic (PLEG): container finished" podID="9b5ca05b-19f7-4863-b08d-76b4a29307bf" containerID="ecbaea0ece6079150706881b5e9483db553584abe3bb7e44f58ec584292e4e93" exitCode=0 Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.865228 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkddp" event={"ID":"9b5ca05b-19f7-4863-b08d-76b4a29307bf","Type":"ContainerDied","Data":"ecbaea0ece6079150706881b5e9483db553584abe3bb7e44f58ec584292e4e93"} Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.865254 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkddp" event={"ID":"9b5ca05b-19f7-4863-b08d-76b4a29307bf","Type":"ContainerDied","Data":"619cc4c16beff85046b0dd630a490e92a82ca2b9e966003432215ae947ecc320"} Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.865271 4930 scope.go:117] "RemoveContainer" containerID="ecbaea0ece6079150706881b5e9483db553584abe3bb7e44f58ec584292e4e93" Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.865297 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nkddp" Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.886087 4930 scope.go:117] "RemoveContainer" containerID="e34a3395962e7958650d92e1f3f912393566c101d82b8b5a37a0413598601c49" Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.939759 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nkddp"] Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.949392 4930 scope.go:117] "RemoveContainer" containerID="620b5172b2bc1db13bc7e6bb6b7a4e3dd8041981c9542c5796186cec41286b6b" Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.964431 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nkddp"] Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.985409 4930 scope.go:117] "RemoveContainer" containerID="ecbaea0ece6079150706881b5e9483db553584abe3bb7e44f58ec584292e4e93" Oct 12 07:12:25 crc kubenswrapper[4930]: E1012 07:12:25.985917 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecbaea0ece6079150706881b5e9483db553584abe3bb7e44f58ec584292e4e93\": container with ID starting with ecbaea0ece6079150706881b5e9483db553584abe3bb7e44f58ec584292e4e93 not found: ID does not exist" containerID="ecbaea0ece6079150706881b5e9483db553584abe3bb7e44f58ec584292e4e93" Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.986042 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecbaea0ece6079150706881b5e9483db553584abe3bb7e44f58ec584292e4e93"} err="failed to get container status \"ecbaea0ece6079150706881b5e9483db553584abe3bb7e44f58ec584292e4e93\": rpc error: code = NotFound desc = could not find container \"ecbaea0ece6079150706881b5e9483db553584abe3bb7e44f58ec584292e4e93\": container with ID starting with ecbaea0ece6079150706881b5e9483db553584abe3bb7e44f58ec584292e4e93 not found: ID does not exist" Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.986162 4930 scope.go:117] "RemoveContainer" containerID="e34a3395962e7958650d92e1f3f912393566c101d82b8b5a37a0413598601c49" Oct 12 07:12:25 crc kubenswrapper[4930]: E1012 07:12:25.986621 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e34a3395962e7958650d92e1f3f912393566c101d82b8b5a37a0413598601c49\": container with ID starting with e34a3395962e7958650d92e1f3f912393566c101d82b8b5a37a0413598601c49 not found: ID does not exist" containerID="e34a3395962e7958650d92e1f3f912393566c101d82b8b5a37a0413598601c49" Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.986664 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e34a3395962e7958650d92e1f3f912393566c101d82b8b5a37a0413598601c49"} err="failed to get container status \"e34a3395962e7958650d92e1f3f912393566c101d82b8b5a37a0413598601c49\": rpc error: code = NotFound desc = could not find container \"e34a3395962e7958650d92e1f3f912393566c101d82b8b5a37a0413598601c49\": container with ID starting with e34a3395962e7958650d92e1f3f912393566c101d82b8b5a37a0413598601c49 not found: ID does not exist" Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.986689 4930 scope.go:117] "RemoveContainer" containerID="620b5172b2bc1db13bc7e6bb6b7a4e3dd8041981c9542c5796186cec41286b6b" Oct 12 07:12:25 crc kubenswrapper[4930]: E1012 07:12:25.987015 4930 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"620b5172b2bc1db13bc7e6bb6b7a4e3dd8041981c9542c5796186cec41286b6b\": container with ID starting with 620b5172b2bc1db13bc7e6bb6b7a4e3dd8041981c9542c5796186cec41286b6b not found: ID does not exist" containerID="620b5172b2bc1db13bc7e6bb6b7a4e3dd8041981c9542c5796186cec41286b6b" Oct 12 07:12:25 crc kubenswrapper[4930]: I1012 07:12:25.987046 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"620b5172b2bc1db13bc7e6bb6b7a4e3dd8041981c9542c5796186cec41286b6b"} err="failed to get container status \"620b5172b2bc1db13bc7e6bb6b7a4e3dd8041981c9542c5796186cec41286b6b\": rpc error: code = NotFound desc = could not find container \"620b5172b2bc1db13bc7e6bb6b7a4e3dd8041981c9542c5796186cec41286b6b\": container with ID starting with 620b5172b2bc1db13bc7e6bb6b7a4e3dd8041981c9542c5796186cec41286b6b not found: ID does not exist" Oct 12 07:12:26 crc kubenswrapper[4930]: I1012 07:12:26.157402 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b5ca05b-19f7-4863-b08d-76b4a29307bf" path="/var/lib/kubelet/pods/9b5ca05b-19f7-4863-b08d-76b4a29307bf/volumes" Oct 12 07:12:26 crc kubenswrapper[4930]: I1012 07:12:26.349250 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_534efb5a-d958-48db-9d8d-1a49091be4de/nova-cell0-conductor-conductor/0.log" Oct 12 07:12:26 crc kubenswrapper[4930]: I1012 07:12:26.558045 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_e6981a83-f891-4520-8602-a51b9132dbfa/nova-cell1-conductor-conductor/0.log" Oct 12 07:12:26 crc kubenswrapper[4930]: I1012 07:12:26.917553 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_3e40e826-773b-46e2-aa5d-d1efe925bf9f/nova-cell1-novncproxy-novncproxy/0.log" Oct 12 07:12:27 crc kubenswrapper[4930]: I1012 07:12:27.070251 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-t8xlg_e6afdef5-2b76-470f-9fb1-a98ae115072a/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:12:27 crc kubenswrapper[4930]: I1012 07:12:27.459891 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139/nova-metadata-log/0.log" Oct 12 07:12:27 crc kubenswrapper[4930]: I1012 07:12:27.492182 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5709234d-7700-462e-9fa6-7e4f09bd0d91/nova-api-log/0.log" Oct 12 07:12:27 crc kubenswrapper[4930]: I1012 07:12:27.689988 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5709234d-7700-462e-9fa6-7e4f09bd0d91/nova-api-api/0.log" Oct 12 07:12:28 crc kubenswrapper[4930]: I1012 07:12:28.064684 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_58a1e3ca-ad7a-49ac-8129-33ba2953d881/nova-scheduler-scheduler/0.log" Oct 12 07:12:28 crc kubenswrapper[4930]: I1012 07:12:28.772593 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5/mysql-bootstrap/0.log" Oct 12 07:12:28 crc kubenswrapper[4930]: I1012 07:12:28.982341 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5/mysql-bootstrap/0.log" Oct 12 07:12:29 crc kubenswrapper[4930]: I1012 07:12:29.017556 4930 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5/galera/0.log" Oct 12 07:12:29 crc kubenswrapper[4930]: I1012 07:12:29.210544 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e52fde5b-22df-4fea-ae39-2bb2ef6fa033/mysql-bootstrap/0.log" Oct 12 07:12:29 crc kubenswrapper[4930]: I1012 07:12:29.712484 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139/nova-metadata-metadata/0.log" Oct 12 07:12:29 crc kubenswrapper[4930]: I1012 07:12:29.819868 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e52fde5b-22df-4fea-ae39-2bb2ef6fa033/mysql-bootstrap/0.log" Oct 12 07:12:29 crc kubenswrapper[4930]: I1012 07:12:29.857395 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e52fde5b-22df-4fea-ae39-2bb2ef6fa033/galera/0.log" Oct 12 07:12:29 crc kubenswrapper[4930]: I1012 07:12:29.935515 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_1975f875-9e09-4d30-b5d4-2e883f13781b/openstackclient/0.log" Oct 12 07:12:30 crc kubenswrapper[4930]: I1012 07:12:30.028202 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-kpwjv_78fdc19a-5689-461a-89da-3054932b88c3/openstack-network-exporter/0.log" Oct 12 07:12:30 crc kubenswrapper[4930]: I1012 07:12:30.212935 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-nw5dm_ddccae59-8916-4bd7-bffa-041cf574e89e/ovn-controller/0.log" Oct 12 07:12:30 crc kubenswrapper[4930]: I1012 07:12:30.321402 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hjqfl_382036ff-8896-494d-9670-ec527019676f/ovsdb-server-init/0.log" Oct 12 07:12:30 crc kubenswrapper[4930]: I1012 07:12:30.511585 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hjqfl_382036ff-8896-494d-9670-ec527019676f/ovsdb-server-init/0.log" Oct 12 07:12:30 crc kubenswrapper[4930]: I1012 07:12:30.557049 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hjqfl_382036ff-8896-494d-9670-ec527019676f/ovsdb-server/0.log" Oct 12 07:12:30 crc kubenswrapper[4930]: I1012 07:12:30.750950 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-h52cl_1d14f488-c958-4022-b1db-a1161afad246/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:12:30 crc kubenswrapper[4930]: I1012 07:12:30.812727 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b4580a3d-faab-45c2-a7a6-ef2802549ef9/openstack-network-exporter/0.log" Oct 12 07:12:30 crc kubenswrapper[4930]: I1012 07:12:30.953391 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hjqfl_382036ff-8896-494d-9670-ec527019676f/ovs-vswitchd/0.log" Oct 12 07:12:31 crc kubenswrapper[4930]: I1012 07:12:31.001086 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b4580a3d-faab-45c2-a7a6-ef2802549ef9/ovn-northd/0.log" Oct 12 07:12:31 crc kubenswrapper[4930]: I1012 07:12:31.064001 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_94bfaf3d-7abe-446f-b5ca-a359c65039b9/openstack-network-exporter/0.log" Oct 12 07:12:31 crc kubenswrapper[4930]: I1012 07:12:31.169915 4930 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_ovsdbserver-nb-0_94bfaf3d-7abe-446f-b5ca-a359c65039b9/ovsdbserver-nb/0.log" Oct 12 07:12:31 crc kubenswrapper[4930]: I1012 07:12:31.174451 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_acd687f2-88b8-4750-9f1a-ba8fa345e290/openstack-network-exporter/0.log" Oct 12 07:12:31 crc kubenswrapper[4930]: I1012 07:12:31.244516 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_76348a63-90b8-46b5-8856-da5c983b6d72/memcached/0.log" Oct 12 07:12:31 crc kubenswrapper[4930]: I1012 07:12:31.308134 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_acd687f2-88b8-4750-9f1a-ba8fa345e290/ovsdbserver-sb/0.log" Oct 12 07:12:31 crc kubenswrapper[4930]: I1012 07:12:31.533750 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_4a86a13a-7ebf-4fba-b066-f4c1ff705ffd/init-config-reloader/0.log" Oct 12 07:12:31 crc kubenswrapper[4930]: I1012 07:12:31.539122 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-544d94f45b-79l8m_8cfa2a2e-ac4f-415b-9dd2-dabf059ad679/placement-api/0.log" Oct 12 07:12:31 crc kubenswrapper[4930]: I1012 07:12:31.616792 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-544d94f45b-79l8m_8cfa2a2e-ac4f-415b-9dd2-dabf059ad679/placement-log/0.log" Oct 12 07:12:31 crc kubenswrapper[4930]: I1012 07:12:31.895228 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_4a86a13a-7ebf-4fba-b066-f4c1ff705ffd/thanos-sidecar/0.log" Oct 12 07:12:31 crc kubenswrapper[4930]: I1012 07:12:31.925338 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_4a86a13a-7ebf-4fba-b066-f4c1ff705ffd/init-config-reloader/0.log" Oct 12 07:12:31 crc kubenswrapper[4930]: I1012 07:12:31.956203 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_4a86a13a-7ebf-4fba-b066-f4c1ff705ffd/config-reloader/0.log" Oct 12 07:12:31 crc kubenswrapper[4930]: I1012 07:12:31.985781 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_4a86a13a-7ebf-4fba-b066-f4c1ff705ffd/prometheus/0.log" Oct 12 07:12:32 crc kubenswrapper[4930]: I1012 07:12:32.088367 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_62c5d71d-6283-44b4-9b50-96fd50d7ad99/setup-container/0.log" Oct 12 07:12:32 crc kubenswrapper[4930]: I1012 07:12:32.245515 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_62c5d71d-6283-44b4-9b50-96fd50d7ad99/rabbitmq/0.log" Oct 12 07:12:32 crc kubenswrapper[4930]: I1012 07:12:32.314993 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_62c5d71d-6283-44b4-9b50-96fd50d7ad99/setup-container/0.log" Oct 12 07:12:32 crc kubenswrapper[4930]: I1012 07:12:32.332037 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_1594846a-5c2f-49f8-9bea-22661720c5a6/setup-container/0.log" Oct 12 07:12:32 crc kubenswrapper[4930]: I1012 07:12:32.552344 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_1594846a-5c2f-49f8-9bea-22661720c5a6/rabbitmq/0.log" Oct 12 07:12:32 crc kubenswrapper[4930]: I1012 07:12:32.565793 4930 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d/setup-container/0.log" Oct 12 07:12:32 crc kubenswrapper[4930]: I1012 07:12:32.595260 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_1594846a-5c2f-49f8-9bea-22661720c5a6/setup-container/0.log" Oct 12 07:12:32 crc kubenswrapper[4930]: I1012 07:12:32.778055 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d/setup-container/0.log" Oct 12 07:12:32 crc kubenswrapper[4930]: I1012 07:12:32.886895 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt_5afc7c35-49e0-45d7-a3fd-ab6584abe8a7/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:12:32 crc kubenswrapper[4930]: I1012 07:12:32.907441 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d/rabbitmq/0.log" Oct 12 07:12:33 crc kubenswrapper[4930]: I1012 07:12:33.013117 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-rj48r_4f819c97-4853-42ee-ac71-a252fefe38c5/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:12:33 crc kubenswrapper[4930]: I1012 07:12:33.114087 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg_6b8c39cb-bec0-49b4-a4bb-5949e695db04/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:12:33 crc kubenswrapper[4930]: I1012 07:12:33.135045 4930 scope.go:117] "RemoveContainer" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:12:33 crc kubenswrapper[4930]: E1012 07:12:33.135352 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:12:33 crc kubenswrapper[4930]: I1012 07:12:33.152183 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-nnxtv_d23402d4-c8f9-4e04-9972-f40037dadec9/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:12:33 crc kubenswrapper[4930]: I1012 07:12:33.325537 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-xksvf_0b9f3033-c01f-46bf-9d12-3e60310ec6f3/ssh-known-hosts-edpm-deployment/0.log" Oct 12 07:12:33 crc kubenswrapper[4930]: I1012 07:12:33.476459 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7d8c9db847-bqfrb_4ed14594-beb5-4ce3-bf04-4a9299a932be/proxy-server/0.log" Oct 12 07:12:33 crc kubenswrapper[4930]: I1012 07:12:33.561022 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-xldxb_df839ac4-27ff-436b-b328-55b948887fce/swift-ring-rebalance/0.log" Oct 12 07:12:33 crc kubenswrapper[4930]: I1012 07:12:33.564137 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7d8c9db847-bqfrb_4ed14594-beb5-4ce3-bf04-4a9299a932be/proxy-httpd/0.log" Oct 12 07:12:33 crc kubenswrapper[4930]: I1012 07:12:33.681256 4930 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/account-auditor/0.log" Oct 12 07:12:33 crc kubenswrapper[4930]: I1012 07:12:33.726285 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/account-reaper/0.log" Oct 12 07:12:33 crc kubenswrapper[4930]: I1012 07:12:33.781754 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/account-replicator/0.log" Oct 12 07:12:33 crc kubenswrapper[4930]: I1012 07:12:33.795384 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/account-server/0.log" Oct 12 07:12:33 crc kubenswrapper[4930]: I1012 07:12:33.896230 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/container-auditor/0.log" Oct 12 07:12:33 crc kubenswrapper[4930]: I1012 07:12:33.938800 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/container-replicator/0.log" Oct 12 07:12:33 crc kubenswrapper[4930]: I1012 07:12:33.968527 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/container-server/0.log" Oct 12 07:12:33 crc kubenswrapper[4930]: I1012 07:12:33.980486 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/container-updater/0.log" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.021981 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/object-auditor/0.log" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.144124 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/object-replicator/0.log" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.154633 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/object-expirer/0.log" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.163493 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/object-server/0.log" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.171025 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/object-updater/0.log" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.242496 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/rsync/0.log" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.342983 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/swift-recon-cron/0.log" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.500332 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj_fa0905ab-f3dc-41c6-b517-9f9ac23d7adc/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.507571 4930 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tempest-tests-tempest_acf9e824-abb4-4b1c-9925-c7794fafaad4/tempest-tests-tempest-tests-runner/0.log" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.546201 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7szpd"] Oct 12 07:12:34 crc kubenswrapper[4930]: E1012 07:12:34.546622 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b5ca05b-19f7-4863-b08d-76b4a29307bf" containerName="extract-content" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.546644 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b5ca05b-19f7-4863-b08d-76b4a29307bf" containerName="extract-content" Oct 12 07:12:34 crc kubenswrapper[4930]: E1012 07:12:34.546682 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b5ca05b-19f7-4863-b08d-76b4a29307bf" containerName="extract-utilities" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.546690 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b5ca05b-19f7-4863-b08d-76b4a29307bf" containerName="extract-utilities" Oct 12 07:12:34 crc kubenswrapper[4930]: E1012 07:12:34.546707 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b5ca05b-19f7-4863-b08d-76b4a29307bf" containerName="registry-server" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.546714 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b5ca05b-19f7-4863-b08d-76b4a29307bf" containerName="registry-server" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.546937 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b5ca05b-19f7-4863-b08d-76b4a29307bf" containerName="registry-server" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.548363 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7szpd" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.586863 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7szpd"] Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.625543 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3-catalog-content\") pod \"redhat-operators-7szpd\" (UID: \"f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3\") " pod="openshift-marketplace/redhat-operators-7szpd" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.625610 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrww8\" (UniqueName: \"kubernetes.io/projected/f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3-kube-api-access-rrww8\") pod \"redhat-operators-7szpd\" (UID: \"f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3\") " pod="openshift-marketplace/redhat-operators-7szpd" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.625665 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3-utilities\") pod \"redhat-operators-7szpd\" (UID: \"f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3\") " pod="openshift-marketplace/redhat-operators-7szpd" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.645690 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_da6596b0-b6eb-4af2-97dd-7bda46883284/test-operator-logs-container/0.log" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.727710 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3-catalog-content\") pod \"redhat-operators-7szpd\" (UID: \"f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3\") " pod="openshift-marketplace/redhat-operators-7szpd" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.727788 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrww8\" (UniqueName: \"kubernetes.io/projected/f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3-kube-api-access-rrww8\") pod \"redhat-operators-7szpd\" (UID: \"f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3\") " pod="openshift-marketplace/redhat-operators-7szpd" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.727843 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3-utilities\") pod \"redhat-operators-7szpd\" (UID: \"f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3\") " pod="openshift-marketplace/redhat-operators-7szpd" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.728281 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3-utilities\") pod \"redhat-operators-7szpd\" (UID: \"f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3\") " pod="openshift-marketplace/redhat-operators-7szpd" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.728488 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3-catalog-content\") pod 
\"redhat-operators-7szpd\" (UID: \"f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3\") " pod="openshift-marketplace/redhat-operators-7szpd" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.763788 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrww8\" (UniqueName: \"kubernetes.io/projected/f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3-kube-api-access-rrww8\") pod \"redhat-operators-7szpd\" (UID: \"f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3\") " pod="openshift-marketplace/redhat-operators-7szpd" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.899278 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7szpd" Oct 12 07:12:34 crc kubenswrapper[4930]: I1012 07:12:34.951796 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5_33c2f765-a7aa-4d04-87b3-2f8483e4623a/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:12:35 crc kubenswrapper[4930]: I1012 07:12:35.446506 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7szpd"] Oct 12 07:12:35 crc kubenswrapper[4930]: I1012 07:12:35.841244 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_079031ef-591d-44a8-9a65-fdc0eaea1a0d/watcher-applier/0.log" Oct 12 07:12:35 crc kubenswrapper[4930]: I1012 07:12:35.984721 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_eb68a1f2-d5d6-4fea-b29a-bc253bfc919d/watcher-api-log/0.log" Oct 12 07:12:36 crc kubenswrapper[4930]: I1012 07:12:36.011526 4930 generic.go:334] "Generic (PLEG): container finished" podID="f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3" containerID="4cb98227db959ecd0dd2121df3297ed50b28c699df474010d09e0286d9d27924" exitCode=0 Oct 12 07:12:36 crc kubenswrapper[4930]: I1012 07:12:36.011566 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7szpd" event={"ID":"f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3","Type":"ContainerDied","Data":"4cb98227db959ecd0dd2121df3297ed50b28c699df474010d09e0286d9d27924"} Oct 12 07:12:36 crc kubenswrapper[4930]: I1012 07:12:36.011592 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7szpd" event={"ID":"f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3","Type":"ContainerStarted","Data":"ea6337b5831b7043f344ebed7e98c270da513d332026d576f2de346cde2b0610"} Oct 12 07:12:38 crc kubenswrapper[4930]: I1012 07:12:38.032451 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7szpd" event={"ID":"f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3","Type":"ContainerStarted","Data":"45b95ea15c160281b8f97dc972900c17b01d56c6be87d4775fe56be6aa73296e"} Oct 12 07:12:38 crc kubenswrapper[4930]: I1012 07:12:38.232110 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_220ebc1c-6f2b-4beb-8a34-339ba62a484f/watcher-decision-engine/0.log" Oct 12 07:12:39 crc kubenswrapper[4930]: I1012 07:12:39.134080 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_eb68a1f2-d5d6-4fea-b29a-bc253bfc919d/watcher-api/0.log" Oct 12 07:12:41 crc kubenswrapper[4930]: I1012 07:12:41.063816 4930 generic.go:334] "Generic (PLEG): container finished" podID="f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3" containerID="45b95ea15c160281b8f97dc972900c17b01d56c6be87d4775fe56be6aa73296e" exitCode=0 Oct 12 07:12:41 crc kubenswrapper[4930]: 
I1012 07:12:41.063878 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7szpd" event={"ID":"f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3","Type":"ContainerDied","Data":"45b95ea15c160281b8f97dc972900c17b01d56c6be87d4775fe56be6aa73296e"} Oct 12 07:12:42 crc kubenswrapper[4930]: I1012 07:12:42.074111 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7szpd" event={"ID":"f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3","Type":"ContainerStarted","Data":"77ed7b5a7061a2ebfdd4d67fb8b46b19d61fbd32ae2805f15653d8604fd64c00"} Oct 12 07:12:42 crc kubenswrapper[4930]: I1012 07:12:42.093499 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7szpd" podStartSLOduration=2.6486004469999997 podStartE2EDuration="8.093484467s" podCreationTimestamp="2025-10-12 07:12:34 +0000 UTC" firstStartedPulling="2025-10-12 07:12:36.014299619 +0000 UTC m=+5488.556401374" lastFinishedPulling="2025-10-12 07:12:41.459183629 +0000 UTC m=+5494.001285394" observedRunningTime="2025-10-12 07:12:42.087713386 +0000 UTC m=+5494.629815151" watchObservedRunningTime="2025-10-12 07:12:42.093484467 +0000 UTC m=+5494.635586232" Oct 12 07:12:44 crc kubenswrapper[4930]: I1012 07:12:44.899749 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7szpd" Oct 12 07:12:44 crc kubenswrapper[4930]: I1012 07:12:44.900220 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7szpd" Oct 12 07:12:45 crc kubenswrapper[4930]: I1012 07:12:45.953913 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7szpd" podUID="f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3" containerName="registry-server" probeResult="failure" output=< Oct 12 07:12:45 crc kubenswrapper[4930]: timeout: failed to connect service ":50051" within 1s Oct 12 07:12:45 crc kubenswrapper[4930]: > Oct 12 07:12:48 crc kubenswrapper[4930]: I1012 07:12:48.148684 4930 scope.go:117] "RemoveContainer" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:12:49 crc kubenswrapper[4930]: I1012 07:12:49.135045 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerStarted","Data":"fadbe983b9d7f8cf56d93bde4ead838c4a3a392820ff85b2d655d6e5fed57ddf"} Oct 12 07:12:54 crc kubenswrapper[4930]: I1012 07:12:54.965838 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7szpd" Oct 12 07:12:55 crc kubenswrapper[4930]: I1012 07:12:55.037905 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7szpd" Oct 12 07:12:55 crc kubenswrapper[4930]: I1012 07:12:55.220087 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7szpd"] Oct 12 07:12:56 crc kubenswrapper[4930]: I1012 07:12:56.211601 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7szpd" podUID="f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3" containerName="registry-server" containerID="cri-o://77ed7b5a7061a2ebfdd4d67fb8b46b19d61fbd32ae2805f15653d8604fd64c00" gracePeriod=2 Oct 12 07:12:56 crc kubenswrapper[4930]: I1012 07:12:56.702827 4930 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7szpd" Oct 12 07:12:56 crc kubenswrapper[4930]: I1012 07:12:56.781034 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3-utilities\") pod \"f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3\" (UID: \"f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3\") " Oct 12 07:12:56 crc kubenswrapper[4930]: I1012 07:12:56.781231 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrww8\" (UniqueName: \"kubernetes.io/projected/f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3-kube-api-access-rrww8\") pod \"f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3\" (UID: \"f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3\") " Oct 12 07:12:56 crc kubenswrapper[4930]: I1012 07:12:56.781352 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3-catalog-content\") pod \"f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3\" (UID: \"f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3\") " Oct 12 07:12:56 crc kubenswrapper[4930]: I1012 07:12:56.782258 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3-utilities" (OuterVolumeSpecName: "utilities") pod "f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3" (UID: "f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:12:56 crc kubenswrapper[4930]: I1012 07:12:56.789703 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3-kube-api-access-rrww8" (OuterVolumeSpecName: "kube-api-access-rrww8") pod "f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3" (UID: "f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3"). InnerVolumeSpecName "kube-api-access-rrww8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:12:56 crc kubenswrapper[4930]: I1012 07:12:56.870264 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3" (UID: "f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:12:56 crc kubenswrapper[4930]: I1012 07:12:56.883499 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrww8\" (UniqueName: \"kubernetes.io/projected/f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3-kube-api-access-rrww8\") on node \"crc\" DevicePath \"\"" Oct 12 07:12:56 crc kubenswrapper[4930]: I1012 07:12:56.883862 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 07:12:56 crc kubenswrapper[4930]: I1012 07:12:56.883875 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 07:12:57 crc kubenswrapper[4930]: I1012 07:12:57.221869 4930 generic.go:334] "Generic (PLEG): container finished" podID="f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3" containerID="77ed7b5a7061a2ebfdd4d67fb8b46b19d61fbd32ae2805f15653d8604fd64c00" exitCode=0 Oct 12 07:12:57 crc kubenswrapper[4930]: I1012 07:12:57.221908 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7szpd" event={"ID":"f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3","Type":"ContainerDied","Data":"77ed7b5a7061a2ebfdd4d67fb8b46b19d61fbd32ae2805f15653d8604fd64c00"} Oct 12 07:12:57 crc kubenswrapper[4930]: I1012 07:12:57.221937 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7szpd" event={"ID":"f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3","Type":"ContainerDied","Data":"ea6337b5831b7043f344ebed7e98c270da513d332026d576f2de346cde2b0610"} Oct 12 07:12:57 crc kubenswrapper[4930]: I1012 07:12:57.221959 4930 scope.go:117] "RemoveContainer" containerID="77ed7b5a7061a2ebfdd4d67fb8b46b19d61fbd32ae2805f15653d8604fd64c00" Oct 12 07:12:57 crc kubenswrapper[4930]: I1012 07:12:57.222005 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7szpd" Oct 12 07:12:57 crc kubenswrapper[4930]: I1012 07:12:57.253044 4930 scope.go:117] "RemoveContainer" containerID="45b95ea15c160281b8f97dc972900c17b01d56c6be87d4775fe56be6aa73296e" Oct 12 07:12:57 crc kubenswrapper[4930]: I1012 07:12:57.274851 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7szpd"] Oct 12 07:12:57 crc kubenswrapper[4930]: I1012 07:12:57.284524 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7szpd"] Oct 12 07:12:57 crc kubenswrapper[4930]: I1012 07:12:57.287472 4930 scope.go:117] "RemoveContainer" containerID="4cb98227db959ecd0dd2121df3297ed50b28c699df474010d09e0286d9d27924" Oct 12 07:12:57 crc kubenswrapper[4930]: I1012 07:12:57.331548 4930 scope.go:117] "RemoveContainer" containerID="77ed7b5a7061a2ebfdd4d67fb8b46b19d61fbd32ae2805f15653d8604fd64c00" Oct 12 07:12:57 crc kubenswrapper[4930]: E1012 07:12:57.331969 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77ed7b5a7061a2ebfdd4d67fb8b46b19d61fbd32ae2805f15653d8604fd64c00\": container with ID starting with 77ed7b5a7061a2ebfdd4d67fb8b46b19d61fbd32ae2805f15653d8604fd64c00 not found: ID does not exist" containerID="77ed7b5a7061a2ebfdd4d67fb8b46b19d61fbd32ae2805f15653d8604fd64c00" Oct 12 07:12:57 crc kubenswrapper[4930]: I1012 07:12:57.332023 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77ed7b5a7061a2ebfdd4d67fb8b46b19d61fbd32ae2805f15653d8604fd64c00"} err="failed to get container status \"77ed7b5a7061a2ebfdd4d67fb8b46b19d61fbd32ae2805f15653d8604fd64c00\": rpc error: code = NotFound desc = could not find container \"77ed7b5a7061a2ebfdd4d67fb8b46b19d61fbd32ae2805f15653d8604fd64c00\": container with ID starting with 77ed7b5a7061a2ebfdd4d67fb8b46b19d61fbd32ae2805f15653d8604fd64c00 not found: ID does not exist" Oct 12 07:12:57 crc kubenswrapper[4930]: I1012 07:12:57.332058 4930 scope.go:117] "RemoveContainer" containerID="45b95ea15c160281b8f97dc972900c17b01d56c6be87d4775fe56be6aa73296e" Oct 12 07:12:57 crc kubenswrapper[4930]: E1012 07:12:57.332567 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45b95ea15c160281b8f97dc972900c17b01d56c6be87d4775fe56be6aa73296e\": container with ID starting with 45b95ea15c160281b8f97dc972900c17b01d56c6be87d4775fe56be6aa73296e not found: ID does not exist" containerID="45b95ea15c160281b8f97dc972900c17b01d56c6be87d4775fe56be6aa73296e" Oct 12 07:12:57 crc kubenswrapper[4930]: I1012 07:12:57.332612 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45b95ea15c160281b8f97dc972900c17b01d56c6be87d4775fe56be6aa73296e"} err="failed to get container status \"45b95ea15c160281b8f97dc972900c17b01d56c6be87d4775fe56be6aa73296e\": rpc error: code = NotFound desc = could not find container \"45b95ea15c160281b8f97dc972900c17b01d56c6be87d4775fe56be6aa73296e\": container with ID starting with 45b95ea15c160281b8f97dc972900c17b01d56c6be87d4775fe56be6aa73296e not found: ID does not exist" Oct 12 07:12:57 crc kubenswrapper[4930]: I1012 07:12:57.332636 4930 scope.go:117] "RemoveContainer" containerID="4cb98227db959ecd0dd2121df3297ed50b28c699df474010d09e0286d9d27924" Oct 12 07:12:57 crc kubenswrapper[4930]: E1012 07:12:57.333716 4930 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"4cb98227db959ecd0dd2121df3297ed50b28c699df474010d09e0286d9d27924\": container with ID starting with 4cb98227db959ecd0dd2121df3297ed50b28c699df474010d09e0286d9d27924 not found: ID does not exist" containerID="4cb98227db959ecd0dd2121df3297ed50b28c699df474010d09e0286d9d27924" Oct 12 07:12:57 crc kubenswrapper[4930]: I1012 07:12:57.333778 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cb98227db959ecd0dd2121df3297ed50b28c699df474010d09e0286d9d27924"} err="failed to get container status \"4cb98227db959ecd0dd2121df3297ed50b28c699df474010d09e0286d9d27924\": rpc error: code = NotFound desc = could not find container \"4cb98227db959ecd0dd2121df3297ed50b28c699df474010d09e0286d9d27924\": container with ID starting with 4cb98227db959ecd0dd2121df3297ed50b28c699df474010d09e0286d9d27924 not found: ID does not exist" Oct 12 07:12:58 crc kubenswrapper[4930]: I1012 07:12:58.145800 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3" path="/var/lib/kubelet/pods/f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3/volumes" Oct 12 07:13:04 crc kubenswrapper[4930]: I1012 07:13:04.605194 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-658bdf4b74-vf2tl_9e6cd80c-4aa5-40de-81fc-10d0329f5481/kube-rbac-proxy/0.log" Oct 12 07:13:04 crc kubenswrapper[4930]: I1012 07:13:04.627345 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-658bdf4b74-vf2tl_9e6cd80c-4aa5-40de-81fc-10d0329f5481/manager/0.log" Oct 12 07:13:05 crc kubenswrapper[4930]: I1012 07:13:05.436262 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87_84bb56e1-6ddf-439a-9889-f2e1973ad8fa/util/0.log" Oct 12 07:13:05 crc kubenswrapper[4930]: I1012 07:13:05.665870 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87_84bb56e1-6ddf-439a-9889-f2e1973ad8fa/pull/0.log" Oct 12 07:13:05 crc kubenswrapper[4930]: I1012 07:13:05.688021 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87_84bb56e1-6ddf-439a-9889-f2e1973ad8fa/util/0.log" Oct 12 07:13:05 crc kubenswrapper[4930]: I1012 07:13:05.704705 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87_84bb56e1-6ddf-439a-9889-f2e1973ad8fa/pull/0.log" Oct 12 07:13:05 crc kubenswrapper[4930]: I1012 07:13:05.794431 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87_84bb56e1-6ddf-439a-9889-f2e1973ad8fa/util/0.log" Oct 12 07:13:05 crc kubenswrapper[4930]: I1012 07:13:05.846719 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87_84bb56e1-6ddf-439a-9889-f2e1973ad8fa/pull/0.log" Oct 12 07:13:05 crc kubenswrapper[4930]: I1012 07:13:05.879092 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87_84bb56e1-6ddf-439a-9889-f2e1973ad8fa/extract/0.log" Oct 12 07:13:05 crc 
Oct 12 07:13:06 crc kubenswrapper[4930]: I1012 07:13:06.057772 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7b7fb68549-2p7m6_5a7f54b7-1891-4e3a-a768-e937269bd384/manager/0.log"
Oct 12 07:13:06 crc kubenswrapper[4930]: I1012 07:13:06.126412 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-85d5d9dd78-8vdcg_f883b7e0-65a4-4cea-9d02-7cdedfaf9ec2/kube-rbac-proxy/0.log"
Oct 12 07:13:06 crc kubenswrapper[4930]: I1012 07:13:06.137537 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-85d5d9dd78-8vdcg_f883b7e0-65a4-4cea-9d02-7cdedfaf9ec2/manager/0.log"
Oct 12 07:13:06 crc kubenswrapper[4930]: I1012 07:13:06.313869 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84b9b84486-ckctw_93426c54-3448-421e-aa85-b03c466c7bf8/kube-rbac-proxy/0.log"
Oct 12 07:13:06 crc kubenswrapper[4930]: I1012 07:13:06.382710 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84b9b84486-ckctw_93426c54-3448-421e-aa85-b03c466c7bf8/manager/0.log"
Oct 12 07:13:06 crc kubenswrapper[4930]: I1012 07:13:06.464020 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-858f76bbdd-2fq48_25276148-1b95-4b4d-9f18-ef97020632a7/kube-rbac-proxy/0.log"
Oct 12 07:13:06 crc kubenswrapper[4930]: I1012 07:13:06.547018 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-858f76bbdd-2fq48_25276148-1b95-4b4d-9f18-ef97020632a7/manager/0.log"
Oct 12 07:13:06 crc kubenswrapper[4930]: I1012 07:13:06.583902 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7ffbcb7588-zh959_ee89a0b0-868b-4b2e-a274-c5a4ee40a872/kube-rbac-proxy/0.log"
Oct 12 07:13:06 crc kubenswrapper[4930]: I1012 07:13:06.667329 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7ffbcb7588-zh959_ee89a0b0-868b-4b2e-a274-c5a4ee40a872/manager/0.log"
Oct 12 07:13:06 crc kubenswrapper[4930]: I1012 07:13:06.766498 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-656bcbd775-wgp8z_df7a25ba-c240-4d05-a117-0040e24bb33c/kube-rbac-proxy/0.log"
Oct 12 07:13:06 crc kubenswrapper[4930]: I1012 07:13:06.952527 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9c5c78d49-ksz2s_adf9f01b-70b6-46b9-acde-c1eedc16f299/kube-rbac-proxy/0.log"
Oct 12 07:13:07 crc kubenswrapper[4930]: I1012 07:13:07.013662 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-656bcbd775-wgp8z_df7a25ba-c240-4d05-a117-0040e24bb33c/manager/0.log"
Oct 12 07:13:07 crc kubenswrapper[4930]: I1012 07:13:07.014233 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9c5c78d49-ksz2s_adf9f01b-70b6-46b9-acde-c1eedc16f299/manager/0.log"
Oct 12 07:13:07 crc kubenswrapper[4930]: I1012 07:13:07.301241 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55b6b7c7b8-wk6qh_fe5f36d2-82b4-4bce-a189-7844dae5dc0e/kube-rbac-proxy/0.log"
Oct 12 07:13:07 crc kubenswrapper[4930]: I1012 07:13:07.392633 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55b6b7c7b8-wk6qh_fe5f36d2-82b4-4bce-a189-7844dae5dc0e/manager/0.log"
Oct 12 07:13:07 crc kubenswrapper[4930]: I1012 07:13:07.509719 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5f67fbc655-ql78t_fe819f44-6224-4b45-a33c-6b6ef8e73b92/kube-rbac-proxy/0.log"
Oct 12 07:13:07 crc kubenswrapper[4930]: I1012 07:13:07.519078 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5f67fbc655-ql78t_fe819f44-6224-4b45-a33c-6b6ef8e73b92/manager/0.log"
Oct 12 07:13:07 crc kubenswrapper[4930]: I1012 07:13:07.607539 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f9fb45f8f-qcc9w_37d1af03-8709-4b4a-8d4c-bda1dbefff59/kube-rbac-proxy/0.log"
Oct 12 07:13:07 crc kubenswrapper[4930]: I1012 07:13:07.709599 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f9fb45f8f-qcc9w_37d1af03-8709-4b4a-8d4c-bda1dbefff59/manager/0.log"
Oct 12 07:13:07 crc kubenswrapper[4930]: I1012 07:13:07.783716 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-79d585cb66-2brdt_7134f9eb-cfa6-41b8-a245-2f1b17669ca4/manager/0.log"
Oct 12 07:13:07 crc kubenswrapper[4930]: I1012 07:13:07.786568 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-79d585cb66-2brdt_7134f9eb-cfa6-41b8-a245-2f1b17669ca4/kube-rbac-proxy/0.log"
Oct 12 07:13:07 crc kubenswrapper[4930]: I1012 07:13:07.853717 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5df598886f-hx224_510d5f0a-5f67-4171-99e6-1de6734e7bdf/kube-rbac-proxy/0.log"
Oct 12 07:13:07 crc kubenswrapper[4930]: I1012 07:13:07.954138 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69fdcfc5f5-vn5dq_5383069b-8f72-4173-97bf-34ffc36c235e/kube-rbac-proxy/0.log"
Oct 12 07:13:07 crc kubenswrapper[4930]: I1012 07:13:07.959908 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5df598886f-hx224_510d5f0a-5f67-4171-99e6-1de6734e7bdf/manager/0.log"
Oct 12 07:13:08 crc kubenswrapper[4930]: I1012 07:13:08.020422 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69fdcfc5f5-vn5dq_5383069b-8f72-4173-97bf-34ffc36c235e/manager/0.log"
Oct 12 07:13:08 crc kubenswrapper[4930]: I1012 07:13:08.122671 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5956dffb7bwrr99_20b56dea-8d10-4b11-b437-fc38320417c9/kube-rbac-proxy/0.log"
Oct 12 07:13:08 crc kubenswrapper[4930]: I1012 07:13:08.132500 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5956dffb7bwrr99_20b56dea-8d10-4b11-b437-fc38320417c9/manager/0.log"
Oct 12 07:13:08 crc kubenswrapper[4930]: I1012 07:13:08.225637 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b95c8954b-46gk6_f89b4da4-a74f-4f12-b056-05f201bedabd/kube-rbac-proxy/0.log"
Oct 12 07:13:08 crc kubenswrapper[4930]: I1012 07:13:08.316576 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-688d597459-t5cps_d0de84bb-9b0d-4552-887d-1da3d50467e2/kube-rbac-proxy/0.log"
Oct 12 07:13:08 crc kubenswrapper[4930]: I1012 07:13:08.538792 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-688d597459-t5cps_d0de84bb-9b0d-4552-887d-1da3d50467e2/operator/0.log"
Oct 12 07:13:08 crc kubenswrapper[4930]: I1012 07:13:08.628309 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-6prjm_2214be78-67f7-4965-b3a0-8c1401ff658c/registry-server/0.log"
Oct 12 07:13:08 crc kubenswrapper[4930]: I1012 07:13:08.675320 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-79df5fb58c-2hx92_8ac93070-497c-48ca-a58a-fb47657e6c2a/kube-rbac-proxy/0.log"
Oct 12 07:13:08 crc kubenswrapper[4930]: I1012 07:13:08.842970 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-79df5fb58c-2hx92_8ac93070-497c-48ca-a58a-fb47657e6c2a/manager/0.log"
Oct 12 07:13:08 crc kubenswrapper[4930]: I1012 07:13:08.878637 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-68b6c87b68-l2x8t_97b9da7d-7c47-46b5-ac4a-f190a92dceef/kube-rbac-proxy/0.log"
Oct 12 07:13:08 crc kubenswrapper[4930]: I1012 07:13:08.981293 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-68b6c87b68-l2x8t_97b9da7d-7c47-46b5-ac4a-f190a92dceef/manager/0.log"
Oct 12 07:13:09 crc kubenswrapper[4930]: I1012 07:13:09.103244 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-5vrjw_7063a11c-2d85-47f0-85ca-61cf4949e10d/operator/0.log"
Oct 12 07:13:09 crc kubenswrapper[4930]: I1012 07:13:09.233375 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-db6d7f97b-862f6_27d33d74-f1d1-4208-aead-8f6091c524df/kube-rbac-proxy/0.log"
Oct 12 07:13:09 crc kubenswrapper[4930]: I1012 07:13:09.279507 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-db6d7f97b-862f6_27d33d74-f1d1-4208-aead-8f6091c524df/manager/0.log"
Oct 12 07:13:09 crc kubenswrapper[4930]: I1012 07:13:09.358635 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-67cfc6749b-wk5xm_89d21577-9d93-4003-bae1-3b66e679eeeb/kube-rbac-proxy/0.log"
Oct 12 07:13:09 crc kubenswrapper[4930]: I1012 07:13:09.478923 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b95c8954b-46gk6_f89b4da4-a74f-4f12-b056-05f201bedabd/manager/0.log"
Oct 12 07:13:09 crc kubenswrapper[4930]: I1012 07:13:09.532003 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5458f77c4-52242_18286f44-9f6c-4699-9d4d-afa069c980ed/kube-rbac-proxy/0.log"
"Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5458f77c4-52242_18286f44-9f6c-4699-9d4d-afa069c980ed/kube-rbac-proxy/0.log" Oct 12 07:13:09 crc kubenswrapper[4930]: I1012 07:13:09.592859 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5458f77c4-52242_18286f44-9f6c-4699-9d4d-afa069c980ed/manager/0.log" Oct 12 07:13:09 crc kubenswrapper[4930]: I1012 07:13:09.648488 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-67cfc6749b-wk5xm_89d21577-9d93-4003-bae1-3b66e679eeeb/manager/0.log" Oct 12 07:13:09 crc kubenswrapper[4930]: I1012 07:13:09.762771 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7f554bff7b-mkzqf_7498c8de-da98-47fd-8096-58cb4f1c4f87/kube-rbac-proxy/0.log" Oct 12 07:13:09 crc kubenswrapper[4930]: I1012 07:13:09.856514 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7f554bff7b-mkzqf_7498c8de-da98-47fd-8096-58cb4f1c4f87/manager/0.log" Oct 12 07:13:28 crc kubenswrapper[4930]: I1012 07:13:28.128962 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-6r5ch_4af4dbea-ada1-465b-a6d9-24843bf3808d/control-plane-machine-set-operator/0.log" Oct 12 07:13:28 crc kubenswrapper[4930]: I1012 07:13:28.251725 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-zxqcl_54c34701-7d60-4396-9a64-81b91379fbe9/kube-rbac-proxy/0.log" Oct 12 07:13:28 crc kubenswrapper[4930]: I1012 07:13:28.314577 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-zxqcl_54c34701-7d60-4396-9a64-81b91379fbe9/machine-api-operator/0.log" Oct 12 07:13:41 crc kubenswrapper[4930]: I1012 07:13:41.191666 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-wvrgv_e7b9505d-6b6f-4460-9f40-119155e4ba33/cert-manager-controller/0.log" Oct 12 07:13:41 crc kubenswrapper[4930]: I1012 07:13:41.332511 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-v95v2_b488879d-5b6c-438f-8b60-842f84b05028/cert-manager-cainjector/0.log" Oct 12 07:13:41 crc kubenswrapper[4930]: I1012 07:13:41.363653 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-6ptlt_a45f8116-ddf1-4733-bfb2-9fd498a04620/cert-manager-webhook/0.log" Oct 12 07:13:54 crc kubenswrapper[4930]: I1012 07:13:54.948910 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-54mmw_0d4f3b47-0a65-4b2d-837d-9e9e9efab38e/nmstate-console-plugin/0.log" Oct 12 07:13:55 crc kubenswrapper[4930]: I1012 07:13:55.081519 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-mngll_c75f4df3-9537-4a4b-9170-bee951bdb162/nmstate-handler/0.log" Oct 12 07:13:55 crc kubenswrapper[4930]: I1012 07:13:55.141400 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-gtfdw_2cb896f0-cc6a-44bd-ae9e-801659d154f4/kube-rbac-proxy/0.log" Oct 12 07:13:55 crc kubenswrapper[4930]: I1012 07:13:55.169869 4930 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-gtfdw_2cb896f0-cc6a-44bd-ae9e-801659d154f4/nmstate-metrics/0.log" Oct 12 07:13:55 crc kubenswrapper[4930]: I1012 07:13:55.273883 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-gznzl_0920866d-8bd7-49ce-81f7-5aa6b77e9198/nmstate-operator/0.log" Oct 12 07:13:55 crc kubenswrapper[4930]: I1012 07:13:55.340669 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-k92gs_9a8ee773-dcea-477c-98ea-afc18295b1a0/nmstate-webhook/0.log" Oct 12 07:14:11 crc kubenswrapper[4930]: I1012 07:14:11.806206 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-7sgrd_908d6f68-9800-4794-88a0-21cca2bb3691/kube-rbac-proxy/0.log" Oct 12 07:14:11 crc kubenswrapper[4930]: I1012 07:14:11.951222 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/cp-frr-files/0.log" Oct 12 07:14:11 crc kubenswrapper[4930]: I1012 07:14:11.958652 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-7sgrd_908d6f68-9800-4794-88a0-21cca2bb3691/controller/0.log" Oct 12 07:14:12 crc kubenswrapper[4930]: I1012 07:14:12.095964 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/cp-reloader/0.log" Oct 12 07:14:12 crc kubenswrapper[4930]: I1012 07:14:12.120610 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/cp-frr-files/0.log" Oct 12 07:14:12 crc kubenswrapper[4930]: I1012 07:14:12.137071 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/cp-metrics/0.log" Oct 12 07:14:12 crc kubenswrapper[4930]: I1012 07:14:12.163079 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/cp-reloader/0.log" Oct 12 07:14:12 crc kubenswrapper[4930]: I1012 07:14:12.278059 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/cp-frr-files/0.log" Oct 12 07:14:12 crc kubenswrapper[4930]: I1012 07:14:12.310398 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/cp-reloader/0.log" Oct 12 07:14:12 crc kubenswrapper[4930]: I1012 07:14:12.349076 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/cp-metrics/0.log" Oct 12 07:14:12 crc kubenswrapper[4930]: I1012 07:14:12.373405 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/cp-metrics/0.log" Oct 12 07:14:12 crc kubenswrapper[4930]: I1012 07:14:12.544868 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/cp-metrics/0.log" Oct 12 07:14:12 crc kubenswrapper[4930]: I1012 07:14:12.553199 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/cp-reloader/0.log" Oct 12 07:14:12 crc kubenswrapper[4930]: I1012 07:14:12.558431 4930 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/cp-frr-files/0.log" Oct 12 07:14:12 crc kubenswrapper[4930]: I1012 07:14:12.589073 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/controller/0.log" Oct 12 07:14:12 crc kubenswrapper[4930]: I1012 07:14:12.740492 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/frr-metrics/0.log" Oct 12 07:14:12 crc kubenswrapper[4930]: I1012 07:14:12.778496 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/kube-rbac-proxy/0.log" Oct 12 07:14:12 crc kubenswrapper[4930]: I1012 07:14:12.814981 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/kube-rbac-proxy-frr/0.log" Oct 12 07:14:12 crc kubenswrapper[4930]: I1012 07:14:12.924992 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/reloader/0.log" Oct 12 07:14:13 crc kubenswrapper[4930]: I1012 07:14:13.025981 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-b5w72_610c85e1-ab61-4938-a33d-cbed5a873874/frr-k8s-webhook-server/0.log" Oct 12 07:14:13 crc kubenswrapper[4930]: I1012 07:14:13.763621 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7ccf4dfd66-4nwxw_8ccb343f-6987-45c8-9b9f-a2bb32efbe22/manager/0.log" Oct 12 07:14:13 crc kubenswrapper[4930]: I1012 07:14:13.803184 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-688d4f8df8-dh759_726977f0-7624-4ce2-95ed-80f844fcfc81/webhook-server/0.log" Oct 12 07:14:13 crc kubenswrapper[4930]: I1012 07:14:13.970457 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dd89b_61b8c651-74f8-4532-9554-55b01e58c0e6/kube-rbac-proxy/0.log" Oct 12 07:14:14 crc kubenswrapper[4930]: I1012 07:14:14.546985 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dd89b_61b8c651-74f8-4532-9554-55b01e58c0e6/speaker/0.log" Oct 12 07:14:14 crc kubenswrapper[4930]: I1012 07:14:14.629480 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/frr/0.log" Oct 12 07:14:29 crc kubenswrapper[4930]: I1012 07:14:29.409540 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm_e54ad199-e123-435c-a125-b662c577a86b/util/0.log" Oct 12 07:14:29 crc kubenswrapper[4930]: I1012 07:14:29.558783 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm_e54ad199-e123-435c-a125-b662c577a86b/util/0.log" Oct 12 07:14:29 crc kubenswrapper[4930]: I1012 07:14:29.591965 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm_e54ad199-e123-435c-a125-b662c577a86b/pull/0.log" Oct 12 07:14:29 crc kubenswrapper[4930]: I1012 07:14:29.614084 4930 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm_e54ad199-e123-435c-a125-b662c577a86b/pull/0.log" Oct 12 07:14:29 crc kubenswrapper[4930]: I1012 07:14:29.763768 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm_e54ad199-e123-435c-a125-b662c577a86b/extract/0.log" Oct 12 07:14:29 crc kubenswrapper[4930]: I1012 07:14:29.767278 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm_e54ad199-e123-435c-a125-b662c577a86b/pull/0.log" Oct 12 07:14:29 crc kubenswrapper[4930]: I1012 07:14:29.768707 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm_e54ad199-e123-435c-a125-b662c577a86b/util/0.log" Oct 12 07:14:29 crc kubenswrapper[4930]: I1012 07:14:29.952354 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq_f143a61d-685c-4ee5-b95c-f7fae137d0bc/util/0.log" Oct 12 07:14:30 crc kubenswrapper[4930]: I1012 07:14:30.135811 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq_f143a61d-685c-4ee5-b95c-f7fae137d0bc/util/0.log" Oct 12 07:14:30 crc kubenswrapper[4930]: I1012 07:14:30.157144 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq_f143a61d-685c-4ee5-b95c-f7fae137d0bc/pull/0.log" Oct 12 07:14:30 crc kubenswrapper[4930]: I1012 07:14:30.177198 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq_f143a61d-685c-4ee5-b95c-f7fae137d0bc/pull/0.log" Oct 12 07:14:30 crc kubenswrapper[4930]: I1012 07:14:30.373702 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq_f143a61d-685c-4ee5-b95c-f7fae137d0bc/extract/0.log" Oct 12 07:14:30 crc kubenswrapper[4930]: I1012 07:14:30.391517 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq_f143a61d-685c-4ee5-b95c-f7fae137d0bc/util/0.log" Oct 12 07:14:30 crc kubenswrapper[4930]: I1012 07:14:30.392091 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq_f143a61d-685c-4ee5-b95c-f7fae137d0bc/pull/0.log" Oct 12 07:14:30 crc kubenswrapper[4930]: I1012 07:14:30.554080 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jwl4b_051544d2-1a7f-4cb4-8b3e-c6ffeb23732f/extract-utilities/0.log" Oct 12 07:14:30 crc kubenswrapper[4930]: I1012 07:14:30.790084 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jwl4b_051544d2-1a7f-4cb4-8b3e-c6ffeb23732f/extract-utilities/0.log" Oct 12 07:14:30 crc kubenswrapper[4930]: I1012 07:14:30.790648 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jwl4b_051544d2-1a7f-4cb4-8b3e-c6ffeb23732f/extract-content/0.log" Oct 12 07:14:30 crc kubenswrapper[4930]: I1012 07:14:30.793812 
4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jwl4b_051544d2-1a7f-4cb4-8b3e-c6ffeb23732f/extract-content/0.log" Oct 12 07:14:31 crc kubenswrapper[4930]: I1012 07:14:31.037396 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jwl4b_051544d2-1a7f-4cb4-8b3e-c6ffeb23732f/extract-content/0.log" Oct 12 07:14:31 crc kubenswrapper[4930]: I1012 07:14:31.042788 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jwl4b_051544d2-1a7f-4cb4-8b3e-c6ffeb23732f/extract-utilities/0.log" Oct 12 07:14:31 crc kubenswrapper[4930]: I1012 07:14:31.203937 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6d7zn_732507c6-d79c-4b3e-a0fa-23111ec6b02b/extract-utilities/0.log" Oct 12 07:14:31 crc kubenswrapper[4930]: I1012 07:14:31.487053 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6d7zn_732507c6-d79c-4b3e-a0fa-23111ec6b02b/extract-content/0.log" Oct 12 07:14:31 crc kubenswrapper[4930]: I1012 07:14:31.550723 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6d7zn_732507c6-d79c-4b3e-a0fa-23111ec6b02b/extract-utilities/0.log" Oct 12 07:14:31 crc kubenswrapper[4930]: I1012 07:14:31.558538 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6d7zn_732507c6-d79c-4b3e-a0fa-23111ec6b02b/extract-content/0.log" Oct 12 07:14:31 crc kubenswrapper[4930]: I1012 07:14:31.596643 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jwl4b_051544d2-1a7f-4cb4-8b3e-c6ffeb23732f/registry-server/0.log" Oct 12 07:14:31 crc kubenswrapper[4930]: I1012 07:14:31.718422 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6d7zn_732507c6-d79c-4b3e-a0fa-23111ec6b02b/extract-utilities/0.log" Oct 12 07:14:31 crc kubenswrapper[4930]: I1012 07:14:31.731408 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6d7zn_732507c6-d79c-4b3e-a0fa-23111ec6b02b/extract-content/0.log" Oct 12 07:14:31 crc kubenswrapper[4930]: I1012 07:14:31.883292 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk_f643d105-d04a-48e4-b956-27fa864a1542/util/0.log" Oct 12 07:14:32 crc kubenswrapper[4930]: I1012 07:14:32.015860 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk_f643d105-d04a-48e4-b956-27fa864a1542/util/0.log" Oct 12 07:14:32 crc kubenswrapper[4930]: I1012 07:14:32.049015 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk_f643d105-d04a-48e4-b956-27fa864a1542/pull/0.log" Oct 12 07:14:32 crc kubenswrapper[4930]: I1012 07:14:32.049834 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk_f643d105-d04a-48e4-b956-27fa864a1542/pull/0.log" Oct 12 07:14:32 crc kubenswrapper[4930]: I1012 07:14:32.268780 4930 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk_f643d105-d04a-48e4-b956-27fa864a1542/pull/0.log" Oct 12 07:14:32 crc kubenswrapper[4930]: I1012 07:14:32.294381 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk_f643d105-d04a-48e4-b956-27fa864a1542/util/0.log" Oct 12 07:14:32 crc kubenswrapper[4930]: I1012 07:14:32.373888 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk_f643d105-d04a-48e4-b956-27fa864a1542/extract/0.log" Oct 12 07:14:32 crc kubenswrapper[4930]: I1012 07:14:32.411060 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6d7zn_732507c6-d79c-4b3e-a0fa-23111ec6b02b/registry-server/0.log" Oct 12 07:14:32 crc kubenswrapper[4930]: I1012 07:14:32.512911 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-c9wvx_00ab86fc-c038-4dc6-aaf5-6eac7c953d24/marketplace-operator/0.log" Oct 12 07:14:32 crc kubenswrapper[4930]: I1012 07:14:32.530584 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m4lsn_0b85acba-bed0-4e85-8252-3972368d611c/extract-utilities/0.log" Oct 12 07:14:32 crc kubenswrapper[4930]: I1012 07:14:32.719764 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m4lsn_0b85acba-bed0-4e85-8252-3972368d611c/extract-utilities/0.log" Oct 12 07:14:32 crc kubenswrapper[4930]: I1012 07:14:32.727556 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m4lsn_0b85acba-bed0-4e85-8252-3972368d611c/extract-content/0.log" Oct 12 07:14:32 crc kubenswrapper[4930]: I1012 07:14:32.738828 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m4lsn_0b85acba-bed0-4e85-8252-3972368d611c/extract-content/0.log" Oct 12 07:14:32 crc kubenswrapper[4930]: I1012 07:14:32.939627 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m4lsn_0b85acba-bed0-4e85-8252-3972368d611c/extract-content/0.log" Oct 12 07:14:32 crc kubenswrapper[4930]: I1012 07:14:32.946266 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m4lsn_0b85acba-bed0-4e85-8252-3972368d611c/extract-utilities/0.log" Oct 12 07:14:33 crc kubenswrapper[4930]: I1012 07:14:33.007399 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzfd6_d98b1ef4-3cf1-4432-8b03-0c9d7118248d/extract-utilities/0.log" Oct 12 07:14:33 crc kubenswrapper[4930]: I1012 07:14:33.077238 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m4lsn_0b85acba-bed0-4e85-8252-3972368d611c/registry-server/0.log" Oct 12 07:14:33 crc kubenswrapper[4930]: I1012 07:14:33.322270 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzfd6_d98b1ef4-3cf1-4432-8b03-0c9d7118248d/extract-utilities/0.log" Oct 12 07:14:33 crc kubenswrapper[4930]: I1012 07:14:33.347027 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzfd6_d98b1ef4-3cf1-4432-8b03-0c9d7118248d/extract-content/0.log" Oct 12 07:14:33 crc kubenswrapper[4930]: 
I1012 07:14:33.360320 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzfd6_d98b1ef4-3cf1-4432-8b03-0c9d7118248d/extract-content/0.log" Oct 12 07:14:33 crc kubenswrapper[4930]: I1012 07:14:33.482888 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzfd6_d98b1ef4-3cf1-4432-8b03-0c9d7118248d/extract-utilities/0.log" Oct 12 07:14:33 crc kubenswrapper[4930]: I1012 07:14:33.505904 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzfd6_d98b1ef4-3cf1-4432-8b03-0c9d7118248d/extract-content/0.log" Oct 12 07:14:34 crc kubenswrapper[4930]: I1012 07:14:34.298922 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzfd6_d98b1ef4-3cf1-4432-8b03-0c9d7118248d/registry-server/0.log" Oct 12 07:14:48 crc kubenswrapper[4930]: I1012 07:14:48.967380 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-v4gns_36fc8dbb-9393-4ad2-a475-7933483eef61/prometheus-operator/0.log" Oct 12 07:14:49 crc kubenswrapper[4930]: I1012 07:14:49.116970 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-78c9d5c958-576bf_1373019b-d435-40a9-8551-11fb23298b48/prometheus-operator-admission-webhook/0.log" Oct 12 07:14:49 crc kubenswrapper[4930]: I1012 07:14:49.130642 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-78c9d5c958-fbq82_828b5980-9511-4284-a5f4-4197242fef19/prometheus-operator-admission-webhook/0.log" Oct 12 07:14:49 crc kubenswrapper[4930]: I1012 07:14:49.292569 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-tfp9w_cd5c9b39-afff-488d-9bc7-875c644a6975/operator/0.log" Oct 12 07:14:49 crc kubenswrapper[4930]: I1012 07:14:49.351522 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-sxjbx_71965cf6-b3c6-4e30-8771-eaad927fcc46/perses-operator/0.log" Oct 12 07:14:59 crc kubenswrapper[4930]: I1012 07:14:59.432764 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pfvdm"] Oct 12 07:14:59 crc kubenswrapper[4930]: E1012 07:14:59.433795 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3" containerName="extract-utilities" Oct 12 07:14:59 crc kubenswrapper[4930]: I1012 07:14:59.433812 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3" containerName="extract-utilities" Oct 12 07:14:59 crc kubenswrapper[4930]: E1012 07:14:59.433833 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3" containerName="extract-content" Oct 12 07:14:59 crc kubenswrapper[4930]: I1012 07:14:59.433842 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3" containerName="extract-content" Oct 12 07:14:59 crc kubenswrapper[4930]: E1012 07:14:59.433870 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3" containerName="registry-server" Oct 12 07:14:59 crc kubenswrapper[4930]: I1012 07:14:59.433878 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3" 
containerName="registry-server" Oct 12 07:14:59 crc kubenswrapper[4930]: I1012 07:14:59.434193 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="f08cba15-5f4a-4a0c-acc0-d9b88c9c9ba3" containerName="registry-server" Oct 12 07:14:59 crc kubenswrapper[4930]: I1012 07:14:59.436326 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pfvdm" Oct 12 07:14:59 crc kubenswrapper[4930]: I1012 07:14:59.448091 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pfvdm"] Oct 12 07:14:59 crc kubenswrapper[4930]: I1012 07:14:59.570806 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59d13334-bab2-4a08-9aa5-1835b1993192-utilities\") pod \"certified-operators-pfvdm\" (UID: \"59d13334-bab2-4a08-9aa5-1835b1993192\") " pod="openshift-marketplace/certified-operators-pfvdm" Oct 12 07:14:59 crc kubenswrapper[4930]: I1012 07:14:59.570914 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj92c\" (UniqueName: \"kubernetes.io/projected/59d13334-bab2-4a08-9aa5-1835b1993192-kube-api-access-jj92c\") pod \"certified-operators-pfvdm\" (UID: \"59d13334-bab2-4a08-9aa5-1835b1993192\") " pod="openshift-marketplace/certified-operators-pfvdm" Oct 12 07:14:59 crc kubenswrapper[4930]: I1012 07:14:59.570990 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59d13334-bab2-4a08-9aa5-1835b1993192-catalog-content\") pod \"certified-operators-pfvdm\" (UID: \"59d13334-bab2-4a08-9aa5-1835b1993192\") " pod="openshift-marketplace/certified-operators-pfvdm" Oct 12 07:14:59 crc kubenswrapper[4930]: I1012 07:14:59.672685 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj92c\" (UniqueName: \"kubernetes.io/projected/59d13334-bab2-4a08-9aa5-1835b1993192-kube-api-access-jj92c\") pod \"certified-operators-pfvdm\" (UID: \"59d13334-bab2-4a08-9aa5-1835b1993192\") " pod="openshift-marketplace/certified-operators-pfvdm" Oct 12 07:14:59 crc kubenswrapper[4930]: I1012 07:14:59.673038 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59d13334-bab2-4a08-9aa5-1835b1993192-catalog-content\") pod \"certified-operators-pfvdm\" (UID: \"59d13334-bab2-4a08-9aa5-1835b1993192\") " pod="openshift-marketplace/certified-operators-pfvdm" Oct 12 07:14:59 crc kubenswrapper[4930]: I1012 07:14:59.673113 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59d13334-bab2-4a08-9aa5-1835b1993192-utilities\") pod \"certified-operators-pfvdm\" (UID: \"59d13334-bab2-4a08-9aa5-1835b1993192\") " pod="openshift-marketplace/certified-operators-pfvdm" Oct 12 07:14:59 crc kubenswrapper[4930]: I1012 07:14:59.673564 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59d13334-bab2-4a08-9aa5-1835b1993192-utilities\") pod \"certified-operators-pfvdm\" (UID: \"59d13334-bab2-4a08-9aa5-1835b1993192\") " pod="openshift-marketplace/certified-operators-pfvdm" Oct 12 07:14:59 crc kubenswrapper[4930]: I1012 07:14:59.673795 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59d13334-bab2-4a08-9aa5-1835b1993192-catalog-content\") pod \"certified-operators-pfvdm\" (UID: \"59d13334-bab2-4a08-9aa5-1835b1993192\") " pod="openshift-marketplace/certified-operators-pfvdm" Oct 12 07:14:59 crc kubenswrapper[4930]: I1012 07:14:59.712213 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj92c\" (UniqueName: \"kubernetes.io/projected/59d13334-bab2-4a08-9aa5-1835b1993192-kube-api-access-jj92c\") pod \"certified-operators-pfvdm\" (UID: \"59d13334-bab2-4a08-9aa5-1835b1993192\") " pod="openshift-marketplace/certified-operators-pfvdm" Oct 12 07:14:59 crc kubenswrapper[4930]: I1012 07:14:59.766969 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pfvdm" Oct 12 07:15:00 crc kubenswrapper[4930]: I1012 07:15:00.155796 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337555-pjcrg"] Oct 12 07:15:00 crc kubenswrapper[4930]: I1012 07:15:00.157831 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337555-pjcrg" Oct 12 07:15:00 crc kubenswrapper[4930]: I1012 07:15:00.160412 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 12 07:15:00 crc kubenswrapper[4930]: I1012 07:15:00.161087 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 12 07:15:00 crc kubenswrapper[4930]: I1012 07:15:00.170089 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337555-pjcrg"] Oct 12 07:15:00 crc kubenswrapper[4930]: I1012 07:15:00.185380 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxc9s\" (UniqueName: \"kubernetes.io/projected/f8c54212-a0af-43dd-890a-0b1fb5c2a8a4-kube-api-access-lxc9s\") pod \"collect-profiles-29337555-pjcrg\" (UID: \"f8c54212-a0af-43dd-890a-0b1fb5c2a8a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337555-pjcrg" Oct 12 07:15:00 crc kubenswrapper[4930]: I1012 07:15:00.185512 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8c54212-a0af-43dd-890a-0b1fb5c2a8a4-config-volume\") pod \"collect-profiles-29337555-pjcrg\" (UID: \"f8c54212-a0af-43dd-890a-0b1fb5c2a8a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337555-pjcrg" Oct 12 07:15:00 crc kubenswrapper[4930]: I1012 07:15:00.185633 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8c54212-a0af-43dd-890a-0b1fb5c2a8a4-secret-volume\") pod \"collect-profiles-29337555-pjcrg\" (UID: \"f8c54212-a0af-43dd-890a-0b1fb5c2a8a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337555-pjcrg" Oct 12 07:15:00 crc kubenswrapper[4930]: I1012 07:15:00.287535 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8c54212-a0af-43dd-890a-0b1fb5c2a8a4-config-volume\") pod \"collect-profiles-29337555-pjcrg\" (UID: \"f8c54212-a0af-43dd-890a-0b1fb5c2a8a4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29337555-pjcrg" Oct 12 07:15:00 crc kubenswrapper[4930]: I1012 07:15:00.287635 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8c54212-a0af-43dd-890a-0b1fb5c2a8a4-secret-volume\") pod \"collect-profiles-29337555-pjcrg\" (UID: \"f8c54212-a0af-43dd-890a-0b1fb5c2a8a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337555-pjcrg" Oct 12 07:15:00 crc kubenswrapper[4930]: I1012 07:15:00.287687 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxc9s\" (UniqueName: \"kubernetes.io/projected/f8c54212-a0af-43dd-890a-0b1fb5c2a8a4-kube-api-access-lxc9s\") pod \"collect-profiles-29337555-pjcrg\" (UID: \"f8c54212-a0af-43dd-890a-0b1fb5c2a8a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337555-pjcrg" Oct 12 07:15:00 crc kubenswrapper[4930]: I1012 07:15:00.288753 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8c54212-a0af-43dd-890a-0b1fb5c2a8a4-config-volume\") pod \"collect-profiles-29337555-pjcrg\" (UID: \"f8c54212-a0af-43dd-890a-0b1fb5c2a8a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337555-pjcrg" Oct 12 07:15:00 crc kubenswrapper[4930]: I1012 07:15:00.304078 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8c54212-a0af-43dd-890a-0b1fb5c2a8a4-secret-volume\") pod \"collect-profiles-29337555-pjcrg\" (UID: \"f8c54212-a0af-43dd-890a-0b1fb5c2a8a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337555-pjcrg" Oct 12 07:15:00 crc kubenswrapper[4930]: I1012 07:15:00.310877 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxc9s\" (UniqueName: \"kubernetes.io/projected/f8c54212-a0af-43dd-890a-0b1fb5c2a8a4-kube-api-access-lxc9s\") pod \"collect-profiles-29337555-pjcrg\" (UID: \"f8c54212-a0af-43dd-890a-0b1fb5c2a8a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29337555-pjcrg" Oct 12 07:15:00 crc kubenswrapper[4930]: I1012 07:15:00.336310 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pfvdm"] Oct 12 07:15:00 crc kubenswrapper[4930]: I1012 07:15:00.478118 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337555-pjcrg" Oct 12 07:15:00 crc kubenswrapper[4930]: I1012 07:15:00.631383 4930 generic.go:334] "Generic (PLEG): container finished" podID="59d13334-bab2-4a08-9aa5-1835b1993192" containerID="d3f1e122d7c2a5375243c65981b42d1a15f9a57c1ccb353c7c440ed04a83385e" exitCode=0 Oct 12 07:15:00 crc kubenswrapper[4930]: I1012 07:15:00.631427 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pfvdm" event={"ID":"59d13334-bab2-4a08-9aa5-1835b1993192","Type":"ContainerDied","Data":"d3f1e122d7c2a5375243c65981b42d1a15f9a57c1ccb353c7c440ed04a83385e"} Oct 12 07:15:00 crc kubenswrapper[4930]: I1012 07:15:00.631456 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pfvdm" event={"ID":"59d13334-bab2-4a08-9aa5-1835b1993192","Type":"ContainerStarted","Data":"6f2cf6edf8861604d84efca7f2cb1761f2bd90b8cb9750f1a7670b807e27c27f"} Oct 12 07:15:00 crc kubenswrapper[4930]: I1012 07:15:00.977350 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337555-pjcrg"] Oct 12 07:15:01 crc kubenswrapper[4930]: I1012 07:15:01.640462 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337555-pjcrg" event={"ID":"f8c54212-a0af-43dd-890a-0b1fb5c2a8a4","Type":"ContainerStarted","Data":"e9a95da0e41f6cc39dd16a86d92b14852e01f247ae6ad5eb1bff54990f665b8e"} Oct 12 07:15:02 crc kubenswrapper[4930]: I1012 07:15:02.654627 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pfvdm" event={"ID":"59d13334-bab2-4a08-9aa5-1835b1993192","Type":"ContainerStarted","Data":"23acea8863aeb87398fac2ba8b368643ab8f4adbe90bf9f31e51d4102e26daff"} Oct 12 07:15:02 crc kubenswrapper[4930]: I1012 07:15:02.656594 4930 generic.go:334] "Generic (PLEG): container finished" podID="f8c54212-a0af-43dd-890a-0b1fb5c2a8a4" containerID="3c30c16db57dd438379a7b219f04793f71eafdb730a928e5419cbc0b8d58863a" exitCode=0 Oct 12 07:15:02 crc kubenswrapper[4930]: I1012 07:15:02.656660 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337555-pjcrg" event={"ID":"f8c54212-a0af-43dd-890a-0b1fb5c2a8a4","Type":"ContainerDied","Data":"3c30c16db57dd438379a7b219f04793f71eafdb730a928e5419cbc0b8d58863a"} Oct 12 07:15:03 crc kubenswrapper[4930]: I1012 07:15:03.666699 4930 generic.go:334] "Generic (PLEG): container finished" podID="59d13334-bab2-4a08-9aa5-1835b1993192" containerID="23acea8863aeb87398fac2ba8b368643ab8f4adbe90bf9f31e51d4102e26daff" exitCode=0 Oct 12 07:15:03 crc kubenswrapper[4930]: I1012 07:15:03.666770 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pfvdm" event={"ID":"59d13334-bab2-4a08-9aa5-1835b1993192","Type":"ContainerDied","Data":"23acea8863aeb87398fac2ba8b368643ab8f4adbe90bf9f31e51d4102e26daff"} Oct 12 07:15:03 crc kubenswrapper[4930]: I1012 07:15:03.668954 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:15:03 crc kubenswrapper[4930]: I1012 07:15:03.668996 4930 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:15:04 crc kubenswrapper[4930]: I1012 07:15:04.032042 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337555-pjcrg" Oct 12 07:15:04 crc kubenswrapper[4930]: I1012 07:15:04.161729 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxc9s\" (UniqueName: \"kubernetes.io/projected/f8c54212-a0af-43dd-890a-0b1fb5c2a8a4-kube-api-access-lxc9s\") pod \"f8c54212-a0af-43dd-890a-0b1fb5c2a8a4\" (UID: \"f8c54212-a0af-43dd-890a-0b1fb5c2a8a4\") " Oct 12 07:15:04 crc kubenswrapper[4930]: I1012 07:15:04.161816 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8c54212-a0af-43dd-890a-0b1fb5c2a8a4-secret-volume\") pod \"f8c54212-a0af-43dd-890a-0b1fb5c2a8a4\" (UID: \"f8c54212-a0af-43dd-890a-0b1fb5c2a8a4\") " Oct 12 07:15:04 crc kubenswrapper[4930]: I1012 07:15:04.162166 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8c54212-a0af-43dd-890a-0b1fb5c2a8a4-config-volume\") pod \"f8c54212-a0af-43dd-890a-0b1fb5c2a8a4\" (UID: \"f8c54212-a0af-43dd-890a-0b1fb5c2a8a4\") " Oct 12 07:15:04 crc kubenswrapper[4930]: I1012 07:15:04.165344 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8c54212-a0af-43dd-890a-0b1fb5c2a8a4-config-volume" (OuterVolumeSpecName: "config-volume") pod "f8c54212-a0af-43dd-890a-0b1fb5c2a8a4" (UID: "f8c54212-a0af-43dd-890a-0b1fb5c2a8a4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 12 07:15:04 crc kubenswrapper[4930]: I1012 07:15:04.171978 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8c54212-a0af-43dd-890a-0b1fb5c2a8a4-kube-api-access-lxc9s" (OuterVolumeSpecName: "kube-api-access-lxc9s") pod "f8c54212-a0af-43dd-890a-0b1fb5c2a8a4" (UID: "f8c54212-a0af-43dd-890a-0b1fb5c2a8a4"). InnerVolumeSpecName "kube-api-access-lxc9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:15:04 crc kubenswrapper[4930]: I1012 07:15:04.172075 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8c54212-a0af-43dd-890a-0b1fb5c2a8a4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f8c54212-a0af-43dd-890a-0b1fb5c2a8a4" (UID: "f8c54212-a0af-43dd-890a-0b1fb5c2a8a4"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 12 07:15:04 crc kubenswrapper[4930]: I1012 07:15:04.265088 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxc9s\" (UniqueName: \"kubernetes.io/projected/f8c54212-a0af-43dd-890a-0b1fb5c2a8a4-kube-api-access-lxc9s\") on node \"crc\" DevicePath \"\"" Oct 12 07:15:04 crc kubenswrapper[4930]: I1012 07:15:04.265126 4930 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8c54212-a0af-43dd-890a-0b1fb5c2a8a4-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 12 07:15:04 crc kubenswrapper[4930]: I1012 07:15:04.265136 4930 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8c54212-a0af-43dd-890a-0b1fb5c2a8a4-config-volume\") on node \"crc\" DevicePath \"\"" Oct 12 07:15:04 crc kubenswrapper[4930]: I1012 07:15:04.683113 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pfvdm" event={"ID":"59d13334-bab2-4a08-9aa5-1835b1993192","Type":"ContainerStarted","Data":"b013507b7386012bc802428e74454ef806ece59306d7a6a9e5925754ec4d71c6"} Oct 12 07:15:04 crc kubenswrapper[4930]: I1012 07:15:04.686161 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29337555-pjcrg" event={"ID":"f8c54212-a0af-43dd-890a-0b1fb5c2a8a4","Type":"ContainerDied","Data":"e9a95da0e41f6cc39dd16a86d92b14852e01f247ae6ad5eb1bff54990f665b8e"} Oct 12 07:15:04 crc kubenswrapper[4930]: I1012 07:15:04.686287 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9a95da0e41f6cc39dd16a86d92b14852e01f247ae6ad5eb1bff54990f665b8e" Oct 12 07:15:04 crc kubenswrapper[4930]: I1012 07:15:04.686221 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29337555-pjcrg" Oct 12 07:15:05 crc kubenswrapper[4930]: I1012 07:15:05.072683 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pfvdm" podStartSLOduration=2.506536038 podStartE2EDuration="6.072660077s" podCreationTimestamp="2025-10-12 07:14:59 +0000 UTC" firstStartedPulling="2025-10-12 07:15:00.636264839 +0000 UTC m=+5633.178366604" lastFinishedPulling="2025-10-12 07:15:04.202388888 +0000 UTC m=+5636.744490643" observedRunningTime="2025-10-12 07:15:04.706141956 +0000 UTC m=+5637.248243731" watchObservedRunningTime="2025-10-12 07:15:05.072660077 +0000 UTC m=+5637.614761842" Oct 12 07:15:05 crc kubenswrapper[4930]: I1012 07:15:05.115547 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337510-4lkdj"] Oct 12 07:15:05 crc kubenswrapper[4930]: I1012 07:15:05.126045 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29337510-4lkdj"] Oct 12 07:15:06 crc kubenswrapper[4930]: I1012 07:15:06.145917 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e257f2a-ca74-43c1-bc58-ed297d75567c" path="/var/lib/kubelet/pods/5e257f2a-ca74-43c1-bc58-ed297d75567c/volumes" Oct 12 07:15:09 crc kubenswrapper[4930]: I1012 07:15:09.767625 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pfvdm" Oct 12 07:15:09 crc kubenswrapper[4930]: I1012 07:15:09.768104 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pfvdm" Oct 12 07:15:10 crc kubenswrapper[4930]: I1012 07:15:10.821519 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-pfvdm" podUID="59d13334-bab2-4a08-9aa5-1835b1993192" containerName="registry-server" probeResult="failure" output=< Oct 12 07:15:10 crc kubenswrapper[4930]: timeout: failed to connect service ":50051" within 1s Oct 12 07:15:10 crc kubenswrapper[4930]: > Oct 12 07:15:19 crc kubenswrapper[4930]: I1012 07:15:19.859421 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pfvdm" Oct 12 07:15:19 crc kubenswrapper[4930]: I1012 07:15:19.948105 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pfvdm" Oct 12 07:15:20 crc kubenswrapper[4930]: I1012 07:15:20.123958 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pfvdm"] Oct 12 07:15:21 crc kubenswrapper[4930]: I1012 07:15:21.911084 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pfvdm" podUID="59d13334-bab2-4a08-9aa5-1835b1993192" containerName="registry-server" containerID="cri-o://b013507b7386012bc802428e74454ef806ece59306d7a6a9e5925754ec4d71c6" gracePeriod=2 Oct 12 07:15:22 crc kubenswrapper[4930]: I1012 07:15:22.925420 4930 generic.go:334] "Generic (PLEG): container finished" podID="59d13334-bab2-4a08-9aa5-1835b1993192" containerID="b013507b7386012bc802428e74454ef806ece59306d7a6a9e5925754ec4d71c6" exitCode=0 Oct 12 07:15:22 crc kubenswrapper[4930]: I1012 07:15:22.925523 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pfvdm" 
event={"ID":"59d13334-bab2-4a08-9aa5-1835b1993192","Type":"ContainerDied","Data":"b013507b7386012bc802428e74454ef806ece59306d7a6a9e5925754ec4d71c6"} Oct 12 07:15:22 crc kubenswrapper[4930]: I1012 07:15:22.925864 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pfvdm" event={"ID":"59d13334-bab2-4a08-9aa5-1835b1993192","Type":"ContainerDied","Data":"6f2cf6edf8861604d84efca7f2cb1761f2bd90b8cb9750f1a7670b807e27c27f"} Oct 12 07:15:22 crc kubenswrapper[4930]: I1012 07:15:22.925896 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f2cf6edf8861604d84efca7f2cb1761f2bd90b8cb9750f1a7670b807e27c27f" Oct 12 07:15:23 crc kubenswrapper[4930]: I1012 07:15:23.003215 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pfvdm" Oct 12 07:15:23 crc kubenswrapper[4930]: I1012 07:15:23.099353 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59d13334-bab2-4a08-9aa5-1835b1993192-catalog-content\") pod \"59d13334-bab2-4a08-9aa5-1835b1993192\" (UID: \"59d13334-bab2-4a08-9aa5-1835b1993192\") " Oct 12 07:15:23 crc kubenswrapper[4930]: I1012 07:15:23.099864 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59d13334-bab2-4a08-9aa5-1835b1993192-utilities\") pod \"59d13334-bab2-4a08-9aa5-1835b1993192\" (UID: \"59d13334-bab2-4a08-9aa5-1835b1993192\") " Oct 12 07:15:23 crc kubenswrapper[4930]: I1012 07:15:23.099960 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj92c\" (UniqueName: \"kubernetes.io/projected/59d13334-bab2-4a08-9aa5-1835b1993192-kube-api-access-jj92c\") pod \"59d13334-bab2-4a08-9aa5-1835b1993192\" (UID: \"59d13334-bab2-4a08-9aa5-1835b1993192\") " Oct 12 07:15:23 crc kubenswrapper[4930]: I1012 07:15:23.100355 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59d13334-bab2-4a08-9aa5-1835b1993192-utilities" (OuterVolumeSpecName: "utilities") pod "59d13334-bab2-4a08-9aa5-1835b1993192" (UID: "59d13334-bab2-4a08-9aa5-1835b1993192"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:15:23 crc kubenswrapper[4930]: I1012 07:15:23.100726 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59d13334-bab2-4a08-9aa5-1835b1993192-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 07:15:23 crc kubenswrapper[4930]: I1012 07:15:23.144407 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59d13334-bab2-4a08-9aa5-1835b1993192-kube-api-access-jj92c" (OuterVolumeSpecName: "kube-api-access-jj92c") pod "59d13334-bab2-4a08-9aa5-1835b1993192" (UID: "59d13334-bab2-4a08-9aa5-1835b1993192"). InnerVolumeSpecName "kube-api-access-jj92c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:15:23 crc kubenswrapper[4930]: I1012 07:15:23.165840 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59d13334-bab2-4a08-9aa5-1835b1993192-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "59d13334-bab2-4a08-9aa5-1835b1993192" (UID: "59d13334-bab2-4a08-9aa5-1835b1993192"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:15:23 crc kubenswrapper[4930]: I1012 07:15:23.203197 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj92c\" (UniqueName: \"kubernetes.io/projected/59d13334-bab2-4a08-9aa5-1835b1993192-kube-api-access-jj92c\") on node \"crc\" DevicePath \"\"" Oct 12 07:15:23 crc kubenswrapper[4930]: I1012 07:15:23.203232 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59d13334-bab2-4a08-9aa5-1835b1993192-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 07:15:23 crc kubenswrapper[4930]: I1012 07:15:23.939903 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pfvdm" Oct 12 07:15:24 crc kubenswrapper[4930]: I1012 07:15:24.006881 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pfvdm"] Oct 12 07:15:24 crc kubenswrapper[4930]: I1012 07:15:24.024910 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pfvdm"] Oct 12 07:15:24 crc kubenswrapper[4930]: I1012 07:15:24.153514 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59d13334-bab2-4a08-9aa5-1835b1993192" path="/var/lib/kubelet/pods/59d13334-bab2-4a08-9aa5-1835b1993192/volumes" Oct 12 07:15:29 crc kubenswrapper[4930]: I1012 07:15:29.290951 4930 scope.go:117] "RemoveContainer" containerID="771ca82c2d62c80b3674215b00d0b217fc5b66ad7f479526714ced3458198596" Oct 12 07:15:33 crc kubenswrapper[4930]: I1012 07:15:33.669986 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:15:33 crc kubenswrapper[4930]: I1012 07:15:33.670878 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:16:03 crc kubenswrapper[4930]: I1012 07:16:03.669549 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:16:03 crc kubenswrapper[4930]: I1012 07:16:03.670178 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:16:03 crc kubenswrapper[4930]: I1012 07:16:03.670241 4930 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 07:16:03 crc kubenswrapper[4930]: I1012 07:16:03.671306 4930 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"fadbe983b9d7f8cf56d93bde4ead838c4a3a392820ff85b2d655d6e5fed57ddf"} pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 07:16:03 crc kubenswrapper[4930]: I1012 07:16:03.671389 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" containerID="cri-o://fadbe983b9d7f8cf56d93bde4ead838c4a3a392820ff85b2d655d6e5fed57ddf" gracePeriod=600 Oct 12 07:16:04 crc kubenswrapper[4930]: I1012 07:16:04.459781 4930 generic.go:334] "Generic (PLEG): container finished" podID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerID="fadbe983b9d7f8cf56d93bde4ead838c4a3a392820ff85b2d655d6e5fed57ddf" exitCode=0 Oct 12 07:16:04 crc kubenswrapper[4930]: I1012 07:16:04.460525 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerDied","Data":"fadbe983b9d7f8cf56d93bde4ead838c4a3a392820ff85b2d655d6e5fed57ddf"} Oct 12 07:16:04 crc kubenswrapper[4930]: I1012 07:16:04.460555 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerStarted","Data":"0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19"} Oct 12 07:16:04 crc kubenswrapper[4930]: I1012 07:16:04.460889 4930 scope.go:117] "RemoveContainer" containerID="3508cacfbe81eb819f121a8aa9dec29ca2ce11e5e5e32ed6ab29606cfbe47413" Oct 12 07:16:37 crc kubenswrapper[4930]: I1012 07:16:37.897865 4930 generic.go:334] "Generic (PLEG): container finished" podID="2243a5d2-1751-4335-8e6f-1c500a51b226" containerID="fcff62ab0ae0934eb7c77fb05a98256fc523f92c8c58b5a6e4fe29af5d7c32e4" exitCode=0 Oct 12 07:16:37 crc kubenswrapper[4930]: I1012 07:16:37.898004 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-smj9w/must-gather-q65p2" event={"ID":"2243a5d2-1751-4335-8e6f-1c500a51b226","Type":"ContainerDied","Data":"fcff62ab0ae0934eb7c77fb05a98256fc523f92c8c58b5a6e4fe29af5d7c32e4"} Oct 12 07:16:37 crc kubenswrapper[4930]: I1012 07:16:37.899371 4930 scope.go:117] "RemoveContainer" containerID="fcff62ab0ae0934eb7c77fb05a98256fc523f92c8c58b5a6e4fe29af5d7c32e4" Oct 12 07:16:38 crc kubenswrapper[4930]: I1012 07:16:38.739925 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-smj9w_must-gather-q65p2_2243a5d2-1751-4335-8e6f-1c500a51b226/gather/0.log" Oct 12 07:16:46 crc kubenswrapper[4930]: I1012 07:16:46.843138 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-smj9w/must-gather-q65p2"] Oct 12 07:16:46 crc kubenswrapper[4930]: I1012 07:16:46.844082 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-smj9w/must-gather-q65p2" podUID="2243a5d2-1751-4335-8e6f-1c500a51b226" containerName="copy" containerID="cri-o://5d180da7adfa332e35940aef536aaea1ef82d5116cbb40d52849797afea65b14" gracePeriod=2 Oct 12 07:16:46 crc kubenswrapper[4930]: I1012 07:16:46.862160 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-smj9w/must-gather-q65p2"] Oct 12 07:16:47 crc kubenswrapper[4930]: I1012 07:16:47.011311 4930 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-smj9w_must-gather-q65p2_2243a5d2-1751-4335-8e6f-1c500a51b226/copy/0.log" Oct 12 07:16:47 crc kubenswrapper[4930]: I1012 07:16:47.012031 4930 generic.go:334] "Generic (PLEG): container finished" podID="2243a5d2-1751-4335-8e6f-1c500a51b226" containerID="5d180da7adfa332e35940aef536aaea1ef82d5116cbb40d52849797afea65b14" exitCode=143 Oct 12 07:16:47 crc kubenswrapper[4930]: I1012 07:16:47.300705 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-smj9w_must-gather-q65p2_2243a5d2-1751-4335-8e6f-1c500a51b226/copy/0.log" Oct 12 07:16:47 crc kubenswrapper[4930]: I1012 07:16:47.301271 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-smj9w/must-gather-q65p2" Oct 12 07:16:47 crc kubenswrapper[4930]: I1012 07:16:47.466729 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skqpt\" (UniqueName: \"kubernetes.io/projected/2243a5d2-1751-4335-8e6f-1c500a51b226-kube-api-access-skqpt\") pod \"2243a5d2-1751-4335-8e6f-1c500a51b226\" (UID: \"2243a5d2-1751-4335-8e6f-1c500a51b226\") " Oct 12 07:16:47 crc kubenswrapper[4930]: I1012 07:16:47.466989 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2243a5d2-1751-4335-8e6f-1c500a51b226-must-gather-output\") pod \"2243a5d2-1751-4335-8e6f-1c500a51b226\" (UID: \"2243a5d2-1751-4335-8e6f-1c500a51b226\") " Oct 12 07:16:47 crc kubenswrapper[4930]: I1012 07:16:47.474179 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2243a5d2-1751-4335-8e6f-1c500a51b226-kube-api-access-skqpt" (OuterVolumeSpecName: "kube-api-access-skqpt") pod "2243a5d2-1751-4335-8e6f-1c500a51b226" (UID: "2243a5d2-1751-4335-8e6f-1c500a51b226"). InnerVolumeSpecName "kube-api-access-skqpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:16:47 crc kubenswrapper[4930]: I1012 07:16:47.569008 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skqpt\" (UniqueName: \"kubernetes.io/projected/2243a5d2-1751-4335-8e6f-1c500a51b226-kube-api-access-skqpt\") on node \"crc\" DevicePath \"\"" Oct 12 07:16:47 crc kubenswrapper[4930]: I1012 07:16:47.634685 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2243a5d2-1751-4335-8e6f-1c500a51b226-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2243a5d2-1751-4335-8e6f-1c500a51b226" (UID: "2243a5d2-1751-4335-8e6f-1c500a51b226"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:16:47 crc kubenswrapper[4930]: I1012 07:16:47.670538 4930 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2243a5d2-1751-4335-8e6f-1c500a51b226-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 12 07:16:48 crc kubenswrapper[4930]: I1012 07:16:48.020399 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-smj9w_must-gather-q65p2_2243a5d2-1751-4335-8e6f-1c500a51b226/copy/0.log" Oct 12 07:16:48 crc kubenswrapper[4930]: I1012 07:16:48.021718 4930 scope.go:117] "RemoveContainer" containerID="5d180da7adfa332e35940aef536aaea1ef82d5116cbb40d52849797afea65b14" Oct 12 07:16:48 crc kubenswrapper[4930]: I1012 07:16:48.021822 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-smj9w/must-gather-q65p2" Oct 12 07:16:48 crc kubenswrapper[4930]: I1012 07:16:48.041123 4930 scope.go:117] "RemoveContainer" containerID="fcff62ab0ae0934eb7c77fb05a98256fc523f92c8c58b5a6e4fe29af5d7c32e4" Oct 12 07:16:48 crc kubenswrapper[4930]: I1012 07:16:48.146250 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2243a5d2-1751-4335-8e6f-1c500a51b226" path="/var/lib/kubelet/pods/2243a5d2-1751-4335-8e6f-1c500a51b226/volumes" Oct 12 07:17:37 crc kubenswrapper[4930]: I1012 07:17:37.716113 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nwxk2/must-gather-rj56h"] Oct 12 07:17:37 crc kubenswrapper[4930]: E1012 07:17:37.716905 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8c54212-a0af-43dd-890a-0b1fb5c2a8a4" containerName="collect-profiles" Oct 12 07:17:37 crc kubenswrapper[4930]: I1012 07:17:37.716918 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8c54212-a0af-43dd-890a-0b1fb5c2a8a4" containerName="collect-profiles" Oct 12 07:17:37 crc kubenswrapper[4930]: E1012 07:17:37.716929 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59d13334-bab2-4a08-9aa5-1835b1993192" containerName="extract-content" Oct 12 07:17:37 crc kubenswrapper[4930]: I1012 07:17:37.716934 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="59d13334-bab2-4a08-9aa5-1835b1993192" containerName="extract-content" Oct 12 07:17:37 crc kubenswrapper[4930]: E1012 07:17:37.716945 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59d13334-bab2-4a08-9aa5-1835b1993192" containerName="extract-utilities" Oct 12 07:17:37 crc kubenswrapper[4930]: I1012 07:17:37.716952 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="59d13334-bab2-4a08-9aa5-1835b1993192" containerName="extract-utilities" Oct 12 07:17:37 crc kubenswrapper[4930]: E1012 07:17:37.716969 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2243a5d2-1751-4335-8e6f-1c500a51b226" containerName="copy" Oct 12 07:17:37 crc kubenswrapper[4930]: I1012 07:17:37.716975 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="2243a5d2-1751-4335-8e6f-1c500a51b226" containerName="copy" Oct 12 07:17:37 crc kubenswrapper[4930]: E1012 07:17:37.716988 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2243a5d2-1751-4335-8e6f-1c500a51b226" containerName="gather" Oct 12 07:17:37 crc kubenswrapper[4930]: I1012 07:17:37.716993 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="2243a5d2-1751-4335-8e6f-1c500a51b226" containerName="gather" Oct 12 07:17:37 crc kubenswrapper[4930]: E1012 07:17:37.717010 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59d13334-bab2-4a08-9aa5-1835b1993192" containerName="registry-server" Oct 12 07:17:37 crc kubenswrapper[4930]: I1012 07:17:37.717017 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="59d13334-bab2-4a08-9aa5-1835b1993192" containerName="registry-server" Oct 12 07:17:37 crc kubenswrapper[4930]: I1012 07:17:37.717200 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="2243a5d2-1751-4335-8e6f-1c500a51b226" containerName="copy" Oct 12 07:17:37 crc kubenswrapper[4930]: I1012 07:17:37.717210 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="2243a5d2-1751-4335-8e6f-1c500a51b226" containerName="gather" Oct 12 07:17:37 crc kubenswrapper[4930]: I1012 07:17:37.717233 4930 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f8c54212-a0af-43dd-890a-0b1fb5c2a8a4" containerName="collect-profiles" Oct 12 07:17:37 crc kubenswrapper[4930]: I1012 07:17:37.717242 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="59d13334-bab2-4a08-9aa5-1835b1993192" containerName="registry-server" Oct 12 07:17:37 crc kubenswrapper[4930]: I1012 07:17:37.718274 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nwxk2/must-gather-rj56h" Oct 12 07:17:37 crc kubenswrapper[4930]: I1012 07:17:37.721063 4930 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-nwxk2"/"default-dockercfg-276gq" Oct 12 07:17:37 crc kubenswrapper[4930]: I1012 07:17:37.731051 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-nwxk2"/"openshift-service-ca.crt" Oct 12 07:17:37 crc kubenswrapper[4930]: I1012 07:17:37.732229 4930 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-nwxk2"/"kube-root-ca.crt" Oct 12 07:17:37 crc kubenswrapper[4930]: I1012 07:17:37.733409 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nwxk2/must-gather-rj56h"] Oct 12 07:17:37 crc kubenswrapper[4930]: I1012 07:17:37.839022 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7406add0-dc8d-45d5-8fef-cb49293eb22d-must-gather-output\") pod \"must-gather-rj56h\" (UID: \"7406add0-dc8d-45d5-8fef-cb49293eb22d\") " pod="openshift-must-gather-nwxk2/must-gather-rj56h" Oct 12 07:17:37 crc kubenswrapper[4930]: I1012 07:17:37.839246 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4rt8\" (UniqueName: \"kubernetes.io/projected/7406add0-dc8d-45d5-8fef-cb49293eb22d-kube-api-access-r4rt8\") pod \"must-gather-rj56h\" (UID: \"7406add0-dc8d-45d5-8fef-cb49293eb22d\") " pod="openshift-must-gather-nwxk2/must-gather-rj56h" Oct 12 07:17:37 crc kubenswrapper[4930]: I1012 07:17:37.941332 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4rt8\" (UniqueName: \"kubernetes.io/projected/7406add0-dc8d-45d5-8fef-cb49293eb22d-kube-api-access-r4rt8\") pod \"must-gather-rj56h\" (UID: \"7406add0-dc8d-45d5-8fef-cb49293eb22d\") " pod="openshift-must-gather-nwxk2/must-gather-rj56h" Oct 12 07:17:37 crc kubenswrapper[4930]: I1012 07:17:37.941779 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7406add0-dc8d-45d5-8fef-cb49293eb22d-must-gather-output\") pod \"must-gather-rj56h\" (UID: \"7406add0-dc8d-45d5-8fef-cb49293eb22d\") " pod="openshift-must-gather-nwxk2/must-gather-rj56h" Oct 12 07:17:37 crc kubenswrapper[4930]: I1012 07:17:37.942276 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7406add0-dc8d-45d5-8fef-cb49293eb22d-must-gather-output\") pod \"must-gather-rj56h\" (UID: \"7406add0-dc8d-45d5-8fef-cb49293eb22d\") " pod="openshift-must-gather-nwxk2/must-gather-rj56h" Oct 12 07:17:38 crc kubenswrapper[4930]: I1012 07:17:38.145127 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4rt8\" (UniqueName: \"kubernetes.io/projected/7406add0-dc8d-45d5-8fef-cb49293eb22d-kube-api-access-r4rt8\") pod \"must-gather-rj56h\" (UID: \"7406add0-dc8d-45d5-8fef-cb49293eb22d\") " 
pod="openshift-must-gather-nwxk2/must-gather-rj56h" Oct 12 07:17:38 crc kubenswrapper[4930]: I1012 07:17:38.336480 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nwxk2/must-gather-rj56h" Oct 12 07:17:38 crc kubenswrapper[4930]: I1012 07:17:38.964603 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nwxk2/must-gather-rj56h"] Oct 12 07:17:39 crc kubenswrapper[4930]: I1012 07:17:39.627782 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nwxk2/must-gather-rj56h" event={"ID":"7406add0-dc8d-45d5-8fef-cb49293eb22d","Type":"ContainerStarted","Data":"515e244a833eb2e4340f335f10a04472d4f90a550242c21da01a012a760b7457"} Oct 12 07:17:39 crc kubenswrapper[4930]: I1012 07:17:39.628417 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nwxk2/must-gather-rj56h" event={"ID":"7406add0-dc8d-45d5-8fef-cb49293eb22d","Type":"ContainerStarted","Data":"a224154544669734fd0052804975d74ca69cfe0c2bc280db7be78fad86ec61bf"} Oct 12 07:17:39 crc kubenswrapper[4930]: I1012 07:17:39.628440 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nwxk2/must-gather-rj56h" event={"ID":"7406add0-dc8d-45d5-8fef-cb49293eb22d","Type":"ContainerStarted","Data":"8ec1ad13d368f1171dbfbb41decd2d1072bbf10330e05c623dea326050d8afce"} Oct 12 07:17:39 crc kubenswrapper[4930]: I1012 07:17:39.657491 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nwxk2/must-gather-rj56h" podStartSLOduration=2.657462577 podStartE2EDuration="2.657462577s" podCreationTimestamp="2025-10-12 07:17:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:17:39.638394258 +0000 UTC m=+5792.180496063" watchObservedRunningTime="2025-10-12 07:17:39.657462577 +0000 UTC m=+5792.199564392" Oct 12 07:17:42 crc kubenswrapper[4930]: I1012 07:17:42.931812 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nwxk2/crc-debug-chjn5"] Oct 12 07:17:42 crc kubenswrapper[4930]: I1012 07:17:42.934057 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nwxk2/crc-debug-chjn5" Oct 12 07:17:43 crc kubenswrapper[4930]: I1012 07:17:43.017602 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/787adc86-f087-4d4d-87db-c38667bb2960-host\") pod \"crc-debug-chjn5\" (UID: \"787adc86-f087-4d4d-87db-c38667bb2960\") " pod="openshift-must-gather-nwxk2/crc-debug-chjn5" Oct 12 07:17:43 crc kubenswrapper[4930]: I1012 07:17:43.017879 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p8nm\" (UniqueName: \"kubernetes.io/projected/787adc86-f087-4d4d-87db-c38667bb2960-kube-api-access-2p8nm\") pod \"crc-debug-chjn5\" (UID: \"787adc86-f087-4d4d-87db-c38667bb2960\") " pod="openshift-must-gather-nwxk2/crc-debug-chjn5" Oct 12 07:17:43 crc kubenswrapper[4930]: I1012 07:17:43.120438 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p8nm\" (UniqueName: \"kubernetes.io/projected/787adc86-f087-4d4d-87db-c38667bb2960-kube-api-access-2p8nm\") pod \"crc-debug-chjn5\" (UID: \"787adc86-f087-4d4d-87db-c38667bb2960\") " pod="openshift-must-gather-nwxk2/crc-debug-chjn5" Oct 12 07:17:43 crc kubenswrapper[4930]: I1012 07:17:43.120910 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/787adc86-f087-4d4d-87db-c38667bb2960-host\") pod \"crc-debug-chjn5\" (UID: \"787adc86-f087-4d4d-87db-c38667bb2960\") " pod="openshift-must-gather-nwxk2/crc-debug-chjn5" Oct 12 07:17:43 crc kubenswrapper[4930]: I1012 07:17:43.121007 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/787adc86-f087-4d4d-87db-c38667bb2960-host\") pod \"crc-debug-chjn5\" (UID: \"787adc86-f087-4d4d-87db-c38667bb2960\") " pod="openshift-must-gather-nwxk2/crc-debug-chjn5" Oct 12 07:17:43 crc kubenswrapper[4930]: I1012 07:17:43.138600 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p8nm\" (UniqueName: \"kubernetes.io/projected/787adc86-f087-4d4d-87db-c38667bb2960-kube-api-access-2p8nm\") pod \"crc-debug-chjn5\" (UID: \"787adc86-f087-4d4d-87db-c38667bb2960\") " pod="openshift-must-gather-nwxk2/crc-debug-chjn5" Oct 12 07:17:43 crc kubenswrapper[4930]: I1012 07:17:43.255378 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nwxk2/crc-debug-chjn5" Oct 12 07:17:43 crc kubenswrapper[4930]: I1012 07:17:43.672211 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nwxk2/crc-debug-chjn5" event={"ID":"787adc86-f087-4d4d-87db-c38667bb2960","Type":"ContainerStarted","Data":"86400f583e17019934a161f6f6bae03e03330c39c42330f73af96b567b4c0ae4"} Oct 12 07:17:43 crc kubenswrapper[4930]: I1012 07:17:43.672516 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nwxk2/crc-debug-chjn5" event={"ID":"787adc86-f087-4d4d-87db-c38667bb2960","Type":"ContainerStarted","Data":"4c106468d1fd6eb8cfab95c02d317f91e9a7121ab0a5c7ded514534798c777d6"} Oct 12 07:17:43 crc kubenswrapper[4930]: I1012 07:17:43.693865 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nwxk2/crc-debug-chjn5" podStartSLOduration=1.693846861 podStartE2EDuration="1.693846861s" podCreationTimestamp="2025-10-12 07:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:17:43.687100185 +0000 UTC m=+5796.229201980" watchObservedRunningTime="2025-10-12 07:17:43.693846861 +0000 UTC m=+5796.235948626" Oct 12 07:18:23 crc kubenswrapper[4930]: I1012 07:18:23.096602 4930 generic.go:334] "Generic (PLEG): container finished" podID="787adc86-f087-4d4d-87db-c38667bb2960" containerID="86400f583e17019934a161f6f6bae03e03330c39c42330f73af96b567b4c0ae4" exitCode=0 Oct 12 07:18:23 crc kubenswrapper[4930]: I1012 07:18:23.096678 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nwxk2/crc-debug-chjn5" event={"ID":"787adc86-f087-4d4d-87db-c38667bb2960","Type":"ContainerDied","Data":"86400f583e17019934a161f6f6bae03e03330c39c42330f73af96b567b4c0ae4"} Oct 12 07:18:24 crc kubenswrapper[4930]: I1012 07:18:24.211285 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nwxk2/crc-debug-chjn5" Oct 12 07:18:24 crc kubenswrapper[4930]: I1012 07:18:24.242573 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nwxk2/crc-debug-chjn5"] Oct 12 07:18:24 crc kubenswrapper[4930]: I1012 07:18:24.249503 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nwxk2/crc-debug-chjn5"] Oct 12 07:18:24 crc kubenswrapper[4930]: I1012 07:18:24.306602 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/787adc86-f087-4d4d-87db-c38667bb2960-host\") pod \"787adc86-f087-4d4d-87db-c38667bb2960\" (UID: \"787adc86-f087-4d4d-87db-c38667bb2960\") " Oct 12 07:18:24 crc kubenswrapper[4930]: I1012 07:18:24.306729 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/787adc86-f087-4d4d-87db-c38667bb2960-host" (OuterVolumeSpecName: "host") pod "787adc86-f087-4d4d-87db-c38667bb2960" (UID: "787adc86-f087-4d4d-87db-c38667bb2960"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:18:24 crc kubenswrapper[4930]: I1012 07:18:24.306878 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p8nm\" (UniqueName: \"kubernetes.io/projected/787adc86-f087-4d4d-87db-c38667bb2960-kube-api-access-2p8nm\") pod \"787adc86-f087-4d4d-87db-c38667bb2960\" (UID: \"787adc86-f087-4d4d-87db-c38667bb2960\") " Oct 12 07:18:24 crc kubenswrapper[4930]: I1012 07:18:24.307276 4930 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/787adc86-f087-4d4d-87db-c38667bb2960-host\") on node \"crc\" DevicePath \"\"" Oct 12 07:18:24 crc kubenswrapper[4930]: I1012 07:18:24.311953 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/787adc86-f087-4d4d-87db-c38667bb2960-kube-api-access-2p8nm" (OuterVolumeSpecName: "kube-api-access-2p8nm") pod "787adc86-f087-4d4d-87db-c38667bb2960" (UID: "787adc86-f087-4d4d-87db-c38667bb2960"). InnerVolumeSpecName "kube-api-access-2p8nm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:18:24 crc kubenswrapper[4930]: I1012 07:18:24.409249 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p8nm\" (UniqueName: \"kubernetes.io/projected/787adc86-f087-4d4d-87db-c38667bb2960-kube-api-access-2p8nm\") on node \"crc\" DevicePath \"\"" Oct 12 07:18:25 crc kubenswrapper[4930]: I1012 07:18:25.117911 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c106468d1fd6eb8cfab95c02d317f91e9a7121ab0a5c7ded514534798c777d6" Oct 12 07:18:25 crc kubenswrapper[4930]: I1012 07:18:25.117970 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nwxk2/crc-debug-chjn5" Oct 12 07:18:25 crc kubenswrapper[4930]: I1012 07:18:25.459226 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nwxk2/crc-debug-466ms"] Oct 12 07:18:25 crc kubenswrapper[4930]: E1012 07:18:25.459602 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="787adc86-f087-4d4d-87db-c38667bb2960" containerName="container-00" Oct 12 07:18:25 crc kubenswrapper[4930]: I1012 07:18:25.459616 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="787adc86-f087-4d4d-87db-c38667bb2960" containerName="container-00" Oct 12 07:18:25 crc kubenswrapper[4930]: I1012 07:18:25.459849 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="787adc86-f087-4d4d-87db-c38667bb2960" containerName="container-00" Oct 12 07:18:25 crc kubenswrapper[4930]: I1012 07:18:25.460474 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nwxk2/crc-debug-466ms" Oct 12 07:18:25 crc kubenswrapper[4930]: I1012 07:18:25.531652 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5-host\") pod \"crc-debug-466ms\" (UID: \"e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5\") " pod="openshift-must-gather-nwxk2/crc-debug-466ms" Oct 12 07:18:25 crc kubenswrapper[4930]: I1012 07:18:25.531702 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdkkk\" (UniqueName: \"kubernetes.io/projected/e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5-kube-api-access-tdkkk\") pod \"crc-debug-466ms\" (UID: \"e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5\") " pod="openshift-must-gather-nwxk2/crc-debug-466ms" Oct 12 07:18:25 crc kubenswrapper[4930]: I1012 07:18:25.634044 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5-host\") pod \"crc-debug-466ms\" (UID: \"e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5\") " pod="openshift-must-gather-nwxk2/crc-debug-466ms" Oct 12 07:18:25 crc kubenswrapper[4930]: I1012 07:18:25.634118 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdkkk\" (UniqueName: \"kubernetes.io/projected/e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5-kube-api-access-tdkkk\") pod \"crc-debug-466ms\" (UID: \"e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5\") " pod="openshift-must-gather-nwxk2/crc-debug-466ms" Oct 12 07:18:25 crc kubenswrapper[4930]: I1012 07:18:25.634228 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5-host\") pod \"crc-debug-466ms\" (UID: \"e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5\") " pod="openshift-must-gather-nwxk2/crc-debug-466ms" Oct 12 07:18:25 crc kubenswrapper[4930]: I1012 07:18:25.652562 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdkkk\" (UniqueName: \"kubernetes.io/projected/e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5-kube-api-access-tdkkk\") pod \"crc-debug-466ms\" (UID: \"e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5\") " pod="openshift-must-gather-nwxk2/crc-debug-466ms" Oct 12 07:18:25 crc kubenswrapper[4930]: I1012 07:18:25.777564 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nwxk2/crc-debug-466ms" Oct 12 07:18:26 crc kubenswrapper[4930]: I1012 07:18:26.126889 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nwxk2/crc-debug-466ms" event={"ID":"e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5","Type":"ContainerStarted","Data":"a3ad5a9a1fdb82c2670954ad2a2ecf61c2b7ba9dad0e4b2b14acc31d6be3cc66"} Oct 12 07:18:26 crc kubenswrapper[4930]: I1012 07:18:26.127216 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nwxk2/crc-debug-466ms" event={"ID":"e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5","Type":"ContainerStarted","Data":"189750e38f79a79a9393bde48a6c1e7e5c5fb6beaacb47ac4ad5428e700f3209"} Oct 12 07:18:26 crc kubenswrapper[4930]: I1012 07:18:26.144443 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nwxk2/crc-debug-466ms" podStartSLOduration=1.144422775 podStartE2EDuration="1.144422775s" podCreationTimestamp="2025-10-12 07:18:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-12 07:18:26.138758276 +0000 UTC m=+5838.680860061" watchObservedRunningTime="2025-10-12 07:18:26.144422775 +0000 UTC m=+5838.686524560" Oct 12 07:18:26 crc kubenswrapper[4930]: I1012 07:18:26.148274 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="787adc86-f087-4d4d-87db-c38667bb2960" path="/var/lib/kubelet/pods/787adc86-f087-4d4d-87db-c38667bb2960/volumes" Oct 12 07:18:27 crc kubenswrapper[4930]: I1012 07:18:27.137382 4930 generic.go:334] "Generic (PLEG): container finished" podID="e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5" containerID="a3ad5a9a1fdb82c2670954ad2a2ecf61c2b7ba9dad0e4b2b14acc31d6be3cc66" exitCode=0 Oct 12 07:18:27 crc kubenswrapper[4930]: I1012 07:18:27.137421 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nwxk2/crc-debug-466ms" event={"ID":"e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5","Type":"ContainerDied","Data":"a3ad5a9a1fdb82c2670954ad2a2ecf61c2b7ba9dad0e4b2b14acc31d6be3cc66"} Oct 12 07:18:28 crc kubenswrapper[4930]: I1012 07:18:28.368326 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nwxk2/crc-debug-466ms" Oct 12 07:18:28 crc kubenswrapper[4930]: I1012 07:18:28.578129 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdkkk\" (UniqueName: \"kubernetes.io/projected/e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5-kube-api-access-tdkkk\") pod \"e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5\" (UID: \"e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5\") " Oct 12 07:18:28 crc kubenswrapper[4930]: I1012 07:18:28.578414 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5-host\") pod \"e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5\" (UID: \"e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5\") " Oct 12 07:18:28 crc kubenswrapper[4930]: I1012 07:18:28.578550 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5-host" (OuterVolumeSpecName: "host") pod "e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5" (UID: "e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:18:28 crc kubenswrapper[4930]: I1012 07:18:28.578722 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nwxk2/crc-debug-466ms"] Oct 12 07:18:28 crc kubenswrapper[4930]: I1012 07:18:28.579049 4930 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5-host\") on node \"crc\" DevicePath \"\"" Oct 12 07:18:28 crc kubenswrapper[4930]: I1012 07:18:28.583747 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5-kube-api-access-tdkkk" (OuterVolumeSpecName: "kube-api-access-tdkkk") pod "e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5" (UID: "e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5"). InnerVolumeSpecName "kube-api-access-tdkkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:18:28 crc kubenswrapper[4930]: I1012 07:18:28.586441 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nwxk2/crc-debug-466ms"] Oct 12 07:18:28 crc kubenswrapper[4930]: I1012 07:18:28.680589 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdkkk\" (UniqueName: \"kubernetes.io/projected/e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5-kube-api-access-tdkkk\") on node \"crc\" DevicePath \"\"" Oct 12 07:18:29 crc kubenswrapper[4930]: I1012 07:18:29.157582 4930 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="189750e38f79a79a9393bde48a6c1e7e5c5fb6beaacb47ac4ad5428e700f3209" Oct 12 07:18:29 crc kubenswrapper[4930]: I1012 07:18:29.157699 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nwxk2/crc-debug-466ms" Oct 12 07:18:29 crc kubenswrapper[4930]: I1012 07:18:29.781994 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nwxk2/crc-debug-kbg7p"] Oct 12 07:18:29 crc kubenswrapper[4930]: E1012 07:18:29.782730 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5" containerName="container-00" Oct 12 07:18:29 crc kubenswrapper[4930]: I1012 07:18:29.782761 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5" containerName="container-00" Oct 12 07:18:29 crc kubenswrapper[4930]: I1012 07:18:29.783040 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5" containerName="container-00" Oct 12 07:18:29 crc kubenswrapper[4930]: I1012 07:18:29.783893 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nwxk2/crc-debug-kbg7p" Oct 12 07:18:29 crc kubenswrapper[4930]: I1012 07:18:29.799424 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnl2z\" (UniqueName: \"kubernetes.io/projected/d00a6732-bb8b-477a-b6f6-9a1d3e37fd62-kube-api-access-dnl2z\") pod \"crc-debug-kbg7p\" (UID: \"d00a6732-bb8b-477a-b6f6-9a1d3e37fd62\") " pod="openshift-must-gather-nwxk2/crc-debug-kbg7p" Oct 12 07:18:29 crc kubenswrapper[4930]: I1012 07:18:29.799649 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d00a6732-bb8b-477a-b6f6-9a1d3e37fd62-host\") pod \"crc-debug-kbg7p\" (UID: \"d00a6732-bb8b-477a-b6f6-9a1d3e37fd62\") " pod="openshift-must-gather-nwxk2/crc-debug-kbg7p" Oct 12 07:18:29 crc kubenswrapper[4930]: I1012 07:18:29.901495 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d00a6732-bb8b-477a-b6f6-9a1d3e37fd62-host\") pod \"crc-debug-kbg7p\" (UID: \"d00a6732-bb8b-477a-b6f6-9a1d3e37fd62\") " pod="openshift-must-gather-nwxk2/crc-debug-kbg7p" Oct 12 07:18:29 crc kubenswrapper[4930]: I1012 07:18:29.901641 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnl2z\" (UniqueName: \"kubernetes.io/projected/d00a6732-bb8b-477a-b6f6-9a1d3e37fd62-kube-api-access-dnl2z\") pod \"crc-debug-kbg7p\" (UID: \"d00a6732-bb8b-477a-b6f6-9a1d3e37fd62\") " pod="openshift-must-gather-nwxk2/crc-debug-kbg7p" Oct 12 07:18:29 crc kubenswrapper[4930]: I1012 07:18:29.901711 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d00a6732-bb8b-477a-b6f6-9a1d3e37fd62-host\") pod \"crc-debug-kbg7p\" (UID: \"d00a6732-bb8b-477a-b6f6-9a1d3e37fd62\") " pod="openshift-must-gather-nwxk2/crc-debug-kbg7p" Oct 12 07:18:30 crc kubenswrapper[4930]: I1012 07:18:30.159517 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5" path="/var/lib/kubelet/pods/e0fb6ad9-0ef1-40ff-a824-1ceb624ac0d5/volumes" Oct 12 07:18:30 crc kubenswrapper[4930]: I1012 07:18:30.358220 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnl2z\" (UniqueName: \"kubernetes.io/projected/d00a6732-bb8b-477a-b6f6-9a1d3e37fd62-kube-api-access-dnl2z\") pod \"crc-debug-kbg7p\" (UID: \"d00a6732-bb8b-477a-b6f6-9a1d3e37fd62\") " pod="openshift-must-gather-nwxk2/crc-debug-kbg7p" Oct 12 07:18:30 crc kubenswrapper[4930]: I1012 07:18:30.401608 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nwxk2/crc-debug-kbg7p" Oct 12 07:18:31 crc kubenswrapper[4930]: I1012 07:18:31.182757 4930 generic.go:334] "Generic (PLEG): container finished" podID="d00a6732-bb8b-477a-b6f6-9a1d3e37fd62" containerID="495a653470cca961c823b5e5dc94a8b258e04fcd632a77da28e10a323043b78c" exitCode=0 Oct 12 07:18:31 crc kubenswrapper[4930]: I1012 07:18:31.182810 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nwxk2/crc-debug-kbg7p" event={"ID":"d00a6732-bb8b-477a-b6f6-9a1d3e37fd62","Type":"ContainerDied","Data":"495a653470cca961c823b5e5dc94a8b258e04fcd632a77da28e10a323043b78c"} Oct 12 07:18:31 crc kubenswrapper[4930]: I1012 07:18:31.183188 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nwxk2/crc-debug-kbg7p" event={"ID":"d00a6732-bb8b-477a-b6f6-9a1d3e37fd62","Type":"ContainerStarted","Data":"fe6c019356322c28b17eee770838cf6b37d6b55bbb63ad2eefb62fe5baa98245"} Oct 12 07:18:31 crc kubenswrapper[4930]: I1012 07:18:31.223713 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nwxk2/crc-debug-kbg7p"] Oct 12 07:18:31 crc kubenswrapper[4930]: I1012 07:18:31.234057 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nwxk2/crc-debug-kbg7p"] Oct 12 07:18:32 crc kubenswrapper[4930]: I1012 07:18:32.327176 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nwxk2/crc-debug-kbg7p" Oct 12 07:18:32 crc kubenswrapper[4930]: I1012 07:18:32.357110 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d00a6732-bb8b-477a-b6f6-9a1d3e37fd62-host\") pod \"d00a6732-bb8b-477a-b6f6-9a1d3e37fd62\" (UID: \"d00a6732-bb8b-477a-b6f6-9a1d3e37fd62\") " Oct 12 07:18:32 crc kubenswrapper[4930]: I1012 07:18:32.357169 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnl2z\" (UniqueName: \"kubernetes.io/projected/d00a6732-bb8b-477a-b6f6-9a1d3e37fd62-kube-api-access-dnl2z\") pod \"d00a6732-bb8b-477a-b6f6-9a1d3e37fd62\" (UID: \"d00a6732-bb8b-477a-b6f6-9a1d3e37fd62\") " Oct 12 07:18:32 crc kubenswrapper[4930]: I1012 07:18:32.357259 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d00a6732-bb8b-477a-b6f6-9a1d3e37fd62-host" (OuterVolumeSpecName: "host") pod "d00a6732-bb8b-477a-b6f6-9a1d3e37fd62" (UID: "d00a6732-bb8b-477a-b6f6-9a1d3e37fd62"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 12 07:18:32 crc kubenswrapper[4930]: I1012 07:18:32.357873 4930 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d00a6732-bb8b-477a-b6f6-9a1d3e37fd62-host\") on node \"crc\" DevicePath \"\"" Oct 12 07:18:32 crc kubenswrapper[4930]: I1012 07:18:32.371283 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d00a6732-bb8b-477a-b6f6-9a1d3e37fd62-kube-api-access-dnl2z" (OuterVolumeSpecName: "kube-api-access-dnl2z") pod "d00a6732-bb8b-477a-b6f6-9a1d3e37fd62" (UID: "d00a6732-bb8b-477a-b6f6-9a1d3e37fd62"). InnerVolumeSpecName "kube-api-access-dnl2z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:18:32 crc kubenswrapper[4930]: I1012 07:18:32.459705 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnl2z\" (UniqueName: \"kubernetes.io/projected/d00a6732-bb8b-477a-b6f6-9a1d3e37fd62-kube-api-access-dnl2z\") on node \"crc\" DevicePath \"\"" Oct 12 07:18:33 crc kubenswrapper[4930]: I1012 07:18:33.207385 4930 scope.go:117] "RemoveContainer" containerID="495a653470cca961c823b5e5dc94a8b258e04fcd632a77da28e10a323043b78c" Oct 12 07:18:33 crc kubenswrapper[4930]: I1012 07:18:33.207419 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nwxk2/crc-debug-kbg7p" Oct 12 07:18:33 crc kubenswrapper[4930]: I1012 07:18:33.669214 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:18:33 crc kubenswrapper[4930]: I1012 07:18:33.669292 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:18:34 crc kubenswrapper[4930]: I1012 07:18:34.148961 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d00a6732-bb8b-477a-b6f6-9a1d3e37fd62" path="/var/lib/kubelet/pods/d00a6732-bb8b-477a-b6f6-9a1d3e37fd62/volumes" Oct 12 07:19:03 crc kubenswrapper[4930]: I1012 07:19:03.669529 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:19:03 crc kubenswrapper[4930]: I1012 07:19:03.669982 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:19:06 crc kubenswrapper[4930]: I1012 07:19:06.157933 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-696d7778c8-zcb9x_b054ea5a-466c-432d-aa75-7af68a134c5e/barbican-api/0.log" Oct 12 07:19:06 crc kubenswrapper[4930]: I1012 07:19:06.272385 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-696d7778c8-zcb9x_b054ea5a-466c-432d-aa75-7af68a134c5e/barbican-api-log/0.log" Oct 12 07:19:06 crc kubenswrapper[4930]: I1012 07:19:06.333518 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-587996ddf4-fcrwq_f37233c9-4b67-4e63-949a-24fd340b334b/barbican-keystone-listener/0.log" Oct 12 07:19:06 crc kubenswrapper[4930]: I1012 07:19:06.398464 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-587996ddf4-fcrwq_f37233c9-4b67-4e63-949a-24fd340b334b/barbican-keystone-listener-log/0.log" Oct 12 07:19:06 crc kubenswrapper[4930]: I1012 07:19:06.541343 4930 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-76d484677c-ptrh6_e26aa90e-071d-46ff-8fa1-b86f43a70e01/barbican-worker/0.log" Oct 12 07:19:06 crc kubenswrapper[4930]: I1012 07:19:06.544755 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-76d484677c-ptrh6_e26aa90e-071d-46ff-8fa1-b86f43a70e01/barbican-worker-log/0.log" Oct 12 07:19:06 crc kubenswrapper[4930]: I1012 07:19:06.715862 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-fdbjn_fe5863bc-46d9-4eb8-8ab5-e5c6c2e6dbdc/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:19:06 crc kubenswrapper[4930]: I1012 07:19:06.808182 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1d036de3-1a99-408d-8677-978880f41705/ceilometer-notification-agent/0.log" Oct 12 07:19:06 crc kubenswrapper[4930]: I1012 07:19:06.842719 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1d036de3-1a99-408d-8677-978880f41705/ceilometer-central-agent/0.log" Oct 12 07:19:06 crc kubenswrapper[4930]: I1012 07:19:06.935941 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1d036de3-1a99-408d-8677-978880f41705/proxy-httpd/0.log" Oct 12 07:19:06 crc kubenswrapper[4930]: I1012 07:19:06.949492 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1d036de3-1a99-408d-8677-978880f41705/sg-core/0.log" Oct 12 07:19:07 crc kubenswrapper[4930]: I1012 07:19:07.117569 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_eceafd59-b491-4468-b6b6-78fe1c689e6b/cinder-api-log/0.log" Oct 12 07:19:07 crc kubenswrapper[4930]: I1012 07:19:07.292790 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_6c911528-2136-4abe-a716-c75437784628/cinder-scheduler/0.log" Oct 12 07:19:07 crc kubenswrapper[4930]: I1012 07:19:07.416055 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_eceafd59-b491-4468-b6b6-78fe1c689e6b/cinder-api/0.log" Oct 12 07:19:07 crc kubenswrapper[4930]: I1012 07:19:07.432707 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_6c911528-2136-4abe-a716-c75437784628/probe/0.log" Oct 12 07:19:07 crc kubenswrapper[4930]: I1012 07:19:07.476782 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-wrzrm_d8b9bf67-f82f-421d-98df-4f8e95911d5a/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:19:07 crc kubenswrapper[4930]: I1012 07:19:07.657329 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-l8jmm_8096b9b1-e514-4928-8e48-88e6519dc35e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:19:07 crc kubenswrapper[4930]: I1012 07:19:07.664552 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-w7vqt_eebc8efc-b160-4a75-a213-74fcf9c2595e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:19:07 crc kubenswrapper[4930]: I1012 07:19:07.833818 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-77b58f4b85-jll5z_829039a6-ad10-4532-b406-f497e661fd8d/init/0.log" Oct 12 07:19:08 crc kubenswrapper[4930]: I1012 07:19:08.009124 4930 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-77b58f4b85-jll5z_829039a6-ad10-4532-b406-f497e661fd8d/init/0.log" Oct 12 07:19:08 crc kubenswrapper[4930]: I1012 07:19:08.040973 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-k6s6t_9727cf23-8270-491f-ba18-218bd73cd0c8/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:19:08 crc kubenswrapper[4930]: I1012 07:19:08.158727 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-77b58f4b85-jll5z_829039a6-ad10-4532-b406-f497e661fd8d/dnsmasq-dns/0.log" Oct 12 07:19:08 crc kubenswrapper[4930]: I1012 07:19:08.282553 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_00546299-d7c8-4536-9059-85a75dc5824e/glance-httpd/0.log" Oct 12 07:19:08 crc kubenswrapper[4930]: I1012 07:19:08.298354 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_00546299-d7c8-4536-9059-85a75dc5824e/glance-log/0.log" Oct 12 07:19:08 crc kubenswrapper[4930]: I1012 07:19:08.474701 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6ceabefb-4c59-49ab-9ec7-cc011d6aa659/glance-httpd/0.log" Oct 12 07:19:08 crc kubenswrapper[4930]: I1012 07:19:08.528858 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6ceabefb-4c59-49ab-9ec7-cc011d6aa659/glance-log/0.log" Oct 12 07:19:08 crc kubenswrapper[4930]: I1012 07:19:08.688013 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6d76466876-jf9t8_a97771f5-bcbe-42d8-bdd8-41b43f8899a0/horizon/0.log" Oct 12 07:19:08 crc kubenswrapper[4930]: I1012 07:19:08.790198 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-nj5k9_91938dad-6cac-4246-94d7-d93214ae2a5d/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:19:09 crc kubenswrapper[4930]: I1012 07:19:09.026535 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-pckm6_f64044f7-a939-48e9-a986-a3a39f4d1a4a/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:19:09 crc kubenswrapper[4930]: I1012 07:19:09.214046 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29337481-hzx9c_f8e371c3-204c-4a74-8c6f-49c25f6b7e90/keystone-cron/0.log" Oct 12 07:19:09 crc kubenswrapper[4930]: I1012 07:19:09.282819 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6d76466876-jf9t8_a97771f5-bcbe-42d8-bdd8-41b43f8899a0/horizon-log/0.log" Oct 12 07:19:09 crc kubenswrapper[4930]: I1012 07:19:09.416938 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29337541-5db6s_f9a44177-8b21-4b9b-8367-bdb45a60f379/keystone-cron/0.log" Oct 12 07:19:09 crc kubenswrapper[4930]: I1012 07:19:09.493231 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_8a5cf183-2e6a-408d-9baa-2f43f7b7b354/kube-state-metrics/0.log" Oct 12 07:19:09 crc kubenswrapper[4930]: I1012 07:19:09.616728 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-546b85cb56-ln9lt_456742ca-6f3a-485a-81ee-2a4d84df38c8/keystone-api/0.log" Oct 12 07:19:09 crc kubenswrapper[4930]: I1012 07:19:09.797212 4930 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-4dbjr_1c3c7557-a115-43bb-9147-7faf17337317/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:19:10 crc kubenswrapper[4930]: I1012 07:19:10.226519 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-68ch8_31615b46-4290-46db-993e-3e5afa29c3f6/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:19:10 crc kubenswrapper[4930]: I1012 07:19:10.234395 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6844c9655c-rvdcz_532bae95-f9fc-4633-b53c-2f398cbb8bd2/neutron-httpd/0.log" Oct 12 07:19:10 crc kubenswrapper[4930]: I1012 07:19:10.314179 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6844c9655c-rvdcz_532bae95-f9fc-4633-b53c-2f398cbb8bd2/neutron-api/0.log" Oct 12 07:19:10 crc kubenswrapper[4930]: I1012 07:19:10.894310 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_534efb5a-d958-48db-9d8d-1a49091be4de/nova-cell0-conductor-conductor/0.log" Oct 12 07:19:11 crc kubenswrapper[4930]: I1012 07:19:11.218904 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_e6981a83-f891-4520-8602-a51b9132dbfa/nova-cell1-conductor-conductor/0.log" Oct 12 07:19:11 crc kubenswrapper[4930]: I1012 07:19:11.631976 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_3e40e826-773b-46e2-aa5d-d1efe925bf9f/nova-cell1-novncproxy-novncproxy/0.log" Oct 12 07:19:11 crc kubenswrapper[4930]: I1012 07:19:11.724062 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-t8xlg_e6afdef5-2b76-470f-9fb1-a98ae115072a/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:19:11 crc kubenswrapper[4930]: I1012 07:19:11.843288 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5709234d-7700-462e-9fa6-7e4f09bd0d91/nova-api-log/0.log" Oct 12 07:19:12 crc kubenswrapper[4930]: I1012 07:19:12.113264 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139/nova-metadata-log/0.log" Oct 12 07:19:12 crc kubenswrapper[4930]: I1012 07:19:12.273041 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5709234d-7700-462e-9fa6-7e4f09bd0d91/nova-api-api/0.log" Oct 12 07:19:12 crc kubenswrapper[4930]: I1012 07:19:12.470691 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5/mysql-bootstrap/0.log" Oct 12 07:19:12 crc kubenswrapper[4930]: I1012 07:19:12.674226 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_58a1e3ca-ad7a-49ac-8129-33ba2953d881/nova-scheduler-scheduler/0.log" Oct 12 07:19:12 crc kubenswrapper[4930]: I1012 07:19:12.701557 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5/mysql-bootstrap/0.log" Oct 12 07:19:12 crc kubenswrapper[4930]: I1012 07:19:12.724707 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_46dfc5e5-80f4-49c8-bd9d-885e5dcb3fe5/galera/0.log" Oct 12 07:19:12 crc kubenswrapper[4930]: I1012 07:19:12.926563 4930 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_e52fde5b-22df-4fea-ae39-2bb2ef6fa033/mysql-bootstrap/0.log" Oct 12 07:19:13 crc kubenswrapper[4930]: I1012 07:19:13.103114 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e52fde5b-22df-4fea-ae39-2bb2ef6fa033/galera/0.log" Oct 12 07:19:13 crc kubenswrapper[4930]: I1012 07:19:13.127393 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e52fde5b-22df-4fea-ae39-2bb2ef6fa033/mysql-bootstrap/0.log" Oct 12 07:19:13 crc kubenswrapper[4930]: I1012 07:19:13.298673 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_1975f875-9e09-4d30-b5d4-2e883f13781b/openstackclient/0.log" Oct 12 07:19:13 crc kubenswrapper[4930]: I1012 07:19:13.440318 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-kpwjv_78fdc19a-5689-461a-89da-3054932b88c3/openstack-network-exporter/0.log" Oct 12 07:19:13 crc kubenswrapper[4930]: I1012 07:19:13.811533 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-nw5dm_ddccae59-8916-4bd7-bffa-041cf574e89e/ovn-controller/0.log" Oct 12 07:19:13 crc kubenswrapper[4930]: I1012 07:19:13.887810 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hjqfl_382036ff-8896-494d-9670-ec527019676f/ovsdb-server-init/0.log" Oct 12 07:19:14 crc kubenswrapper[4930]: I1012 07:19:14.103124 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hjqfl_382036ff-8896-494d-9670-ec527019676f/ovsdb-server/0.log" Oct 12 07:19:14 crc kubenswrapper[4930]: I1012 07:19:14.104299 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hjqfl_382036ff-8896-494d-9670-ec527019676f/ovsdb-server-init/0.log" Oct 12 07:19:14 crc kubenswrapper[4930]: I1012 07:19:14.363268 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-h52cl_1d14f488-c958-4022-b1db-a1161afad246/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:19:14 crc kubenswrapper[4930]: I1012 07:19:14.425297 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0b0fcd1f-00b5-41e2-85e9-b1fd08dcf139/nova-metadata-metadata/0.log" Oct 12 07:19:14 crc kubenswrapper[4930]: I1012 07:19:14.450491 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hjqfl_382036ff-8896-494d-9670-ec527019676f/ovs-vswitchd/0.log" Oct 12 07:19:14 crc kubenswrapper[4930]: I1012 07:19:14.566853 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b4580a3d-faab-45c2-a7a6-ef2802549ef9/openstack-network-exporter/0.log" Oct 12 07:19:14 crc kubenswrapper[4930]: I1012 07:19:14.625826 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b4580a3d-faab-45c2-a7a6-ef2802549ef9/ovn-northd/0.log" Oct 12 07:19:14 crc kubenswrapper[4930]: I1012 07:19:14.738231 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_94bfaf3d-7abe-446f-b5ca-a359c65039b9/openstack-network-exporter/0.log" Oct 12 07:19:14 crc kubenswrapper[4930]: I1012 07:19:14.792716 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_94bfaf3d-7abe-446f-b5ca-a359c65039b9/ovsdbserver-nb/0.log" Oct 12 07:19:14 crc kubenswrapper[4930]: I1012 07:19:14.889798 4930 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_acd687f2-88b8-4750-9f1a-ba8fa345e290/openstack-network-exporter/0.log" Oct 12 07:19:14 crc kubenswrapper[4930]: I1012 07:19:14.914362 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_acd687f2-88b8-4750-9f1a-ba8fa345e290/ovsdbserver-sb/0.log" Oct 12 07:19:15 crc kubenswrapper[4930]: I1012 07:19:15.288505 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_4a86a13a-7ebf-4fba-b066-f4c1ff705ffd/init-config-reloader/0.log" Oct 12 07:19:15 crc kubenswrapper[4930]: I1012 07:19:15.308481 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-544d94f45b-79l8m_8cfa2a2e-ac4f-415b-9dd2-dabf059ad679/placement-api/0.log" Oct 12 07:19:15 crc kubenswrapper[4930]: I1012 07:19:15.337970 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-544d94f45b-79l8m_8cfa2a2e-ac4f-415b-9dd2-dabf059ad679/placement-log/0.log" Oct 12 07:19:15 crc kubenswrapper[4930]: I1012 07:19:15.481982 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_4a86a13a-7ebf-4fba-b066-f4c1ff705ffd/config-reloader/0.log" Oct 12 07:19:15 crc kubenswrapper[4930]: I1012 07:19:15.519438 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_4a86a13a-7ebf-4fba-b066-f4c1ff705ffd/init-config-reloader/0.log" Oct 12 07:19:15 crc kubenswrapper[4930]: I1012 07:19:15.561357 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_4a86a13a-7ebf-4fba-b066-f4c1ff705ffd/prometheus/0.log" Oct 12 07:19:15 crc kubenswrapper[4930]: I1012 07:19:15.633831 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_4a86a13a-7ebf-4fba-b066-f4c1ff705ffd/thanos-sidecar/0.log" Oct 12 07:19:15 crc kubenswrapper[4930]: I1012 07:19:15.742105 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_62c5d71d-6283-44b4-9b50-96fd50d7ad99/setup-container/0.log" Oct 12 07:19:15 crc kubenswrapper[4930]: I1012 07:19:15.960453 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_62c5d71d-6283-44b4-9b50-96fd50d7ad99/rabbitmq/0.log" Oct 12 07:19:15 crc kubenswrapper[4930]: I1012 07:19:15.977164 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_1594846a-5c2f-49f8-9bea-22661720c5a6/setup-container/0.log" Oct 12 07:19:16 crc kubenswrapper[4930]: I1012 07:19:16.004691 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_62c5d71d-6283-44b4-9b50-96fd50d7ad99/setup-container/0.log" Oct 12 07:19:16 crc kubenswrapper[4930]: I1012 07:19:16.201724 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_1594846a-5c2f-49f8-9bea-22661720c5a6/rabbitmq/0.log" Oct 12 07:19:16 crc kubenswrapper[4930]: I1012 07:19:16.242917 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_1594846a-5c2f-49f8-9bea-22661720c5a6/setup-container/0.log" Oct 12 07:19:16 crc kubenswrapper[4930]: I1012 07:19:16.270603 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d/setup-container/0.log" Oct 12 07:19:16 crc kubenswrapper[4930]: I1012 07:19:16.495595 4930 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-bgltt_5afc7c35-49e0-45d7-a3fd-ab6584abe8a7/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:19:16 crc kubenswrapper[4930]: I1012 07:19:16.534648 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d/setup-container/0.log" Oct 12 07:19:16 crc kubenswrapper[4930]: I1012 07:19:16.535182 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cb65a4f8-39a7-4b20-b1cd-a077fff7ad8d/rabbitmq/0.log" Oct 12 07:19:16 crc kubenswrapper[4930]: I1012 07:19:16.729704 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-rj48r_4f819c97-4853-42ee-ac71-a252fefe38c5/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:19:16 crc kubenswrapper[4930]: I1012 07:19:16.796302 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-lw8bg_6b8c39cb-bec0-49b4-a4bb-5949e695db04/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:19:17 crc kubenswrapper[4930]: I1012 07:19:17.023013 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-xksvf_0b9f3033-c01f-46bf-9d12-3e60310ec6f3/ssh-known-hosts-edpm-deployment/0.log" Oct 12 07:19:17 crc kubenswrapper[4930]: I1012 07:19:17.044407 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-nnxtv_d23402d4-c8f9-4e04-9972-f40037dadec9/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:19:17 crc kubenswrapper[4930]: I1012 07:19:17.193429 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_76348a63-90b8-46b5-8856-da5c983b6d72/memcached/0.log" Oct 12 07:19:17 crc kubenswrapper[4930]: I1012 07:19:17.388305 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7d8c9db847-bqfrb_4ed14594-beb5-4ce3-bf04-4a9299a932be/proxy-server/0.log" Oct 12 07:19:17 crc kubenswrapper[4930]: I1012 07:19:17.507953 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-xldxb_df839ac4-27ff-436b-b328-55b948887fce/swift-ring-rebalance/0.log" Oct 12 07:19:17 crc kubenswrapper[4930]: I1012 07:19:17.534162 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7d8c9db847-bqfrb_4ed14594-beb5-4ce3-bf04-4a9299a932be/proxy-httpd/0.log" Oct 12 07:19:17 crc kubenswrapper[4930]: I1012 07:19:17.604643 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/account-auditor/0.log" Oct 12 07:19:17 crc kubenswrapper[4930]: I1012 07:19:17.699304 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/account-reaper/0.log" Oct 12 07:19:17 crc kubenswrapper[4930]: I1012 07:19:17.730725 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/account-replicator/0.log" Oct 12 07:19:17 crc kubenswrapper[4930]: I1012 07:19:17.761800 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/account-server/0.log" Oct 12 07:19:17 crc kubenswrapper[4930]: I1012 07:19:17.799567 4930 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/container-auditor/0.log" Oct 12 07:19:17 crc kubenswrapper[4930]: I1012 07:19:17.854023 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/container-replicator/0.log" Oct 12 07:19:17 crc kubenswrapper[4930]: I1012 07:19:17.915402 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/container-updater/0.log" Oct 12 07:19:17 crc kubenswrapper[4930]: I1012 07:19:17.930870 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/container-server/0.log" Oct 12 07:19:17 crc kubenswrapper[4930]: I1012 07:19:17.961633 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/object-auditor/0.log" Oct 12 07:19:17 crc kubenswrapper[4930]: I1012 07:19:17.989694 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/object-expirer/0.log" Oct 12 07:19:18 crc kubenswrapper[4930]: I1012 07:19:18.089603 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/object-replicator/0.log" Oct 12 07:19:18 crc kubenswrapper[4930]: I1012 07:19:18.126158 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/object-server/0.log" Oct 12 07:19:18 crc kubenswrapper[4930]: I1012 07:19:18.144267 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/object-updater/0.log" Oct 12 07:19:18 crc kubenswrapper[4930]: I1012 07:19:18.169648 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/rsync/0.log" Oct 12 07:19:18 crc kubenswrapper[4930]: I1012 07:19:18.177443 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_073f287d-dbf2-4e8f-ac91-0e9d7c7a0ce3/swift-recon-cron/0.log" Oct 12 07:19:18 crc kubenswrapper[4930]: I1012 07:19:18.407194 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_acf9e824-abb4-4b1c-9925-c7794fafaad4/tempest-tests-tempest-tests-runner/0.log" Oct 12 07:19:18 crc kubenswrapper[4930]: I1012 07:19:18.414856 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-n5xlj_fa0905ab-f3dc-41c6-b517-9f9ac23d7adc/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:19:18 crc kubenswrapper[4930]: I1012 07:19:18.594016 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-w8fp5_33c2f765-a7aa-4d04-87b3-2f8483e4623a/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 12 07:19:18 crc kubenswrapper[4930]: I1012 07:19:18.600648 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_da6596b0-b6eb-4af2-97dd-7bda46883284/test-operator-logs-container/0.log" Oct 12 07:19:19 crc kubenswrapper[4930]: I1012 07:19:19.291623 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_079031ef-591d-44a8-9a65-fdc0eaea1a0d/watcher-applier/0.log" Oct 12 07:19:19 crc 
kubenswrapper[4930]: I1012 07:19:19.716724 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_eb68a1f2-d5d6-4fea-b29a-bc253bfc919d/watcher-api-log/0.log" Oct 12 07:19:21 crc kubenswrapper[4930]: I1012 07:19:21.479575 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_220ebc1c-6f2b-4beb-8a34-339ba62a484f/watcher-decision-engine/0.log" Oct 12 07:19:22 crc kubenswrapper[4930]: I1012 07:19:22.417761 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_eb68a1f2-d5d6-4fea-b29a-bc253bfc919d/watcher-api/0.log" Oct 12 07:19:33 crc kubenswrapper[4930]: I1012 07:19:33.669816 4930 patch_prober.go:28] interesting pod/machine-config-daemon-mk4tf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 12 07:19:33 crc kubenswrapper[4930]: I1012 07:19:33.670352 4930 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 12 07:19:33 crc kubenswrapper[4930]: I1012 07:19:33.670400 4930 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" Oct 12 07:19:33 crc kubenswrapper[4930]: I1012 07:19:33.671223 4930 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19"} pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 12 07:19:33 crc kubenswrapper[4930]: I1012 07:19:33.671290 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerName="machine-config-daemon" containerID="cri-o://0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19" gracePeriod=600 Oct 12 07:19:33 crc kubenswrapper[4930]: E1012 07:19:33.809901 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:19:34 crc kubenswrapper[4930]: I1012 07:19:34.793702 4930 generic.go:334] "Generic (PLEG): container finished" podID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19" exitCode=0 Oct 12 07:19:34 crc kubenswrapper[4930]: I1012 07:19:34.793817 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerDied","Data":"0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19"} Oct 12 07:19:34 crc kubenswrapper[4930]: I1012 
07:19:34.794046 4930 scope.go:117] "RemoveContainer" containerID="fadbe983b9d7f8cf56d93bde4ead838c4a3a392820ff85b2d655d6e5fed57ddf" Oct 12 07:19:34 crc kubenswrapper[4930]: I1012 07:19:34.794943 4930 scope.go:117] "RemoveContainer" containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19" Oct 12 07:19:34 crc kubenswrapper[4930]: E1012 07:19:34.795403 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:19:44 crc kubenswrapper[4930]: I1012 07:19:44.249249 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-658bdf4b74-vf2tl_9e6cd80c-4aa5-40de-81fc-10d0329f5481/kube-rbac-proxy/0.log" Oct 12 07:19:44 crc kubenswrapper[4930]: I1012 07:19:44.282815 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-658bdf4b74-vf2tl_9e6cd80c-4aa5-40de-81fc-10d0329f5481/manager/0.log" Oct 12 07:19:44 crc kubenswrapper[4930]: I1012 07:19:44.423483 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87_84bb56e1-6ddf-439a-9889-f2e1973ad8fa/util/0.log" Oct 12 07:19:44 crc kubenswrapper[4930]: I1012 07:19:44.566640 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87_84bb56e1-6ddf-439a-9889-f2e1973ad8fa/util/0.log" Oct 12 07:19:44 crc kubenswrapper[4930]: I1012 07:19:44.606489 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87_84bb56e1-6ddf-439a-9889-f2e1973ad8fa/pull/0.log" Oct 12 07:19:44 crc kubenswrapper[4930]: I1012 07:19:44.609488 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87_84bb56e1-6ddf-439a-9889-f2e1973ad8fa/pull/0.log" Oct 12 07:19:44 crc kubenswrapper[4930]: I1012 07:19:44.725677 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87_84bb56e1-6ddf-439a-9889-f2e1973ad8fa/util/0.log" Oct 12 07:19:44 crc kubenswrapper[4930]: I1012 07:19:44.759091 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87_84bb56e1-6ddf-439a-9889-f2e1973ad8fa/pull/0.log" Oct 12 07:19:44 crc kubenswrapper[4930]: I1012 07:19:44.774188 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbf55ab9b6da9dfde4a224fc1e3f049ee7cb6cab839422fb52a09a365bpcx87_84bb56e1-6ddf-439a-9889-f2e1973ad8fa/extract/0.log" Oct 12 07:19:44 crc kubenswrapper[4930]: I1012 07:19:44.909130 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7b7fb68549-2p7m6_5a7f54b7-1891-4e3a-a768-e937269bd384/kube-rbac-proxy/0.log" Oct 12 07:19:44 crc kubenswrapper[4930]: I1012 07:19:44.940343 4930 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7b7fb68549-2p7m6_5a7f54b7-1891-4e3a-a768-e937269bd384/manager/0.log" Oct 12 07:19:44 crc kubenswrapper[4930]: I1012 07:19:44.999394 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-85d5d9dd78-8vdcg_f883b7e0-65a4-4cea-9d02-7cdedfaf9ec2/kube-rbac-proxy/0.log" Oct 12 07:19:45 crc kubenswrapper[4930]: I1012 07:19:45.102261 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-85d5d9dd78-8vdcg_f883b7e0-65a4-4cea-9d02-7cdedfaf9ec2/manager/0.log" Oct 12 07:19:45 crc kubenswrapper[4930]: I1012 07:19:45.135362 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84b9b84486-ckctw_93426c54-3448-421e-aa85-b03c466c7bf8/kube-rbac-proxy/0.log" Oct 12 07:19:45 crc kubenswrapper[4930]: I1012 07:19:45.232320 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84b9b84486-ckctw_93426c54-3448-421e-aa85-b03c466c7bf8/manager/0.log" Oct 12 07:19:45 crc kubenswrapper[4930]: I1012 07:19:45.334717 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-858f76bbdd-2fq48_25276148-1b95-4b4d-9f18-ef97020632a7/kube-rbac-proxy/0.log" Oct 12 07:19:45 crc kubenswrapper[4930]: I1012 07:19:45.335501 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-858f76bbdd-2fq48_25276148-1b95-4b4d-9f18-ef97020632a7/manager/0.log" Oct 12 07:19:45 crc kubenswrapper[4930]: I1012 07:19:45.447582 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7ffbcb7588-zh959_ee89a0b0-868b-4b2e-a274-c5a4ee40a872/kube-rbac-proxy/0.log" Oct 12 07:19:45 crc kubenswrapper[4930]: I1012 07:19:45.504260 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7ffbcb7588-zh959_ee89a0b0-868b-4b2e-a274-c5a4ee40a872/manager/0.log" Oct 12 07:19:45 crc kubenswrapper[4930]: I1012 07:19:45.623872 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-656bcbd775-wgp8z_df7a25ba-c240-4d05-a117-0040e24bb33c/kube-rbac-proxy/0.log" Oct 12 07:19:45 crc kubenswrapper[4930]: I1012 07:19:45.753249 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9c5c78d49-ksz2s_adf9f01b-70b6-46b9-acde-c1eedc16f299/kube-rbac-proxy/0.log" Oct 12 07:19:45 crc kubenswrapper[4930]: I1012 07:19:45.812856 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-656bcbd775-wgp8z_df7a25ba-c240-4d05-a117-0040e24bb33c/manager/0.log" Oct 12 07:19:45 crc kubenswrapper[4930]: I1012 07:19:45.909512 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9c5c78d49-ksz2s_adf9f01b-70b6-46b9-acde-c1eedc16f299/manager/0.log" Oct 12 07:19:45 crc kubenswrapper[4930]: I1012 07:19:45.956903 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55b6b7c7b8-wk6qh_fe5f36d2-82b4-4bce-a189-7844dae5dc0e/kube-rbac-proxy/0.log" Oct 12 07:19:46 crc kubenswrapper[4930]: I1012 07:19:46.102362 4930 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55b6b7c7b8-wk6qh_fe5f36d2-82b4-4bce-a189-7844dae5dc0e/manager/0.log" Oct 12 07:19:46 crc kubenswrapper[4930]: I1012 07:19:46.135701 4930 scope.go:117] "RemoveContainer" containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19" Oct 12 07:19:46 crc kubenswrapper[4930]: E1012 07:19:46.136087 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:19:46 crc kubenswrapper[4930]: I1012 07:19:46.148967 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5f67fbc655-ql78t_fe819f44-6224-4b45-a33c-6b6ef8e73b92/manager/0.log" Oct 12 07:19:46 crc kubenswrapper[4930]: I1012 07:19:46.154331 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5f67fbc655-ql78t_fe819f44-6224-4b45-a33c-6b6ef8e73b92/kube-rbac-proxy/0.log" Oct 12 07:19:46 crc kubenswrapper[4930]: I1012 07:19:46.282077 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f9fb45f8f-qcc9w_37d1af03-8709-4b4a-8d4c-bda1dbefff59/kube-rbac-proxy/0.log" Oct 12 07:19:46 crc kubenswrapper[4930]: I1012 07:19:46.323725 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f9fb45f8f-qcc9w_37d1af03-8709-4b4a-8d4c-bda1dbefff59/manager/0.log" Oct 12 07:19:46 crc kubenswrapper[4930]: I1012 07:19:46.473116 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-79d585cb66-2brdt_7134f9eb-cfa6-41b8-a245-2f1b17669ca4/kube-rbac-proxy/0.log" Oct 12 07:19:46 crc kubenswrapper[4930]: I1012 07:19:46.520509 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-79d585cb66-2brdt_7134f9eb-cfa6-41b8-a245-2f1b17669ca4/manager/0.log" Oct 12 07:19:46 crc kubenswrapper[4930]: I1012 07:19:46.560913 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5df598886f-hx224_510d5f0a-5f67-4171-99e6-1de6734e7bdf/kube-rbac-proxy/0.log" Oct 12 07:19:46 crc kubenswrapper[4930]: I1012 07:19:46.725350 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5df598886f-hx224_510d5f0a-5f67-4171-99e6-1de6734e7bdf/manager/0.log" Oct 12 07:19:46 crc kubenswrapper[4930]: I1012 07:19:46.739036 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69fdcfc5f5-vn5dq_5383069b-8f72-4173-97bf-34ffc36c235e/manager/0.log" Oct 12 07:19:46 crc kubenswrapper[4930]: I1012 07:19:46.752233 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69fdcfc5f5-vn5dq_5383069b-8f72-4173-97bf-34ffc36c235e/kube-rbac-proxy/0.log" Oct 12 07:19:46 crc kubenswrapper[4930]: I1012 07:19:46.918705 4930 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5956dffb7bwrr99_20b56dea-8d10-4b11-b437-fc38320417c9/manager/0.log" Oct 12 07:19:46 crc kubenswrapper[4930]: I1012 07:19:46.937185 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5956dffb7bwrr99_20b56dea-8d10-4b11-b437-fc38320417c9/kube-rbac-proxy/0.log" Oct 12 07:19:47 crc kubenswrapper[4930]: I1012 07:19:47.074813 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b95c8954b-46gk6_f89b4da4-a74f-4f12-b056-05f201bedabd/kube-rbac-proxy/0.log" Oct 12 07:19:47 crc kubenswrapper[4930]: I1012 07:19:47.245306 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-688d597459-t5cps_d0de84bb-9b0d-4552-887d-1da3d50467e2/kube-rbac-proxy/0.log" Oct 12 07:19:47 crc kubenswrapper[4930]: I1012 07:19:47.390077 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-688d597459-t5cps_d0de84bb-9b0d-4552-887d-1da3d50467e2/operator/0.log" Oct 12 07:19:47 crc kubenswrapper[4930]: I1012 07:19:47.562626 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-6prjm_2214be78-67f7-4965-b3a0-8c1401ff658c/registry-server/0.log" Oct 12 07:19:47 crc kubenswrapper[4930]: I1012 07:19:47.569344 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-79df5fb58c-2hx92_8ac93070-497c-48ca-a58a-fb47657e6c2a/kube-rbac-proxy/0.log" Oct 12 07:19:47 crc kubenswrapper[4930]: I1012 07:19:47.691388 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-79df5fb58c-2hx92_8ac93070-497c-48ca-a58a-fb47657e6c2a/manager/0.log" Oct 12 07:19:47 crc kubenswrapper[4930]: I1012 07:19:47.787164 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-68b6c87b68-l2x8t_97b9da7d-7c47-46b5-ac4a-f190a92dceef/kube-rbac-proxy/0.log" Oct 12 07:19:47 crc kubenswrapper[4930]: I1012 07:19:47.831849 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-68b6c87b68-l2x8t_97b9da7d-7c47-46b5-ac4a-f190a92dceef/manager/0.log" Oct 12 07:19:48 crc kubenswrapper[4930]: I1012 07:19:48.070150 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-db6d7f97b-862f6_27d33d74-f1d1-4208-aead-8f6091c524df/kube-rbac-proxy/0.log" Oct 12 07:19:48 crc kubenswrapper[4930]: I1012 07:19:48.097822 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-5vrjw_7063a11c-2d85-47f0-85ca-61cf4949e10d/operator/0.log" Oct 12 07:19:48 crc kubenswrapper[4930]: I1012 07:19:48.167152 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-db6d7f97b-862f6_27d33d74-f1d1-4208-aead-8f6091c524df/manager/0.log" Oct 12 07:19:48 crc kubenswrapper[4930]: I1012 07:19:48.289238 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b95c8954b-46gk6_f89b4da4-a74f-4f12-b056-05f201bedabd/manager/0.log" Oct 12 07:19:48 crc kubenswrapper[4930]: I1012 07:19:48.300756 4930 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-67cfc6749b-wk5xm_89d21577-9d93-4003-bae1-3b66e679eeeb/kube-rbac-proxy/0.log" Oct 12 07:19:48 crc kubenswrapper[4930]: I1012 07:19:48.502093 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5458f77c4-52242_18286f44-9f6c-4699-9d4d-afa069c980ed/manager/0.log" Oct 12 07:19:48 crc kubenswrapper[4930]: I1012 07:19:48.502852 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-67cfc6749b-wk5xm_89d21577-9d93-4003-bae1-3b66e679eeeb/manager/0.log" Oct 12 07:19:48 crc kubenswrapper[4930]: I1012 07:19:48.507857 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5458f77c4-52242_18286f44-9f6c-4699-9d4d-afa069c980ed/kube-rbac-proxy/0.log" Oct 12 07:19:48 crc kubenswrapper[4930]: I1012 07:19:48.630072 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7f554bff7b-mkzqf_7498c8de-da98-47fd-8096-58cb4f1c4f87/kube-rbac-proxy/0.log" Oct 12 07:19:48 crc kubenswrapper[4930]: I1012 07:19:48.755032 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7f554bff7b-mkzqf_7498c8de-da98-47fd-8096-58cb4f1c4f87/manager/0.log" Oct 12 07:19:58 crc kubenswrapper[4930]: I1012 07:19:58.135348 4930 scope.go:117] "RemoveContainer" containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19" Oct 12 07:19:58 crc kubenswrapper[4930]: E1012 07:19:58.136304 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:20:04 crc kubenswrapper[4930]: I1012 07:20:04.880254 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-6r5ch_4af4dbea-ada1-465b-a6d9-24843bf3808d/control-plane-machine-set-operator/0.log" Oct 12 07:20:05 crc kubenswrapper[4930]: I1012 07:20:05.064638 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-zxqcl_54c34701-7d60-4396-9a64-81b91379fbe9/machine-api-operator/0.log" Oct 12 07:20:05 crc kubenswrapper[4930]: I1012 07:20:05.101755 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-zxqcl_54c34701-7d60-4396-9a64-81b91379fbe9/kube-rbac-proxy/0.log" Oct 12 07:20:10 crc kubenswrapper[4930]: I1012 07:20:10.135916 4930 scope.go:117] "RemoveContainer" containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19" Oct 12 07:20:10 crc kubenswrapper[4930]: E1012 07:20:10.136678 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" 
podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:20:18 crc kubenswrapper[4930]: I1012 07:20:18.075613 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-wvrgv_e7b9505d-6b6f-4460-9f40-119155e4ba33/cert-manager-controller/0.log" Oct 12 07:20:18 crc kubenswrapper[4930]: I1012 07:20:18.243757 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-v95v2_b488879d-5b6c-438f-8b60-842f84b05028/cert-manager-cainjector/0.log" Oct 12 07:20:18 crc kubenswrapper[4930]: I1012 07:20:18.330963 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-6ptlt_a45f8116-ddf1-4733-bfb2-9fd498a04620/cert-manager-webhook/0.log" Oct 12 07:20:25 crc kubenswrapper[4930]: I1012 07:20:25.136310 4930 scope.go:117] "RemoveContainer" containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19" Oct 12 07:20:25 crc kubenswrapper[4930]: E1012 07:20:25.137482 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:20:31 crc kubenswrapper[4930]: I1012 07:20:31.194622 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-54mmw_0d4f3b47-0a65-4b2d-837d-9e9e9efab38e/nmstate-console-plugin/0.log" Oct 12 07:20:31 crc kubenswrapper[4930]: I1012 07:20:31.366926 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-mngll_c75f4df3-9537-4a4b-9170-bee951bdb162/nmstate-handler/0.log" Oct 12 07:20:31 crc kubenswrapper[4930]: I1012 07:20:31.413144 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-gtfdw_2cb896f0-cc6a-44bd-ae9e-801659d154f4/kube-rbac-proxy/0.log" Oct 12 07:20:31 crc kubenswrapper[4930]: I1012 07:20:31.426920 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-gtfdw_2cb896f0-cc6a-44bd-ae9e-801659d154f4/nmstate-metrics/0.log" Oct 12 07:20:31 crc kubenswrapper[4930]: I1012 07:20:31.573355 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-gznzl_0920866d-8bd7-49ce-81f7-5aa6b77e9198/nmstate-operator/0.log" Oct 12 07:20:31 crc kubenswrapper[4930]: I1012 07:20:31.594886 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-k92gs_9a8ee773-dcea-477c-98ea-afc18295b1a0/nmstate-webhook/0.log" Oct 12 07:20:37 crc kubenswrapper[4930]: I1012 07:20:37.134960 4930 scope.go:117] "RemoveContainer" containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19" Oct 12 07:20:37 crc kubenswrapper[4930]: E1012 07:20:37.136425 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 
12 07:20:46 crc kubenswrapper[4930]: I1012 07:20:46.261558 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-7sgrd_908d6f68-9800-4794-88a0-21cca2bb3691/kube-rbac-proxy/0.log" Oct 12 07:20:46 crc kubenswrapper[4930]: I1012 07:20:46.365703 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-7sgrd_908d6f68-9800-4794-88a0-21cca2bb3691/controller/0.log" Oct 12 07:20:46 crc kubenswrapper[4930]: I1012 07:20:46.481808 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/cp-frr-files/0.log" Oct 12 07:20:46 crc kubenswrapper[4930]: I1012 07:20:46.615411 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/cp-frr-files/0.log" Oct 12 07:20:46 crc kubenswrapper[4930]: I1012 07:20:46.638948 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/cp-metrics/0.log" Oct 12 07:20:46 crc kubenswrapper[4930]: I1012 07:20:46.643900 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/cp-reloader/0.log" Oct 12 07:20:46 crc kubenswrapper[4930]: I1012 07:20:46.716189 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/cp-reloader/0.log" Oct 12 07:20:46 crc kubenswrapper[4930]: I1012 07:20:46.804446 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/cp-frr-files/0.log" Oct 12 07:20:46 crc kubenswrapper[4930]: I1012 07:20:46.850727 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/cp-metrics/0.log" Oct 12 07:20:46 crc kubenswrapper[4930]: I1012 07:20:46.871795 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/cp-reloader/0.log" Oct 12 07:20:46 crc kubenswrapper[4930]: I1012 07:20:46.894031 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/cp-metrics/0.log" Oct 12 07:20:47 crc kubenswrapper[4930]: I1012 07:20:47.040272 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/cp-frr-files/0.log" Oct 12 07:20:47 crc kubenswrapper[4930]: I1012 07:20:47.070573 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/controller/0.log" Oct 12 07:20:47 crc kubenswrapper[4930]: I1012 07:20:47.074671 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/cp-metrics/0.log" Oct 12 07:20:47 crc kubenswrapper[4930]: I1012 07:20:47.081092 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/cp-reloader/0.log" Oct 12 07:20:47 crc kubenswrapper[4930]: I1012 07:20:47.236628 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/frr-metrics/0.log" Oct 12 07:20:47 crc kubenswrapper[4930]: I1012 07:20:47.277531 4930 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/kube-rbac-proxy/0.log" Oct 12 07:20:47 crc kubenswrapper[4930]: I1012 07:20:47.290697 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/kube-rbac-proxy-frr/0.log" Oct 12 07:20:47 crc kubenswrapper[4930]: I1012 07:20:47.413201 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/reloader/0.log" Oct 12 07:20:47 crc kubenswrapper[4930]: I1012 07:20:47.508049 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-b5w72_610c85e1-ab61-4938-a33d-cbed5a873874/frr-k8s-webhook-server/0.log" Oct 12 07:20:47 crc kubenswrapper[4930]: I1012 07:20:47.713398 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7ccf4dfd66-4nwxw_8ccb343f-6987-45c8-9b9f-a2bb32efbe22/manager/0.log" Oct 12 07:20:47 crc kubenswrapper[4930]: I1012 07:20:47.778603 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-688d4f8df8-dh759_726977f0-7624-4ce2-95ed-80f844fcfc81/webhook-server/0.log" Oct 12 07:20:47 crc kubenswrapper[4930]: I1012 07:20:47.981937 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dd89b_61b8c651-74f8-4532-9554-55b01e58c0e6/kube-rbac-proxy/0.log" Oct 12 07:20:48 crc kubenswrapper[4930]: I1012 07:20:48.435233 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dd89b_61b8c651-74f8-4532-9554-55b01e58c0e6/speaker/0.log" Oct 12 07:20:48 crc kubenswrapper[4930]: I1012 07:20:48.944587 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qv5cn_c9aa3dfb-870d-4246-96a2-4c52c470239f/frr/0.log" Oct 12 07:20:49 crc kubenswrapper[4930]: I1012 07:20:49.136196 4930 scope.go:117] "RemoveContainer" containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19" Oct 12 07:20:49 crc kubenswrapper[4930]: E1012 07:20:49.136488 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:21:00 crc kubenswrapper[4930]: I1012 07:21:00.135209 4930 scope.go:117] "RemoveContainer" containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19" Oct 12 07:21:00 crc kubenswrapper[4930]: E1012 07:21:00.136177 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:21:01 crc kubenswrapper[4930]: I1012 07:21:01.946281 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm_e54ad199-e123-435c-a125-b662c577a86b/util/0.log" Oct 12 07:21:02 crc 
kubenswrapper[4930]: I1012 07:21:02.072188 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm_e54ad199-e123-435c-a125-b662c577a86b/util/0.log" Oct 12 07:21:02 crc kubenswrapper[4930]: I1012 07:21:02.121450 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm_e54ad199-e123-435c-a125-b662c577a86b/pull/0.log" Oct 12 07:21:02 crc kubenswrapper[4930]: I1012 07:21:02.165969 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm_e54ad199-e123-435c-a125-b662c577a86b/pull/0.log" Oct 12 07:21:02 crc kubenswrapper[4930]: I1012 07:21:02.294554 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm_e54ad199-e123-435c-a125-b662c577a86b/util/0.log" Oct 12 07:21:02 crc kubenswrapper[4930]: I1012 07:21:02.317242 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm_e54ad199-e123-435c-a125-b662c577a86b/extract/0.log" Oct 12 07:21:02 crc kubenswrapper[4930]: I1012 07:21:02.325481 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qfplm_e54ad199-e123-435c-a125-b662c577a86b/pull/0.log" Oct 12 07:21:02 crc kubenswrapper[4930]: I1012 07:21:02.456710 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq_f143a61d-685c-4ee5-b95c-f7fae137d0bc/util/0.log" Oct 12 07:21:02 crc kubenswrapper[4930]: I1012 07:21:02.616042 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq_f143a61d-685c-4ee5-b95c-f7fae137d0bc/pull/0.log" Oct 12 07:21:02 crc kubenswrapper[4930]: I1012 07:21:02.625360 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq_f143a61d-685c-4ee5-b95c-f7fae137d0bc/util/0.log" Oct 12 07:21:02 crc kubenswrapper[4930]: I1012 07:21:02.666321 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq_f143a61d-685c-4ee5-b95c-f7fae137d0bc/pull/0.log" Oct 12 07:21:02 crc kubenswrapper[4930]: I1012 07:21:02.785305 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq_f143a61d-685c-4ee5-b95c-f7fae137d0bc/util/0.log" Oct 12 07:21:02 crc kubenswrapper[4930]: I1012 07:21:02.815090 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq_f143a61d-685c-4ee5-b95c-f7fae137d0bc/pull/0.log" Oct 12 07:21:02 crc kubenswrapper[4930]: I1012 07:21:02.833355 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2ddrpfq_f143a61d-685c-4ee5-b95c-f7fae137d0bc/extract/0.log" Oct 12 07:21:02 crc kubenswrapper[4930]: I1012 07:21:02.987522 4930 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-jwl4b_051544d2-1a7f-4cb4-8b3e-c6ffeb23732f/extract-utilities/0.log" Oct 12 07:21:03 crc kubenswrapper[4930]: I1012 07:21:03.146659 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jwl4b_051544d2-1a7f-4cb4-8b3e-c6ffeb23732f/extract-utilities/0.log" Oct 12 07:21:03 crc kubenswrapper[4930]: I1012 07:21:03.148333 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jwl4b_051544d2-1a7f-4cb4-8b3e-c6ffeb23732f/extract-content/0.log" Oct 12 07:21:03 crc kubenswrapper[4930]: I1012 07:21:03.183592 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jwl4b_051544d2-1a7f-4cb4-8b3e-c6ffeb23732f/extract-content/0.log" Oct 12 07:21:03 crc kubenswrapper[4930]: I1012 07:21:03.337322 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jwl4b_051544d2-1a7f-4cb4-8b3e-c6ffeb23732f/extract-utilities/0.log" Oct 12 07:21:03 crc kubenswrapper[4930]: I1012 07:21:03.433373 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jwl4b_051544d2-1a7f-4cb4-8b3e-c6ffeb23732f/extract-content/0.log" Oct 12 07:21:03 crc kubenswrapper[4930]: I1012 07:21:03.529486 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6d7zn_732507c6-d79c-4b3e-a0fa-23111ec6b02b/extract-utilities/0.log" Oct 12 07:21:03 crc kubenswrapper[4930]: I1012 07:21:03.785104 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6d7zn_732507c6-d79c-4b3e-a0fa-23111ec6b02b/extract-content/0.log" Oct 12 07:21:03 crc kubenswrapper[4930]: I1012 07:21:03.817094 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6d7zn_732507c6-d79c-4b3e-a0fa-23111ec6b02b/extract-content/0.log" Oct 12 07:21:03 crc kubenswrapper[4930]: I1012 07:21:03.821289 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6d7zn_732507c6-d79c-4b3e-a0fa-23111ec6b02b/extract-utilities/0.log" Oct 12 07:21:03 crc kubenswrapper[4930]: I1012 07:21:03.939053 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jwl4b_051544d2-1a7f-4cb4-8b3e-c6ffeb23732f/registry-server/0.log" Oct 12 07:21:04 crc kubenswrapper[4930]: I1012 07:21:04.007429 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6d7zn_732507c6-d79c-4b3e-a0fa-23111ec6b02b/extract-content/0.log" Oct 12 07:21:04 crc kubenswrapper[4930]: I1012 07:21:04.027853 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6d7zn_732507c6-d79c-4b3e-a0fa-23111ec6b02b/extract-utilities/0.log" Oct 12 07:21:04 crc kubenswrapper[4930]: I1012 07:21:04.240644 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk_f643d105-d04a-48e4-b956-27fa864a1542/util/0.log" Oct 12 07:21:04 crc kubenswrapper[4930]: I1012 07:21:04.457824 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk_f643d105-d04a-48e4-b956-27fa864a1542/pull/0.log" Oct 12 07:21:04 crc kubenswrapper[4930]: I1012 07:21:04.463529 4930 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk_f643d105-d04a-48e4-b956-27fa864a1542/util/0.log" Oct 12 07:21:04 crc kubenswrapper[4930]: I1012 07:21:04.492126 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk_f643d105-d04a-48e4-b956-27fa864a1542/pull/0.log" Oct 12 07:21:04 crc kubenswrapper[4930]: I1012 07:21:04.667536 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6d7zn_732507c6-d79c-4b3e-a0fa-23111ec6b02b/registry-server/0.log" Oct 12 07:21:04 crc kubenswrapper[4930]: I1012 07:21:04.690219 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk_f643d105-d04a-48e4-b956-27fa864a1542/util/0.log" Oct 12 07:21:04 crc kubenswrapper[4930]: I1012 07:21:04.692913 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk_f643d105-d04a-48e4-b956-27fa864a1542/pull/0.log" Oct 12 07:21:04 crc kubenswrapper[4930]: I1012 07:21:04.726954 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c94dtk_f643d105-d04a-48e4-b956-27fa864a1542/extract/0.log" Oct 12 07:21:04 crc kubenswrapper[4930]: I1012 07:21:04.852668 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-c9wvx_00ab86fc-c038-4dc6-aaf5-6eac7c953d24/marketplace-operator/0.log" Oct 12 07:21:04 crc kubenswrapper[4930]: I1012 07:21:04.885671 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m4lsn_0b85acba-bed0-4e85-8252-3972368d611c/extract-utilities/0.log" Oct 12 07:21:05 crc kubenswrapper[4930]: I1012 07:21:05.080341 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m4lsn_0b85acba-bed0-4e85-8252-3972368d611c/extract-utilities/0.log" Oct 12 07:21:05 crc kubenswrapper[4930]: I1012 07:21:05.097639 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m4lsn_0b85acba-bed0-4e85-8252-3972368d611c/extract-content/0.log" Oct 12 07:21:05 crc kubenswrapper[4930]: I1012 07:21:05.104605 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m4lsn_0b85acba-bed0-4e85-8252-3972368d611c/extract-content/0.log" Oct 12 07:21:05 crc kubenswrapper[4930]: I1012 07:21:05.224924 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m4lsn_0b85acba-bed0-4e85-8252-3972368d611c/extract-utilities/0.log" Oct 12 07:21:05 crc kubenswrapper[4930]: I1012 07:21:05.288281 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m4lsn_0b85acba-bed0-4e85-8252-3972368d611c/extract-content/0.log" Oct 12 07:21:05 crc kubenswrapper[4930]: I1012 07:21:05.304005 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzfd6_d98b1ef4-3cf1-4432-8b03-0c9d7118248d/extract-utilities/0.log" Oct 12 07:21:05 crc kubenswrapper[4930]: I1012 07:21:05.425543 4930 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-m4lsn_0b85acba-bed0-4e85-8252-3972368d611c/registry-server/0.log" Oct 12 07:21:05 crc kubenswrapper[4930]: I1012 07:21:05.443890 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzfd6_d98b1ef4-3cf1-4432-8b03-0c9d7118248d/extract-utilities/0.log" Oct 12 07:21:05 crc kubenswrapper[4930]: I1012 07:21:05.473915 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzfd6_d98b1ef4-3cf1-4432-8b03-0c9d7118248d/extract-content/0.log" Oct 12 07:21:05 crc kubenswrapper[4930]: I1012 07:21:05.497010 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzfd6_d98b1ef4-3cf1-4432-8b03-0c9d7118248d/extract-content/0.log" Oct 12 07:21:05 crc kubenswrapper[4930]: I1012 07:21:05.712838 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzfd6_d98b1ef4-3cf1-4432-8b03-0c9d7118248d/extract-content/0.log" Oct 12 07:21:05 crc kubenswrapper[4930]: I1012 07:21:05.746278 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzfd6_d98b1ef4-3cf1-4432-8b03-0c9d7118248d/extract-utilities/0.log" Oct 12 07:21:06 crc kubenswrapper[4930]: I1012 07:21:06.383072 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tzfd6_d98b1ef4-3cf1-4432-8b03-0c9d7118248d/registry-server/0.log" Oct 12 07:21:11 crc kubenswrapper[4930]: I1012 07:21:11.136994 4930 scope.go:117] "RemoveContainer" containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19" Oct 12 07:21:11 crc kubenswrapper[4930]: E1012 07:21:11.138332 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:21:18 crc kubenswrapper[4930]: I1012 07:21:18.752103 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-v4gns_36fc8dbb-9393-4ad2-a475-7933483eef61/prometheus-operator/0.log" Oct 12 07:21:18 crc kubenswrapper[4930]: I1012 07:21:18.922320 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-78c9d5c958-576bf_1373019b-d435-40a9-8551-11fb23298b48/prometheus-operator-admission-webhook/0.log" Oct 12 07:21:18 crc kubenswrapper[4930]: I1012 07:21:18.991579 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-78c9d5c958-fbq82_828b5980-9511-4284-a5f4-4197242fef19/prometheus-operator-admission-webhook/0.log" Oct 12 07:21:19 crc kubenswrapper[4930]: I1012 07:21:19.090550 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-tfp9w_cd5c9b39-afff-488d-9bc7-875c644a6975/operator/0.log" Oct 12 07:21:19 crc kubenswrapper[4930]: I1012 07:21:19.139533 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-sxjbx_71965cf6-b3c6-4e30-8771-eaad927fcc46/perses-operator/0.log" Oct 12 07:21:25 crc kubenswrapper[4930]: I1012 
07:21:25.136141 4930 scope.go:117] "RemoveContainer" containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19" Oct 12 07:21:25 crc kubenswrapper[4930]: E1012 07:21:25.136858 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:21:29 crc kubenswrapper[4930]: I1012 07:21:29.537888 4930 scope.go:117] "RemoveContainer" containerID="b013507b7386012bc802428e74454ef806ece59306d7a6a9e5925754ec4d71c6" Oct 12 07:21:29 crc kubenswrapper[4930]: I1012 07:21:29.571451 4930 scope.go:117] "RemoveContainer" containerID="23acea8863aeb87398fac2ba8b368643ab8f4adbe90bf9f31e51d4102e26daff" Oct 12 07:21:29 crc kubenswrapper[4930]: I1012 07:21:29.596046 4930 scope.go:117] "RemoveContainer" containerID="d3f1e122d7c2a5375243c65981b42d1a15f9a57c1ccb353c7c440ed04a83385e" Oct 12 07:21:36 crc kubenswrapper[4930]: E1012 07:21:36.945484 4930 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.111:39008->38.102.83.111:46517: write tcp 38.102.83.111:39008->38.102.83.111:46517: write: broken pipe Oct 12 07:21:40 crc kubenswrapper[4930]: I1012 07:21:40.136946 4930 scope.go:117] "RemoveContainer" containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19" Oct 12 07:21:40 crc kubenswrapper[4930]: E1012 07:21:40.138119 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:21:43 crc kubenswrapper[4930]: I1012 07:21:43.070719 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pgxls"] Oct 12 07:21:43 crc kubenswrapper[4930]: E1012 07:21:43.071451 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d00a6732-bb8b-477a-b6f6-9a1d3e37fd62" containerName="container-00" Oct 12 07:21:43 crc kubenswrapper[4930]: I1012 07:21:43.071465 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="d00a6732-bb8b-477a-b6f6-9a1d3e37fd62" containerName="container-00" Oct 12 07:21:43 crc kubenswrapper[4930]: I1012 07:21:43.071708 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="d00a6732-bb8b-477a-b6f6-9a1d3e37fd62" containerName="container-00" Oct 12 07:21:43 crc kubenswrapper[4930]: I1012 07:21:43.073172 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pgxls" Oct 12 07:21:43 crc kubenswrapper[4930]: I1012 07:21:43.093569 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pgxls"] Oct 12 07:21:43 crc kubenswrapper[4930]: I1012 07:21:43.189089 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnzdt\" (UniqueName: \"kubernetes.io/projected/ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1-kube-api-access-nnzdt\") pod \"redhat-marketplace-pgxls\" (UID: \"ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1\") " pod="openshift-marketplace/redhat-marketplace-pgxls" Oct 12 07:21:43 crc kubenswrapper[4930]: I1012 07:21:43.189387 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1-catalog-content\") pod \"redhat-marketplace-pgxls\" (UID: \"ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1\") " pod="openshift-marketplace/redhat-marketplace-pgxls" Oct 12 07:21:43 crc kubenswrapper[4930]: I1012 07:21:43.189470 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1-utilities\") pod \"redhat-marketplace-pgxls\" (UID: \"ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1\") " pod="openshift-marketplace/redhat-marketplace-pgxls" Oct 12 07:21:43 crc kubenswrapper[4930]: I1012 07:21:43.291892 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnzdt\" (UniqueName: \"kubernetes.io/projected/ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1-kube-api-access-nnzdt\") pod \"redhat-marketplace-pgxls\" (UID: \"ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1\") " pod="openshift-marketplace/redhat-marketplace-pgxls" Oct 12 07:21:43 crc kubenswrapper[4930]: I1012 07:21:43.291967 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1-catalog-content\") pod \"redhat-marketplace-pgxls\" (UID: \"ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1\") " pod="openshift-marketplace/redhat-marketplace-pgxls" Oct 12 07:21:43 crc kubenswrapper[4930]: I1012 07:21:43.292093 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1-utilities\") pod \"redhat-marketplace-pgxls\" (UID: \"ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1\") " pod="openshift-marketplace/redhat-marketplace-pgxls" Oct 12 07:21:43 crc kubenswrapper[4930]: I1012 07:21:43.292536 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1-catalog-content\") pod \"redhat-marketplace-pgxls\" (UID: \"ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1\") " pod="openshift-marketplace/redhat-marketplace-pgxls" Oct 12 07:21:43 crc kubenswrapper[4930]: I1012 07:21:43.292915 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1-utilities\") pod \"redhat-marketplace-pgxls\" (UID: \"ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1\") " pod="openshift-marketplace/redhat-marketplace-pgxls" Oct 12 07:21:43 crc kubenswrapper[4930]: I1012 07:21:43.323238 4930 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-nnzdt\" (UniqueName: \"kubernetes.io/projected/ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1-kube-api-access-nnzdt\") pod \"redhat-marketplace-pgxls\" (UID: \"ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1\") " pod="openshift-marketplace/redhat-marketplace-pgxls" Oct 12 07:21:43 crc kubenswrapper[4930]: I1012 07:21:43.403936 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pgxls" Oct 12 07:21:43 crc kubenswrapper[4930]: I1012 07:21:43.979977 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pgxls"] Oct 12 07:21:43 crc kubenswrapper[4930]: W1012 07:21:43.981355 4930 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac8f92a7_13a2_4de8_b9d2_c32e518a7ea1.slice/crio-ad69e5cdc83286e156a13ffd07cea5a48918bae74808249ee9cafd79fb971475 WatchSource:0}: Error finding container ad69e5cdc83286e156a13ffd07cea5a48918bae74808249ee9cafd79fb971475: Status 404 returned error can't find the container with id ad69e5cdc83286e156a13ffd07cea5a48918bae74808249ee9cafd79fb971475 Oct 12 07:21:44 crc kubenswrapper[4930]: I1012 07:21:44.105626 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pgxls" event={"ID":"ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1","Type":"ContainerStarted","Data":"ad69e5cdc83286e156a13ffd07cea5a48918bae74808249ee9cafd79fb971475"} Oct 12 07:21:45 crc kubenswrapper[4930]: I1012 07:21:45.121327 4930 generic.go:334] "Generic (PLEG): container finished" podID="ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1" containerID="6ed2589a68101e3a503180c761d39cfe84517a8850c27384095e73184fff08c0" exitCode=0 Oct 12 07:21:45 crc kubenswrapper[4930]: I1012 07:21:45.121455 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pgxls" event={"ID":"ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1","Type":"ContainerDied","Data":"6ed2589a68101e3a503180c761d39cfe84517a8850c27384095e73184fff08c0"} Oct 12 07:21:45 crc kubenswrapper[4930]: I1012 07:21:45.124764 4930 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 12 07:21:46 crc kubenswrapper[4930]: I1012 07:21:46.155058 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pgxls" event={"ID":"ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1","Type":"ContainerStarted","Data":"b1028d8dd7520b52b16ffb57547dcafa81e84833d3ceca3940c251c557199f16"} Oct 12 07:21:47 crc kubenswrapper[4930]: I1012 07:21:47.172591 4930 generic.go:334] "Generic (PLEG): container finished" podID="ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1" containerID="b1028d8dd7520b52b16ffb57547dcafa81e84833d3ceca3940c251c557199f16" exitCode=0 Oct 12 07:21:47 crc kubenswrapper[4930]: I1012 07:21:47.172693 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pgxls" event={"ID":"ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1","Type":"ContainerDied","Data":"b1028d8dd7520b52b16ffb57547dcafa81e84833d3ceca3940c251c557199f16"} Oct 12 07:21:48 crc kubenswrapper[4930]: I1012 07:21:48.182333 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pgxls" event={"ID":"ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1","Type":"ContainerStarted","Data":"513b3fd591321063a88cef0b77e263c85248b4909232fc0be2429643c7648926"} Oct 12 07:21:48 crc kubenswrapper[4930]: I1012 07:21:48.199587 4930 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pgxls" podStartSLOduration=2.701443272 podStartE2EDuration="5.199570046s" podCreationTimestamp="2025-10-12 07:21:43 +0000 UTC" firstStartedPulling="2025-10-12 07:21:45.12439071 +0000 UTC m=+6037.666492485" lastFinishedPulling="2025-10-12 07:21:47.622517464 +0000 UTC m=+6040.164619259" observedRunningTime="2025-10-12 07:21:48.197551906 +0000 UTC m=+6040.739653671" watchObservedRunningTime="2025-10-12 07:21:48.199570046 +0000 UTC m=+6040.741671811" Oct 12 07:21:53 crc kubenswrapper[4930]: I1012 07:21:53.404188 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pgxls" Oct 12 07:21:53 crc kubenswrapper[4930]: I1012 07:21:53.404866 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pgxls" Oct 12 07:21:53 crc kubenswrapper[4930]: I1012 07:21:53.477542 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pgxls" Oct 12 07:21:54 crc kubenswrapper[4930]: I1012 07:21:54.360214 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pgxls" Oct 12 07:21:54 crc kubenswrapper[4930]: I1012 07:21:54.410955 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pgxls"] Oct 12 07:21:55 crc kubenswrapper[4930]: I1012 07:21:55.136452 4930 scope.go:117] "RemoveContainer" containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19" Oct 12 07:21:55 crc kubenswrapper[4930]: E1012 07:21:55.137345 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:21:56 crc kubenswrapper[4930]: I1012 07:21:56.310249 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pgxls" podUID="ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1" containerName="registry-server" containerID="cri-o://513b3fd591321063a88cef0b77e263c85248b4909232fc0be2429643c7648926" gracePeriod=2 Oct 12 07:21:56 crc kubenswrapper[4930]: I1012 07:21:56.852230 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pgxls" Oct 12 07:21:56 crc kubenswrapper[4930]: I1012 07:21:56.950173 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1-utilities\") pod \"ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1\" (UID: \"ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1\") " Oct 12 07:21:56 crc kubenswrapper[4930]: I1012 07:21:56.950383 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnzdt\" (UniqueName: \"kubernetes.io/projected/ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1-kube-api-access-nnzdt\") pod \"ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1\" (UID: \"ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1\") " Oct 12 07:21:56 crc kubenswrapper[4930]: I1012 07:21:56.950625 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1-catalog-content\") pod \"ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1\" (UID: \"ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1\") " Oct 12 07:21:56 crc kubenswrapper[4930]: I1012 07:21:56.951140 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1-utilities" (OuterVolumeSpecName: "utilities") pod "ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1" (UID: "ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:21:56 crc kubenswrapper[4930]: I1012 07:21:56.959749 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1-kube-api-access-nnzdt" (OuterVolumeSpecName: "kube-api-access-nnzdt") pod "ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1" (UID: "ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1"). InnerVolumeSpecName "kube-api-access-nnzdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:21:56 crc kubenswrapper[4930]: I1012 07:21:56.963441 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1" (UID: "ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:21:57 crc kubenswrapper[4930]: I1012 07:21:57.052788 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 07:21:57 crc kubenswrapper[4930]: I1012 07:21:57.052824 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 07:21:57 crc kubenswrapper[4930]: I1012 07:21:57.052870 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnzdt\" (UniqueName: \"kubernetes.io/projected/ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1-kube-api-access-nnzdt\") on node \"crc\" DevicePath \"\"" Oct 12 07:21:57 crc kubenswrapper[4930]: I1012 07:21:57.328199 4930 generic.go:334] "Generic (PLEG): container finished" podID="ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1" containerID="513b3fd591321063a88cef0b77e263c85248b4909232fc0be2429643c7648926" exitCode=0 Oct 12 07:21:57 crc kubenswrapper[4930]: I1012 07:21:57.328249 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pgxls" event={"ID":"ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1","Type":"ContainerDied","Data":"513b3fd591321063a88cef0b77e263c85248b4909232fc0be2429643c7648926"} Oct 12 07:21:57 crc kubenswrapper[4930]: I1012 07:21:57.328282 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pgxls" event={"ID":"ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1","Type":"ContainerDied","Data":"ad69e5cdc83286e156a13ffd07cea5a48918bae74808249ee9cafd79fb971475"} Oct 12 07:21:57 crc kubenswrapper[4930]: I1012 07:21:57.328292 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pgxls" Oct 12 07:21:57 crc kubenswrapper[4930]: I1012 07:21:57.328303 4930 scope.go:117] "RemoveContainer" containerID="513b3fd591321063a88cef0b77e263c85248b4909232fc0be2429643c7648926" Oct 12 07:21:57 crc kubenswrapper[4930]: I1012 07:21:57.353913 4930 scope.go:117] "RemoveContainer" containerID="b1028d8dd7520b52b16ffb57547dcafa81e84833d3ceca3940c251c557199f16" Oct 12 07:21:57 crc kubenswrapper[4930]: I1012 07:21:57.388227 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pgxls"] Oct 12 07:21:57 crc kubenswrapper[4930]: I1012 07:21:57.394390 4930 scope.go:117] "RemoveContainer" containerID="6ed2589a68101e3a503180c761d39cfe84517a8850c27384095e73184fff08c0" Oct 12 07:21:57 crc kubenswrapper[4930]: I1012 07:21:57.406881 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pgxls"] Oct 12 07:21:57 crc kubenswrapper[4930]: I1012 07:21:57.435579 4930 scope.go:117] "RemoveContainer" containerID="513b3fd591321063a88cef0b77e263c85248b4909232fc0be2429643c7648926" Oct 12 07:21:57 crc kubenswrapper[4930]: E1012 07:21:57.436040 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"513b3fd591321063a88cef0b77e263c85248b4909232fc0be2429643c7648926\": container with ID starting with 513b3fd591321063a88cef0b77e263c85248b4909232fc0be2429643c7648926 not found: ID does not exist" containerID="513b3fd591321063a88cef0b77e263c85248b4909232fc0be2429643c7648926" Oct 12 07:21:57 crc kubenswrapper[4930]: I1012 07:21:57.436101 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"513b3fd591321063a88cef0b77e263c85248b4909232fc0be2429643c7648926"} err="failed to get container status \"513b3fd591321063a88cef0b77e263c85248b4909232fc0be2429643c7648926\": rpc error: code = NotFound desc = could not find container \"513b3fd591321063a88cef0b77e263c85248b4909232fc0be2429643c7648926\": container with ID starting with 513b3fd591321063a88cef0b77e263c85248b4909232fc0be2429643c7648926 not found: ID does not exist" Oct 12 07:21:57 crc kubenswrapper[4930]: I1012 07:21:57.436124 4930 scope.go:117] "RemoveContainer" containerID="b1028d8dd7520b52b16ffb57547dcafa81e84833d3ceca3940c251c557199f16" Oct 12 07:21:57 crc kubenswrapper[4930]: E1012 07:21:57.436698 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1028d8dd7520b52b16ffb57547dcafa81e84833d3ceca3940c251c557199f16\": container with ID starting with b1028d8dd7520b52b16ffb57547dcafa81e84833d3ceca3940c251c557199f16 not found: ID does not exist" containerID="b1028d8dd7520b52b16ffb57547dcafa81e84833d3ceca3940c251c557199f16" Oct 12 07:21:57 crc kubenswrapper[4930]: I1012 07:21:57.436989 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1028d8dd7520b52b16ffb57547dcafa81e84833d3ceca3940c251c557199f16"} err="failed to get container status \"b1028d8dd7520b52b16ffb57547dcafa81e84833d3ceca3940c251c557199f16\": rpc error: code = NotFound desc = could not find container \"b1028d8dd7520b52b16ffb57547dcafa81e84833d3ceca3940c251c557199f16\": container with ID starting with b1028d8dd7520b52b16ffb57547dcafa81e84833d3ceca3940c251c557199f16 not found: ID does not exist" Oct 12 07:21:57 crc kubenswrapper[4930]: I1012 07:21:57.437211 4930 scope.go:117] "RemoveContainer" 
containerID="6ed2589a68101e3a503180c761d39cfe84517a8850c27384095e73184fff08c0" Oct 12 07:21:57 crc kubenswrapper[4930]: E1012 07:21:57.437925 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ed2589a68101e3a503180c761d39cfe84517a8850c27384095e73184fff08c0\": container with ID starting with 6ed2589a68101e3a503180c761d39cfe84517a8850c27384095e73184fff08c0 not found: ID does not exist" containerID="6ed2589a68101e3a503180c761d39cfe84517a8850c27384095e73184fff08c0" Oct 12 07:21:57 crc kubenswrapper[4930]: I1012 07:21:57.437975 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ed2589a68101e3a503180c761d39cfe84517a8850c27384095e73184fff08c0"} err="failed to get container status \"6ed2589a68101e3a503180c761d39cfe84517a8850c27384095e73184fff08c0\": rpc error: code = NotFound desc = could not find container \"6ed2589a68101e3a503180c761d39cfe84517a8850c27384095e73184fff08c0\": container with ID starting with 6ed2589a68101e3a503180c761d39cfe84517a8850c27384095e73184fff08c0 not found: ID does not exist" Oct 12 07:21:58 crc kubenswrapper[4930]: I1012 07:21:58.155531 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1" path="/var/lib/kubelet/pods/ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1/volumes" Oct 12 07:22:07 crc kubenswrapper[4930]: I1012 07:22:07.135498 4930 scope.go:117] "RemoveContainer" containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19" Oct 12 07:22:07 crc kubenswrapper[4930]: E1012 07:22:07.136699 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:22:19 crc kubenswrapper[4930]: I1012 07:22:19.137640 4930 scope.go:117] "RemoveContainer" containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19" Oct 12 07:22:19 crc kubenswrapper[4930]: E1012 07:22:19.138652 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:22:30 crc kubenswrapper[4930]: I1012 07:22:30.135998 4930 scope.go:117] "RemoveContainer" containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19" Oct 12 07:22:30 crc kubenswrapper[4930]: E1012 07:22:30.139630 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:22:43 crc kubenswrapper[4930]: I1012 07:22:43.136817 4930 scope.go:117] "RemoveContainer" 
containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19" Oct 12 07:22:43 crc kubenswrapper[4930]: E1012 07:22:43.138975 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:22:55 crc kubenswrapper[4930]: I1012 07:22:55.136252 4930 scope.go:117] "RemoveContainer" containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19" Oct 12 07:22:55 crc kubenswrapper[4930]: E1012 07:22:55.137104 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:22:55 crc kubenswrapper[4930]: I1012 07:22:55.677256 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x2jmv"] Oct 12 07:22:55 crc kubenswrapper[4930]: E1012 07:22:55.677689 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1" containerName="extract-content" Oct 12 07:22:55 crc kubenswrapper[4930]: I1012 07:22:55.677709 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1" containerName="extract-content" Oct 12 07:22:55 crc kubenswrapper[4930]: E1012 07:22:55.677887 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1" containerName="extract-utilities" Oct 12 07:22:55 crc kubenswrapper[4930]: I1012 07:22:55.677899 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1" containerName="extract-utilities" Oct 12 07:22:55 crc kubenswrapper[4930]: E1012 07:22:55.677916 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1" containerName="registry-server" Oct 12 07:22:55 crc kubenswrapper[4930]: I1012 07:22:55.677924 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1" containerName="registry-server" Oct 12 07:22:55 crc kubenswrapper[4930]: I1012 07:22:55.678194 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac8f92a7-13a2-4de8-b9d2-c32e518a7ea1" containerName="registry-server" Oct 12 07:22:55 crc kubenswrapper[4930]: I1012 07:22:55.679868 4930 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x2jmv" Oct 12 07:22:55 crc kubenswrapper[4930]: I1012 07:22:55.695084 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x2jmv"] Oct 12 07:22:55 crc kubenswrapper[4930]: I1012 07:22:55.748260 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94b9559e-4bad-4534-8fa3-b25a6bc82e58-catalog-content\") pod \"community-operators-x2jmv\" (UID: \"94b9559e-4bad-4534-8fa3-b25a6bc82e58\") " pod="openshift-marketplace/community-operators-x2jmv" Oct 12 07:22:55 crc kubenswrapper[4930]: I1012 07:22:55.748388 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94b9559e-4bad-4534-8fa3-b25a6bc82e58-utilities\") pod \"community-operators-x2jmv\" (UID: \"94b9559e-4bad-4534-8fa3-b25a6bc82e58\") " pod="openshift-marketplace/community-operators-x2jmv" Oct 12 07:22:55 crc kubenswrapper[4930]: I1012 07:22:55.748439 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlz6j\" (UniqueName: \"kubernetes.io/projected/94b9559e-4bad-4534-8fa3-b25a6bc82e58-kube-api-access-dlz6j\") pod \"community-operators-x2jmv\" (UID: \"94b9559e-4bad-4534-8fa3-b25a6bc82e58\") " pod="openshift-marketplace/community-operators-x2jmv" Oct 12 07:22:55 crc kubenswrapper[4930]: I1012 07:22:55.849700 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94b9559e-4bad-4534-8fa3-b25a6bc82e58-catalog-content\") pod \"community-operators-x2jmv\" (UID: \"94b9559e-4bad-4534-8fa3-b25a6bc82e58\") " pod="openshift-marketplace/community-operators-x2jmv" Oct 12 07:22:55 crc kubenswrapper[4930]: I1012 07:22:55.850401 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94b9559e-4bad-4534-8fa3-b25a6bc82e58-utilities\") pod \"community-operators-x2jmv\" (UID: \"94b9559e-4bad-4534-8fa3-b25a6bc82e58\") " pod="openshift-marketplace/community-operators-x2jmv" Oct 12 07:22:55 crc kubenswrapper[4930]: I1012 07:22:55.850806 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlz6j\" (UniqueName: \"kubernetes.io/projected/94b9559e-4bad-4534-8fa3-b25a6bc82e58-kube-api-access-dlz6j\") pod \"community-operators-x2jmv\" (UID: \"94b9559e-4bad-4534-8fa3-b25a6bc82e58\") " pod="openshift-marketplace/community-operators-x2jmv" Oct 12 07:22:55 crc kubenswrapper[4930]: I1012 07:22:55.850753 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94b9559e-4bad-4534-8fa3-b25a6bc82e58-utilities\") pod \"community-operators-x2jmv\" (UID: \"94b9559e-4bad-4534-8fa3-b25a6bc82e58\") " pod="openshift-marketplace/community-operators-x2jmv" Oct 12 07:22:55 crc kubenswrapper[4930]: I1012 07:22:55.850309 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94b9559e-4bad-4534-8fa3-b25a6bc82e58-catalog-content\") pod \"community-operators-x2jmv\" (UID: \"94b9559e-4bad-4534-8fa3-b25a6bc82e58\") " pod="openshift-marketplace/community-operators-x2jmv" Oct 12 07:22:55 crc kubenswrapper[4930]: I1012 07:22:55.871490 4930 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dlz6j\" (UniqueName: \"kubernetes.io/projected/94b9559e-4bad-4534-8fa3-b25a6bc82e58-kube-api-access-dlz6j\") pod \"community-operators-x2jmv\" (UID: \"94b9559e-4bad-4534-8fa3-b25a6bc82e58\") " pod="openshift-marketplace/community-operators-x2jmv" Oct 12 07:22:56 crc kubenswrapper[4930]: I1012 07:22:56.002983 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x2jmv" Oct 12 07:22:56 crc kubenswrapper[4930]: I1012 07:22:56.507589 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x2jmv"] Oct 12 07:22:57 crc kubenswrapper[4930]: I1012 07:22:57.178085 4930 generic.go:334] "Generic (PLEG): container finished" podID="94b9559e-4bad-4534-8fa3-b25a6bc82e58" containerID="632e82da176434579fd5346697a19ef3e3c5e93df14970fd873854b44b3687e6" exitCode=0 Oct 12 07:22:57 crc kubenswrapper[4930]: I1012 07:22:57.178392 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x2jmv" event={"ID":"94b9559e-4bad-4534-8fa3-b25a6bc82e58","Type":"ContainerDied","Data":"632e82da176434579fd5346697a19ef3e3c5e93df14970fd873854b44b3687e6"} Oct 12 07:22:57 crc kubenswrapper[4930]: I1012 07:22:57.178729 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x2jmv" event={"ID":"94b9559e-4bad-4534-8fa3-b25a6bc82e58","Type":"ContainerStarted","Data":"3b83d0feab5ad48f64385806c89d6c56950e25c4d721833c19290b3bdd887825"} Oct 12 07:23:00 crc kubenswrapper[4930]: I1012 07:23:00.218879 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x2jmv" event={"ID":"94b9559e-4bad-4534-8fa3-b25a6bc82e58","Type":"ContainerStarted","Data":"5d1addd2ba56b9f558486301bff0290944e926b20495dd04446dbbbf33ddf60e"} Oct 12 07:23:01 crc kubenswrapper[4930]: I1012 07:23:01.229922 4930 generic.go:334] "Generic (PLEG): container finished" podID="94b9559e-4bad-4534-8fa3-b25a6bc82e58" containerID="5d1addd2ba56b9f558486301bff0290944e926b20495dd04446dbbbf33ddf60e" exitCode=0 Oct 12 07:23:01 crc kubenswrapper[4930]: I1012 07:23:01.229964 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x2jmv" event={"ID":"94b9559e-4bad-4534-8fa3-b25a6bc82e58","Type":"ContainerDied","Data":"5d1addd2ba56b9f558486301bff0290944e926b20495dd04446dbbbf33ddf60e"} Oct 12 07:23:02 crc kubenswrapper[4930]: I1012 07:23:02.242705 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x2jmv" event={"ID":"94b9559e-4bad-4534-8fa3-b25a6bc82e58","Type":"ContainerStarted","Data":"c409fe2850f91c8eaf60ab691de1677b4f1d4ece24a8335da7023a797b8f3b2b"} Oct 12 07:23:02 crc kubenswrapper[4930]: I1012 07:23:02.275798 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x2jmv" podStartSLOduration=2.7291986059999997 podStartE2EDuration="7.275769405s" podCreationTimestamp="2025-10-12 07:22:55 +0000 UTC" firstStartedPulling="2025-10-12 07:22:57.182405178 +0000 UTC m=+6109.724506983" lastFinishedPulling="2025-10-12 07:23:01.728976017 +0000 UTC m=+6114.271077782" observedRunningTime="2025-10-12 07:23:02.267986014 +0000 UTC m=+6114.810087769" watchObservedRunningTime="2025-10-12 07:23:02.275769405 +0000 UTC m=+6114.817871180" Oct 12 07:23:06 crc kubenswrapper[4930]: I1012 07:23:06.003680 4930 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x2jmv" Oct 12 07:23:06 crc kubenswrapper[4930]: I1012 07:23:06.003829 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x2jmv" Oct 12 07:23:06 crc kubenswrapper[4930]: I1012 07:23:06.097832 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x2jmv" Oct 12 07:23:06 crc kubenswrapper[4930]: I1012 07:23:06.433455 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x2jmv" Oct 12 07:23:06 crc kubenswrapper[4930]: I1012 07:23:06.513295 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x2jmv"] Oct 12 07:23:07 crc kubenswrapper[4930]: I1012 07:23:07.309091 4930 generic.go:334] "Generic (PLEG): container finished" podID="7406add0-dc8d-45d5-8fef-cb49293eb22d" containerID="a224154544669734fd0052804975d74ca69cfe0c2bc280db7be78fad86ec61bf" exitCode=0 Oct 12 07:23:07 crc kubenswrapper[4930]: I1012 07:23:07.309228 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nwxk2/must-gather-rj56h" event={"ID":"7406add0-dc8d-45d5-8fef-cb49293eb22d","Type":"ContainerDied","Data":"a224154544669734fd0052804975d74ca69cfe0c2bc280db7be78fad86ec61bf"} Oct 12 07:23:07 crc kubenswrapper[4930]: I1012 07:23:07.310363 4930 scope.go:117] "RemoveContainer" containerID="a224154544669734fd0052804975d74ca69cfe0c2bc280db7be78fad86ec61bf" Oct 12 07:23:07 crc kubenswrapper[4930]: I1012 07:23:07.449529 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nwxk2_must-gather-rj56h_7406add0-dc8d-45d5-8fef-cb49293eb22d/gather/0.log" Oct 12 07:23:08 crc kubenswrapper[4930]: I1012 07:23:08.152275 4930 scope.go:117] "RemoveContainer" containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19" Oct 12 07:23:08 crc kubenswrapper[4930]: E1012 07:23:08.153798 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:23:08 crc kubenswrapper[4930]: I1012 07:23:08.317124 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x2jmv" podUID="94b9559e-4bad-4534-8fa3-b25a6bc82e58" containerName="registry-server" containerID="cri-o://c409fe2850f91c8eaf60ab691de1677b4f1d4ece24a8335da7023a797b8f3b2b" gracePeriod=2 Oct 12 07:23:08 crc kubenswrapper[4930]: I1012 07:23:08.818345 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x2jmv" Oct 12 07:23:08 crc kubenswrapper[4930]: I1012 07:23:08.957664 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94b9559e-4bad-4534-8fa3-b25a6bc82e58-utilities\") pod \"94b9559e-4bad-4534-8fa3-b25a6bc82e58\" (UID: \"94b9559e-4bad-4534-8fa3-b25a6bc82e58\") " Oct 12 07:23:08 crc kubenswrapper[4930]: I1012 07:23:08.957752 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94b9559e-4bad-4534-8fa3-b25a6bc82e58-catalog-content\") pod \"94b9559e-4bad-4534-8fa3-b25a6bc82e58\" (UID: \"94b9559e-4bad-4534-8fa3-b25a6bc82e58\") " Oct 12 07:23:08 crc kubenswrapper[4930]: I1012 07:23:08.957784 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlz6j\" (UniqueName: \"kubernetes.io/projected/94b9559e-4bad-4534-8fa3-b25a6bc82e58-kube-api-access-dlz6j\") pod \"94b9559e-4bad-4534-8fa3-b25a6bc82e58\" (UID: \"94b9559e-4bad-4534-8fa3-b25a6bc82e58\") " Oct 12 07:23:08 crc kubenswrapper[4930]: I1012 07:23:08.959462 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94b9559e-4bad-4534-8fa3-b25a6bc82e58-utilities" (OuterVolumeSpecName: "utilities") pod "94b9559e-4bad-4534-8fa3-b25a6bc82e58" (UID: "94b9559e-4bad-4534-8fa3-b25a6bc82e58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:23:08 crc kubenswrapper[4930]: I1012 07:23:08.965097 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94b9559e-4bad-4534-8fa3-b25a6bc82e58-kube-api-access-dlz6j" (OuterVolumeSpecName: "kube-api-access-dlz6j") pod "94b9559e-4bad-4534-8fa3-b25a6bc82e58" (UID: "94b9559e-4bad-4534-8fa3-b25a6bc82e58"). InnerVolumeSpecName "kube-api-access-dlz6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:23:09 crc kubenswrapper[4930]: I1012 07:23:09.015083 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94b9559e-4bad-4534-8fa3-b25a6bc82e58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94b9559e-4bad-4534-8fa3-b25a6bc82e58" (UID: "94b9559e-4bad-4534-8fa3-b25a6bc82e58"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:23:09 crc kubenswrapper[4930]: I1012 07:23:09.059797 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94b9559e-4bad-4534-8fa3-b25a6bc82e58-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 07:23:09 crc kubenswrapper[4930]: I1012 07:23:09.059862 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94b9559e-4bad-4534-8fa3-b25a6bc82e58-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 07:23:09 crc kubenswrapper[4930]: I1012 07:23:09.059878 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlz6j\" (UniqueName: \"kubernetes.io/projected/94b9559e-4bad-4534-8fa3-b25a6bc82e58-kube-api-access-dlz6j\") on node \"crc\" DevicePath \"\"" Oct 12 07:23:09 crc kubenswrapper[4930]: I1012 07:23:09.335715 4930 generic.go:334] "Generic (PLEG): container finished" podID="94b9559e-4bad-4534-8fa3-b25a6bc82e58" containerID="c409fe2850f91c8eaf60ab691de1677b4f1d4ece24a8335da7023a797b8f3b2b" exitCode=0 Oct 12 07:23:09 crc kubenswrapper[4930]: I1012 07:23:09.335823 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x2jmv" event={"ID":"94b9559e-4bad-4534-8fa3-b25a6bc82e58","Type":"ContainerDied","Data":"c409fe2850f91c8eaf60ab691de1677b4f1d4ece24a8335da7023a797b8f3b2b"} Oct 12 07:23:09 crc kubenswrapper[4930]: I1012 07:23:09.335956 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x2jmv" Oct 12 07:23:09 crc kubenswrapper[4930]: I1012 07:23:09.336146 4930 scope.go:117] "RemoveContainer" containerID="c409fe2850f91c8eaf60ab691de1677b4f1d4ece24a8335da7023a797b8f3b2b" Oct 12 07:23:09 crc kubenswrapper[4930]: I1012 07:23:09.336128 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x2jmv" event={"ID":"94b9559e-4bad-4534-8fa3-b25a6bc82e58","Type":"ContainerDied","Data":"3b83d0feab5ad48f64385806c89d6c56950e25c4d721833c19290b3bdd887825"} Oct 12 07:23:09 crc kubenswrapper[4930]: I1012 07:23:09.378980 4930 scope.go:117] "RemoveContainer" containerID="5d1addd2ba56b9f558486301bff0290944e926b20495dd04446dbbbf33ddf60e" Oct 12 07:23:09 crc kubenswrapper[4930]: I1012 07:23:09.398951 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x2jmv"] Oct 12 07:23:09 crc kubenswrapper[4930]: I1012 07:23:09.409456 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x2jmv"] Oct 12 07:23:09 crc kubenswrapper[4930]: I1012 07:23:09.434561 4930 scope.go:117] "RemoveContainer" containerID="632e82da176434579fd5346697a19ef3e3c5e93df14970fd873854b44b3687e6" Oct 12 07:23:09 crc kubenswrapper[4930]: I1012 07:23:09.485954 4930 scope.go:117] "RemoveContainer" containerID="c409fe2850f91c8eaf60ab691de1677b4f1d4ece24a8335da7023a797b8f3b2b" Oct 12 07:23:09 crc kubenswrapper[4930]: E1012 07:23:09.487139 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c409fe2850f91c8eaf60ab691de1677b4f1d4ece24a8335da7023a797b8f3b2b\": container with ID starting with c409fe2850f91c8eaf60ab691de1677b4f1d4ece24a8335da7023a797b8f3b2b not found: ID does not exist" containerID="c409fe2850f91c8eaf60ab691de1677b4f1d4ece24a8335da7023a797b8f3b2b" Oct 12 07:23:09 crc kubenswrapper[4930]: I1012 07:23:09.487204 
4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c409fe2850f91c8eaf60ab691de1677b4f1d4ece24a8335da7023a797b8f3b2b"} err="failed to get container status \"c409fe2850f91c8eaf60ab691de1677b4f1d4ece24a8335da7023a797b8f3b2b\": rpc error: code = NotFound desc = could not find container \"c409fe2850f91c8eaf60ab691de1677b4f1d4ece24a8335da7023a797b8f3b2b\": container with ID starting with c409fe2850f91c8eaf60ab691de1677b4f1d4ece24a8335da7023a797b8f3b2b not found: ID does not exist" Oct 12 07:23:09 crc kubenswrapper[4930]: I1012 07:23:09.487229 4930 scope.go:117] "RemoveContainer" containerID="5d1addd2ba56b9f558486301bff0290944e926b20495dd04446dbbbf33ddf60e" Oct 12 07:23:09 crc kubenswrapper[4930]: E1012 07:23:09.487980 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d1addd2ba56b9f558486301bff0290944e926b20495dd04446dbbbf33ddf60e\": container with ID starting with 5d1addd2ba56b9f558486301bff0290944e926b20495dd04446dbbbf33ddf60e not found: ID does not exist" containerID="5d1addd2ba56b9f558486301bff0290944e926b20495dd04446dbbbf33ddf60e" Oct 12 07:23:09 crc kubenswrapper[4930]: I1012 07:23:09.488050 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d1addd2ba56b9f558486301bff0290944e926b20495dd04446dbbbf33ddf60e"} err="failed to get container status \"5d1addd2ba56b9f558486301bff0290944e926b20495dd04446dbbbf33ddf60e\": rpc error: code = NotFound desc = could not find container \"5d1addd2ba56b9f558486301bff0290944e926b20495dd04446dbbbf33ddf60e\": container with ID starting with 5d1addd2ba56b9f558486301bff0290944e926b20495dd04446dbbbf33ddf60e not found: ID does not exist" Oct 12 07:23:09 crc kubenswrapper[4930]: I1012 07:23:09.488087 4930 scope.go:117] "RemoveContainer" containerID="632e82da176434579fd5346697a19ef3e3c5e93df14970fd873854b44b3687e6" Oct 12 07:23:09 crc kubenswrapper[4930]: E1012 07:23:09.488426 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"632e82da176434579fd5346697a19ef3e3c5e93df14970fd873854b44b3687e6\": container with ID starting with 632e82da176434579fd5346697a19ef3e3c5e93df14970fd873854b44b3687e6 not found: ID does not exist" containerID="632e82da176434579fd5346697a19ef3e3c5e93df14970fd873854b44b3687e6" Oct 12 07:23:09 crc kubenswrapper[4930]: I1012 07:23:09.488456 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"632e82da176434579fd5346697a19ef3e3c5e93df14970fd873854b44b3687e6"} err="failed to get container status \"632e82da176434579fd5346697a19ef3e3c5e93df14970fd873854b44b3687e6\": rpc error: code = NotFound desc = could not find container \"632e82da176434579fd5346697a19ef3e3c5e93df14970fd873854b44b3687e6\": container with ID starting with 632e82da176434579fd5346697a19ef3e3c5e93df14970fd873854b44b3687e6 not found: ID does not exist" Oct 12 07:23:10 crc kubenswrapper[4930]: I1012 07:23:10.150433 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94b9559e-4bad-4534-8fa3-b25a6bc82e58" path="/var/lib/kubelet/pods/94b9559e-4bad-4534-8fa3-b25a6bc82e58/volumes" Oct 12 07:23:18 crc kubenswrapper[4930]: I1012 07:23:18.749186 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nwxk2/must-gather-rj56h"] Oct 12 07:23:18 crc kubenswrapper[4930]: I1012 07:23:18.750187 4930 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-must-gather-nwxk2/must-gather-rj56h" podUID="7406add0-dc8d-45d5-8fef-cb49293eb22d" containerName="copy" containerID="cri-o://515e244a833eb2e4340f335f10a04472d4f90a550242c21da01a012a760b7457" gracePeriod=2 Oct 12 07:23:18 crc kubenswrapper[4930]: I1012 07:23:18.763087 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nwxk2/must-gather-rj56h"] Oct 12 07:23:19 crc kubenswrapper[4930]: I1012 07:23:19.196037 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nwxk2_must-gather-rj56h_7406add0-dc8d-45d5-8fef-cb49293eb22d/copy/0.log" Oct 12 07:23:19 crc kubenswrapper[4930]: I1012 07:23:19.196704 4930 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nwxk2/must-gather-rj56h" Oct 12 07:23:19 crc kubenswrapper[4930]: I1012 07:23:19.397120 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4rt8\" (UniqueName: \"kubernetes.io/projected/7406add0-dc8d-45d5-8fef-cb49293eb22d-kube-api-access-r4rt8\") pod \"7406add0-dc8d-45d5-8fef-cb49293eb22d\" (UID: \"7406add0-dc8d-45d5-8fef-cb49293eb22d\") " Oct 12 07:23:19 crc kubenswrapper[4930]: I1012 07:23:19.397227 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7406add0-dc8d-45d5-8fef-cb49293eb22d-must-gather-output\") pod \"7406add0-dc8d-45d5-8fef-cb49293eb22d\" (UID: \"7406add0-dc8d-45d5-8fef-cb49293eb22d\") " Oct 12 07:23:19 crc kubenswrapper[4930]: I1012 07:23:19.416264 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7406add0-dc8d-45d5-8fef-cb49293eb22d-kube-api-access-r4rt8" (OuterVolumeSpecName: "kube-api-access-r4rt8") pod "7406add0-dc8d-45d5-8fef-cb49293eb22d" (UID: "7406add0-dc8d-45d5-8fef-cb49293eb22d"). InnerVolumeSpecName "kube-api-access-r4rt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:23:19 crc kubenswrapper[4930]: I1012 07:23:19.460703 4930 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nwxk2_must-gather-rj56h_7406add0-dc8d-45d5-8fef-cb49293eb22d/copy/0.log" Oct 12 07:23:19 crc kubenswrapper[4930]: I1012 07:23:19.461177 4930 generic.go:334] "Generic (PLEG): container finished" podID="7406add0-dc8d-45d5-8fef-cb49293eb22d" containerID="515e244a833eb2e4340f335f10a04472d4f90a550242c21da01a012a760b7457" exitCode=143 Oct 12 07:23:19 crc kubenswrapper[4930]: I1012 07:23:19.461227 4930 scope.go:117] "RemoveContainer" containerID="515e244a833eb2e4340f335f10a04472d4f90a550242c21da01a012a760b7457" Oct 12 07:23:19 crc kubenswrapper[4930]: I1012 07:23:19.461355 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nwxk2/must-gather-rj56h" Oct 12 07:23:19 crc kubenswrapper[4930]: I1012 07:23:19.502824 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4rt8\" (UniqueName: \"kubernetes.io/projected/7406add0-dc8d-45d5-8fef-cb49293eb22d-kube-api-access-r4rt8\") on node \"crc\" DevicePath \"\"" Oct 12 07:23:19 crc kubenswrapper[4930]: I1012 07:23:19.503015 4930 scope.go:117] "RemoveContainer" containerID="a224154544669734fd0052804975d74ca69cfe0c2bc280db7be78fad86ec61bf" Oct 12 07:23:19 crc kubenswrapper[4930]: I1012 07:23:19.603102 4930 scope.go:117] "RemoveContainer" containerID="515e244a833eb2e4340f335f10a04472d4f90a550242c21da01a012a760b7457" Oct 12 07:23:19 crc kubenswrapper[4930]: E1012 07:23:19.604273 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"515e244a833eb2e4340f335f10a04472d4f90a550242c21da01a012a760b7457\": container with ID starting with 515e244a833eb2e4340f335f10a04472d4f90a550242c21da01a012a760b7457 not found: ID does not exist" containerID="515e244a833eb2e4340f335f10a04472d4f90a550242c21da01a012a760b7457" Oct 12 07:23:19 crc kubenswrapper[4930]: I1012 07:23:19.604315 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"515e244a833eb2e4340f335f10a04472d4f90a550242c21da01a012a760b7457"} err="failed to get container status \"515e244a833eb2e4340f335f10a04472d4f90a550242c21da01a012a760b7457\": rpc error: code = NotFound desc = could not find container \"515e244a833eb2e4340f335f10a04472d4f90a550242c21da01a012a760b7457\": container with ID starting with 515e244a833eb2e4340f335f10a04472d4f90a550242c21da01a012a760b7457 not found: ID does not exist" Oct 12 07:23:19 crc kubenswrapper[4930]: I1012 07:23:19.604339 4930 scope.go:117] "RemoveContainer" containerID="a224154544669734fd0052804975d74ca69cfe0c2bc280db7be78fad86ec61bf" Oct 12 07:23:19 crc kubenswrapper[4930]: E1012 07:23:19.604674 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a224154544669734fd0052804975d74ca69cfe0c2bc280db7be78fad86ec61bf\": container with ID starting with a224154544669734fd0052804975d74ca69cfe0c2bc280db7be78fad86ec61bf not found: ID does not exist" containerID="a224154544669734fd0052804975d74ca69cfe0c2bc280db7be78fad86ec61bf" Oct 12 07:23:19 crc kubenswrapper[4930]: I1012 07:23:19.604766 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a224154544669734fd0052804975d74ca69cfe0c2bc280db7be78fad86ec61bf"} err="failed to get container status \"a224154544669734fd0052804975d74ca69cfe0c2bc280db7be78fad86ec61bf\": rpc error: code = NotFound desc = could not find container \"a224154544669734fd0052804975d74ca69cfe0c2bc280db7be78fad86ec61bf\": container with ID starting with a224154544669734fd0052804975d74ca69cfe0c2bc280db7be78fad86ec61bf not found: ID does not exist" Oct 12 07:23:19 crc kubenswrapper[4930]: I1012 07:23:19.619677 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7406add0-dc8d-45d5-8fef-cb49293eb22d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "7406add0-dc8d-45d5-8fef-cb49293eb22d" (UID: "7406add0-dc8d-45d5-8fef-cb49293eb22d"). InnerVolumeSpecName "must-gather-output". 
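
The cascade above — "RemoveContainer", then "ContainerStatus from runtime service failed" with NotFound, then "DeleteContainer returned error" — recurs for every container of each pod torn down in this stretch (redhat-marketplace-pgxls, community-operators-x2jmv, must-gather-rj56h). It reads like benign double-delete noise: the kubelet re-issues a delete for a container CRI-O has already removed, so the follow-up status lookup can only report that the ID no longer exists. One way to gauge how much of a saved journal is this pattern (the file name is an assumption; e.g. journalctl -u kubelet > kubelet.log on this node, where the messages carry the kubenswrapper identifier):

    # Count the "already deleted" DeleteContainer errors in a saved journal.
    # kubelet.log is an assumed file name; see the note above on saving it.
    with open("kubelet.log") as fh:
        print(sum('"DeleteContainer returned error"' in line for line in fh))
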
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:23:19 crc kubenswrapper[4930]: I1012 07:23:19.706009 4930 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7406add0-dc8d-45d5-8fef-cb49293eb22d-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 12 07:23:20 crc kubenswrapper[4930]: I1012 07:23:20.137571 4930 scope.go:117] "RemoveContainer" containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19" Oct 12 07:23:20 crc kubenswrapper[4930]: E1012 07:23:20.137927 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:23:20 crc kubenswrapper[4930]: I1012 07:23:20.146955 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7406add0-dc8d-45d5-8fef-cb49293eb22d" path="/var/lib/kubelet/pods/7406add0-dc8d-45d5-8fef-cb49293eb22d/volumes" Oct 12 07:23:32 crc kubenswrapper[4930]: I1012 07:23:32.489404 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z82cn"] Oct 12 07:23:32 crc kubenswrapper[4930]: E1012 07:23:32.490446 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94b9559e-4bad-4534-8fa3-b25a6bc82e58" containerName="registry-server" Oct 12 07:23:32 crc kubenswrapper[4930]: I1012 07:23:32.490463 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b9559e-4bad-4534-8fa3-b25a6bc82e58" containerName="registry-server" Oct 12 07:23:32 crc kubenswrapper[4930]: E1012 07:23:32.490491 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94b9559e-4bad-4534-8fa3-b25a6bc82e58" containerName="extract-utilities" Oct 12 07:23:32 crc kubenswrapper[4930]: I1012 07:23:32.490500 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b9559e-4bad-4534-8fa3-b25a6bc82e58" containerName="extract-utilities" Oct 12 07:23:32 crc kubenswrapper[4930]: E1012 07:23:32.490514 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94b9559e-4bad-4534-8fa3-b25a6bc82e58" containerName="extract-content" Oct 12 07:23:32 crc kubenswrapper[4930]: I1012 07:23:32.490523 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b9559e-4bad-4534-8fa3-b25a6bc82e58" containerName="extract-content" Oct 12 07:23:32 crc kubenswrapper[4930]: E1012 07:23:32.490557 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7406add0-dc8d-45d5-8fef-cb49293eb22d" containerName="gather" Oct 12 07:23:32 crc kubenswrapper[4930]: I1012 07:23:32.490564 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="7406add0-dc8d-45d5-8fef-cb49293eb22d" containerName="gather" Oct 12 07:23:32 crc kubenswrapper[4930]: E1012 07:23:32.490583 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7406add0-dc8d-45d5-8fef-cb49293eb22d" containerName="copy" Oct 12 07:23:32 crc kubenswrapper[4930]: I1012 07:23:32.490590 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="7406add0-dc8d-45d5-8fef-cb49293eb22d" containerName="copy" Oct 12 07:23:32 crc kubenswrapper[4930]: I1012 07:23:32.490841 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="94b9559e-4bad-4534-8fa3-b25a6bc82e58" 
containerName="registry-server" Oct 12 07:23:32 crc kubenswrapper[4930]: I1012 07:23:32.490883 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="7406add0-dc8d-45d5-8fef-cb49293eb22d" containerName="copy" Oct 12 07:23:32 crc kubenswrapper[4930]: I1012 07:23:32.490892 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="7406add0-dc8d-45d5-8fef-cb49293eb22d" containerName="gather" Oct 12 07:23:32 crc kubenswrapper[4930]: I1012 07:23:32.492644 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z82cn" Oct 12 07:23:32 crc kubenswrapper[4930]: I1012 07:23:32.505377 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z82cn"] Oct 12 07:23:32 crc kubenswrapper[4930]: I1012 07:23:32.603512 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c6870b2-e7bf-4454-9b73-17a247473b19-utilities\") pod \"redhat-operators-z82cn\" (UID: \"9c6870b2-e7bf-4454-9b73-17a247473b19\") " pod="openshift-marketplace/redhat-operators-z82cn" Oct 12 07:23:32 crc kubenswrapper[4930]: I1012 07:23:32.604217 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c6870b2-e7bf-4454-9b73-17a247473b19-catalog-content\") pod \"redhat-operators-z82cn\" (UID: \"9c6870b2-e7bf-4454-9b73-17a247473b19\") " pod="openshift-marketplace/redhat-operators-z82cn" Oct 12 07:23:32 crc kubenswrapper[4930]: I1012 07:23:32.604341 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkjwg\" (UniqueName: \"kubernetes.io/projected/9c6870b2-e7bf-4454-9b73-17a247473b19-kube-api-access-nkjwg\") pod \"redhat-operators-z82cn\" (UID: \"9c6870b2-e7bf-4454-9b73-17a247473b19\") " pod="openshift-marketplace/redhat-operators-z82cn" Oct 12 07:23:32 crc kubenswrapper[4930]: I1012 07:23:32.706279 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkjwg\" (UniqueName: \"kubernetes.io/projected/9c6870b2-e7bf-4454-9b73-17a247473b19-kube-api-access-nkjwg\") pod \"redhat-operators-z82cn\" (UID: \"9c6870b2-e7bf-4454-9b73-17a247473b19\") " pod="openshift-marketplace/redhat-operators-z82cn" Oct 12 07:23:32 crc kubenswrapper[4930]: I1012 07:23:32.706417 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c6870b2-e7bf-4454-9b73-17a247473b19-utilities\") pod \"redhat-operators-z82cn\" (UID: \"9c6870b2-e7bf-4454-9b73-17a247473b19\") " pod="openshift-marketplace/redhat-operators-z82cn" Oct 12 07:23:32 crc kubenswrapper[4930]: I1012 07:23:32.706534 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c6870b2-e7bf-4454-9b73-17a247473b19-catalog-content\") pod \"redhat-operators-z82cn\" (UID: \"9c6870b2-e7bf-4454-9b73-17a247473b19\") " pod="openshift-marketplace/redhat-operators-z82cn" Oct 12 07:23:32 crc kubenswrapper[4930]: I1012 07:23:32.707066 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c6870b2-e7bf-4454-9b73-17a247473b19-catalog-content\") pod \"redhat-operators-z82cn\" (UID: \"9c6870b2-e7bf-4454-9b73-17a247473b19\") " 
pod="openshift-marketplace/redhat-operators-z82cn" Oct 12 07:23:32 crc kubenswrapper[4930]: I1012 07:23:32.707196 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c6870b2-e7bf-4454-9b73-17a247473b19-utilities\") pod \"redhat-operators-z82cn\" (UID: \"9c6870b2-e7bf-4454-9b73-17a247473b19\") " pod="openshift-marketplace/redhat-operators-z82cn" Oct 12 07:23:32 crc kubenswrapper[4930]: I1012 07:23:32.732873 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkjwg\" (UniqueName: \"kubernetes.io/projected/9c6870b2-e7bf-4454-9b73-17a247473b19-kube-api-access-nkjwg\") pod \"redhat-operators-z82cn\" (UID: \"9c6870b2-e7bf-4454-9b73-17a247473b19\") " pod="openshift-marketplace/redhat-operators-z82cn" Oct 12 07:23:32 crc kubenswrapper[4930]: I1012 07:23:32.814362 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z82cn" Oct 12 07:23:33 crc kubenswrapper[4930]: I1012 07:23:33.298206 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z82cn"] Oct 12 07:23:33 crc kubenswrapper[4930]: I1012 07:23:33.664827 4930 generic.go:334] "Generic (PLEG): container finished" podID="9c6870b2-e7bf-4454-9b73-17a247473b19" containerID="48840cc9ba3d5cca35d0f099797e2c194e9e11506a69598611e8aa7f41abb327" exitCode=0 Oct 12 07:23:33 crc kubenswrapper[4930]: I1012 07:23:33.665100 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z82cn" event={"ID":"9c6870b2-e7bf-4454-9b73-17a247473b19","Type":"ContainerDied","Data":"48840cc9ba3d5cca35d0f099797e2c194e9e11506a69598611e8aa7f41abb327"} Oct 12 07:23:33 crc kubenswrapper[4930]: I1012 07:23:33.665125 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z82cn" event={"ID":"9c6870b2-e7bf-4454-9b73-17a247473b19","Type":"ContainerStarted","Data":"73c1ba2d56b0381d60944213627ee55ee20038b377db77477b5b506b9d3be002"} Oct 12 07:23:34 crc kubenswrapper[4930]: I1012 07:23:34.135436 4930 scope.go:117] "RemoveContainer" containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19" Oct 12 07:23:34 crc kubenswrapper[4930]: E1012 07:23:34.136102 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:23:35 crc kubenswrapper[4930]: I1012 07:23:35.683867 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z82cn" event={"ID":"9c6870b2-e7bf-4454-9b73-17a247473b19","Type":"ContainerStarted","Data":"a304904de228b6e63a396a0222c3718a760f7ca40828101947e5b06199b752ea"} Oct 12 07:23:37 crc kubenswrapper[4930]: I1012 07:23:37.705503 4930 generic.go:334] "Generic (PLEG): container finished" podID="9c6870b2-e7bf-4454-9b73-17a247473b19" containerID="a304904de228b6e63a396a0222c3718a760f7ca40828101947e5b06199b752ea" exitCode=0 Oct 12 07:23:37 crc kubenswrapper[4930]: I1012 07:23:37.705552 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z82cn" 
event={"ID":"9c6870b2-e7bf-4454-9b73-17a247473b19","Type":"ContainerDied","Data":"a304904de228b6e63a396a0222c3718a760f7ca40828101947e5b06199b752ea"} Oct 12 07:23:38 crc kubenswrapper[4930]: I1012 07:23:38.722217 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z82cn" event={"ID":"9c6870b2-e7bf-4454-9b73-17a247473b19","Type":"ContainerStarted","Data":"b42aaceba3c37828ecf1088bf80abb0210cd73b93f600b36600940dfb916fc8d"} Oct 12 07:23:38 crc kubenswrapper[4930]: I1012 07:23:38.754562 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z82cn" podStartSLOduration=2.314116167 podStartE2EDuration="6.754545984s" podCreationTimestamp="2025-10-12 07:23:32 +0000 UTC" firstStartedPulling="2025-10-12 07:23:33.666257162 +0000 UTC m=+6146.208358927" lastFinishedPulling="2025-10-12 07:23:38.106686979 +0000 UTC m=+6150.648788744" observedRunningTime="2025-10-12 07:23:38.747356267 +0000 UTC m=+6151.289458032" watchObservedRunningTime="2025-10-12 07:23:38.754545984 +0000 UTC m=+6151.296647749" Oct 12 07:23:42 crc kubenswrapper[4930]: I1012 07:23:42.814975 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z82cn" Oct 12 07:23:42 crc kubenswrapper[4930]: I1012 07:23:42.815432 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z82cn" Oct 12 07:23:43 crc kubenswrapper[4930]: I1012 07:23:43.873968 4930 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z82cn" podUID="9c6870b2-e7bf-4454-9b73-17a247473b19" containerName="registry-server" probeResult="failure" output=< Oct 12 07:23:43 crc kubenswrapper[4930]: timeout: failed to connect service ":50051" within 1s Oct 12 07:23:43 crc kubenswrapper[4930]: > Oct 12 07:23:47 crc kubenswrapper[4930]: I1012 07:23:47.135440 4930 scope.go:117] "RemoveContainer" containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19" Oct 12 07:23:47 crc kubenswrapper[4930]: E1012 07:23:47.137336 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d" Oct 12 07:23:52 crc kubenswrapper[4930]: I1012 07:23:52.890377 4930 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z82cn" Oct 12 07:23:52 crc kubenswrapper[4930]: I1012 07:23:52.984980 4930 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z82cn" Oct 12 07:23:53 crc kubenswrapper[4930]: I1012 07:23:53.147449 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z82cn"] Oct 12 07:23:54 crc kubenswrapper[4930]: I1012 07:23:54.909421 4930 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z82cn" podUID="9c6870b2-e7bf-4454-9b73-17a247473b19" containerName="registry-server" containerID="cri-o://b42aaceba3c37828ecf1088bf80abb0210cd73b93f600b36600940dfb916fc8d" gracePeriod=2 Oct 12 07:23:55 crc kubenswrapper[4930]: I1012 07:23:55.525422 4930 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z82cn" Oct 12 07:23:55 crc kubenswrapper[4930]: I1012 07:23:55.571290 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c6870b2-e7bf-4454-9b73-17a247473b19-catalog-content\") pod \"9c6870b2-e7bf-4454-9b73-17a247473b19\" (UID: \"9c6870b2-e7bf-4454-9b73-17a247473b19\") " Oct 12 07:23:55 crc kubenswrapper[4930]: I1012 07:23:55.571380 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c6870b2-e7bf-4454-9b73-17a247473b19-utilities\") pod \"9c6870b2-e7bf-4454-9b73-17a247473b19\" (UID: \"9c6870b2-e7bf-4454-9b73-17a247473b19\") " Oct 12 07:23:55 crc kubenswrapper[4930]: I1012 07:23:55.571619 4930 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkjwg\" (UniqueName: \"kubernetes.io/projected/9c6870b2-e7bf-4454-9b73-17a247473b19-kube-api-access-nkjwg\") pod \"9c6870b2-e7bf-4454-9b73-17a247473b19\" (UID: \"9c6870b2-e7bf-4454-9b73-17a247473b19\") " Oct 12 07:23:55 crc kubenswrapper[4930]: I1012 07:23:55.574112 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c6870b2-e7bf-4454-9b73-17a247473b19-utilities" (OuterVolumeSpecName: "utilities") pod "9c6870b2-e7bf-4454-9b73-17a247473b19" (UID: "9c6870b2-e7bf-4454-9b73-17a247473b19"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:23:55 crc kubenswrapper[4930]: I1012 07:23:55.578999 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c6870b2-e7bf-4454-9b73-17a247473b19-kube-api-access-nkjwg" (OuterVolumeSpecName: "kube-api-access-nkjwg") pod "9c6870b2-e7bf-4454-9b73-17a247473b19" (UID: "9c6870b2-e7bf-4454-9b73-17a247473b19"). InnerVolumeSpecName "kube-api-access-nkjwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 12 07:23:55 crc kubenswrapper[4930]: I1012 07:23:55.667890 4930 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c6870b2-e7bf-4454-9b73-17a247473b19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c6870b2-e7bf-4454-9b73-17a247473b19" (UID: "9c6870b2-e7bf-4454-9b73-17a247473b19"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 12 07:23:55 crc kubenswrapper[4930]: I1012 07:23:55.673999 4930 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkjwg\" (UniqueName: \"kubernetes.io/projected/9c6870b2-e7bf-4454-9b73-17a247473b19-kube-api-access-nkjwg\") on node \"crc\" DevicePath \"\"" Oct 12 07:23:55 crc kubenswrapper[4930]: I1012 07:23:55.674283 4930 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c6870b2-e7bf-4454-9b73-17a247473b19-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 12 07:23:55 crc kubenswrapper[4930]: I1012 07:23:55.674298 4930 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c6870b2-e7bf-4454-9b73-17a247473b19-utilities\") on node \"crc\" DevicePath \"\"" Oct 12 07:23:55 crc kubenswrapper[4930]: I1012 07:23:55.924890 4930 generic.go:334] "Generic (PLEG): container finished" podID="9c6870b2-e7bf-4454-9b73-17a247473b19" containerID="b42aaceba3c37828ecf1088bf80abb0210cd73b93f600b36600940dfb916fc8d" exitCode=0 Oct 12 07:23:55 crc kubenswrapper[4930]: I1012 07:23:55.924962 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z82cn" event={"ID":"9c6870b2-e7bf-4454-9b73-17a247473b19","Type":"ContainerDied","Data":"b42aaceba3c37828ecf1088bf80abb0210cd73b93f600b36600940dfb916fc8d"} Oct 12 07:23:55 crc kubenswrapper[4930]: I1012 07:23:55.925005 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z82cn" event={"ID":"9c6870b2-e7bf-4454-9b73-17a247473b19","Type":"ContainerDied","Data":"73c1ba2d56b0381d60944213627ee55ee20038b377db77477b5b506b9d3be002"} Oct 12 07:23:55 crc kubenswrapper[4930]: I1012 07:23:55.925034 4930 scope.go:117] "RemoveContainer" containerID="b42aaceba3c37828ecf1088bf80abb0210cd73b93f600b36600940dfb916fc8d" Oct 12 07:23:55 crc kubenswrapper[4930]: I1012 07:23:55.925235 4930 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z82cn"
Oct 12 07:23:55 crc kubenswrapper[4930]: I1012 07:23:55.962930 4930 scope.go:117] "RemoveContainer" containerID="a304904de228b6e63a396a0222c3718a760f7ca40828101947e5b06199b752ea"
Oct 12 07:23:55 crc kubenswrapper[4930]: I1012 07:23:55.998651 4930 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z82cn"]
Oct 12 07:23:56 crc kubenswrapper[4930]: I1012 07:23:56.000896 4930 scope.go:117] "RemoveContainer" containerID="48840cc9ba3d5cca35d0f099797e2c194e9e11506a69598611e8aa7f41abb327"
Oct 12 07:23:56 crc kubenswrapper[4930]: I1012 07:23:56.010865 4930 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z82cn"]
Oct 12 07:23:56 crc kubenswrapper[4930]: I1012 07:23:56.080757 4930 scope.go:117] "RemoveContainer" containerID="b42aaceba3c37828ecf1088bf80abb0210cd73b93f600b36600940dfb916fc8d"
Oct 12 07:23:56 crc kubenswrapper[4930]: E1012 07:23:56.081241 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b42aaceba3c37828ecf1088bf80abb0210cd73b93f600b36600940dfb916fc8d\": container with ID starting with b42aaceba3c37828ecf1088bf80abb0210cd73b93f600b36600940dfb916fc8d not found: ID does not exist" containerID="b42aaceba3c37828ecf1088bf80abb0210cd73b93f600b36600940dfb916fc8d"
Oct 12 07:23:56 crc kubenswrapper[4930]: I1012 07:23:56.081281 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b42aaceba3c37828ecf1088bf80abb0210cd73b93f600b36600940dfb916fc8d"} err="failed to get container status \"b42aaceba3c37828ecf1088bf80abb0210cd73b93f600b36600940dfb916fc8d\": rpc error: code = NotFound desc = could not find container \"b42aaceba3c37828ecf1088bf80abb0210cd73b93f600b36600940dfb916fc8d\": container with ID starting with b42aaceba3c37828ecf1088bf80abb0210cd73b93f600b36600940dfb916fc8d not found: ID does not exist"
Oct 12 07:23:56 crc kubenswrapper[4930]: I1012 07:23:56.081306 4930 scope.go:117] "RemoveContainer" containerID="a304904de228b6e63a396a0222c3718a760f7ca40828101947e5b06199b752ea"
Oct 12 07:23:56 crc kubenswrapper[4930]: E1012 07:23:56.081691 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a304904de228b6e63a396a0222c3718a760f7ca40828101947e5b06199b752ea\": container with ID starting with a304904de228b6e63a396a0222c3718a760f7ca40828101947e5b06199b752ea not found: ID does not exist" containerID="a304904de228b6e63a396a0222c3718a760f7ca40828101947e5b06199b752ea"
Oct 12 07:23:56 crc kubenswrapper[4930]: I1012 07:23:56.081718 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a304904de228b6e63a396a0222c3718a760f7ca40828101947e5b06199b752ea"} err="failed to get container status \"a304904de228b6e63a396a0222c3718a760f7ca40828101947e5b06199b752ea\": rpc error: code = NotFound desc = could not find container \"a304904de228b6e63a396a0222c3718a760f7ca40828101947e5b06199b752ea\": container with ID starting with a304904de228b6e63a396a0222c3718a760f7ca40828101947e5b06199b752ea not found: ID does not exist"
Oct 12 07:23:56 crc kubenswrapper[4930]: I1012 07:23:56.081733 4930 scope.go:117] "RemoveContainer" containerID="48840cc9ba3d5cca35d0f099797e2c194e9e11506a69598611e8aa7f41abb327"
Oct 12 07:23:56 crc kubenswrapper[4930]: E1012 07:23:56.082033 4930 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48840cc9ba3d5cca35d0f099797e2c194e9e11506a69598611e8aa7f41abb327\": container with ID starting with 48840cc9ba3d5cca35d0f099797e2c194e9e11506a69598611e8aa7f41abb327 not found: ID does not exist" containerID="48840cc9ba3d5cca35d0f099797e2c194e9e11506a69598611e8aa7f41abb327"
Oct 12 07:23:56 crc kubenswrapper[4930]: I1012 07:23:56.082059 4930 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48840cc9ba3d5cca35d0f099797e2c194e9e11506a69598611e8aa7f41abb327"} err="failed to get container status \"48840cc9ba3d5cca35d0f099797e2c194e9e11506a69598611e8aa7f41abb327\": rpc error: code = NotFound desc = could not find container \"48840cc9ba3d5cca35d0f099797e2c194e9e11506a69598611e8aa7f41abb327\": container with ID starting with 48840cc9ba3d5cca35d0f099797e2c194e9e11506a69598611e8aa7f41abb327 not found: ID does not exist"
Oct 12 07:23:56 crc kubenswrapper[4930]: I1012 07:23:56.149822 4930 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c6870b2-e7bf-4454-9b73-17a247473b19" path="/var/lib/kubelet/pods/9c6870b2-e7bf-4454-9b73-17a247473b19/volumes"
Oct 12 07:24:02 crc kubenswrapper[4930]: I1012 07:24:02.135426 4930 scope.go:117] "RemoveContainer" containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19"
Oct 12 07:24:02 crc kubenswrapper[4930]: E1012 07:24:02.136448 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 07:24:13 crc kubenswrapper[4930]: I1012 07:24:13.136360 4930 scope.go:117] "RemoveContainer" containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19"
Oct 12 07:24:13 crc kubenswrapper[4930]: E1012 07:24:13.137496 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 07:24:28 crc kubenswrapper[4930]: I1012 07:24:28.148128 4930 scope.go:117] "RemoveContainer" containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19"
Oct 12 07:24:28 crc kubenswrapper[4930]: E1012 07:24:28.150888 4930 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mk4tf_openshift-machine-config-operator(02f8684c-a3e4-44e8-9741-9f54488d8d8d)\"" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" podUID="02f8684c-a3e4-44e8-9741-9f54488d8d8d"
Oct 12 07:24:29 crc kubenswrapper[4930]: I1012 07:24:29.764512 4930 scope.go:117] "RemoveContainer" containerID="86400f583e17019934a161f6f6bae03e03330c39c42330f73af96b567b4c0ae4"
Oct 12 07:24:29 crc kubenswrapper[4930]: I1012 07:24:29.806592 4930 scope.go:117] "RemoveContainer" containerID="a3ad5a9a1fdb82c2670954ad2a2ecf61c2b7ba9dad0e4b2b14acc31d6be3cc66"
Oct 12 07:24:41 crc kubenswrapper[4930]: I1012 07:24:41.135237 4930 scope.go:117] "RemoveContainer" containerID="0d8022778327009e45bab53498f81e9a59c7470864854500a94947d41f09cf19"
Oct 12 07:24:41 crc kubenswrapper[4930]: I1012 07:24:41.553727 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mk4tf" event={"ID":"02f8684c-a3e4-44e8-9741-9f54488d8d8d","Type":"ContainerStarted","Data":"e0c7c88cf891842639a6a576649ad66bcedb9991ae8620faa0765084bc7880bd"}
Oct 12 07:25:26 crc kubenswrapper[4930]: I1012 07:25:26.014555 4930 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tb7fv"]
Oct 12 07:25:26 crc kubenswrapper[4930]: E1012 07:25:26.018004 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c6870b2-e7bf-4454-9b73-17a247473b19" containerName="registry-server"
Oct 12 07:25:26 crc kubenswrapper[4930]: I1012 07:25:26.018457 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c6870b2-e7bf-4454-9b73-17a247473b19" containerName="registry-server"
Oct 12 07:25:26 crc kubenswrapper[4930]: E1012 07:25:26.018663 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c6870b2-e7bf-4454-9b73-17a247473b19" containerName="extract-content"
Oct 12 07:25:26 crc kubenswrapper[4930]: I1012 07:25:26.018855 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c6870b2-e7bf-4454-9b73-17a247473b19" containerName="extract-content"
Oct 12 07:25:26 crc kubenswrapper[4930]: E1012 07:25:26.019036 4930 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c6870b2-e7bf-4454-9b73-17a247473b19" containerName="extract-utilities"
Oct 12 07:25:26 crc kubenswrapper[4930]: I1012 07:25:26.019155 4930 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c6870b2-e7bf-4454-9b73-17a247473b19" containerName="extract-utilities"
Oct 12 07:25:26 crc kubenswrapper[4930]: I1012 07:25:26.019754 4930 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c6870b2-e7bf-4454-9b73-17a247473b19" containerName="registry-server"
Oct 12 07:25:26 crc kubenswrapper[4930]: I1012 07:25:26.021537 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tb7fv"
Oct 12 07:25:26 crc kubenswrapper[4930]: I1012 07:25:26.042119 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tb7fv"]
Oct 12 07:25:26 crc kubenswrapper[4930]: I1012 07:25:26.157301 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0171a2b4-24d6-4ebe-89f9-9629157bdeff-catalog-content\") pod \"certified-operators-tb7fv\" (UID: \"0171a2b4-24d6-4ebe-89f9-9629157bdeff\") " pod="openshift-marketplace/certified-operators-tb7fv"
Oct 12 07:25:26 crc kubenswrapper[4930]: I1012 07:25:26.157365 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd6dq\" (UniqueName: \"kubernetes.io/projected/0171a2b4-24d6-4ebe-89f9-9629157bdeff-kube-api-access-jd6dq\") pod \"certified-operators-tb7fv\" (UID: \"0171a2b4-24d6-4ebe-89f9-9629157bdeff\") " pod="openshift-marketplace/certified-operators-tb7fv"
Oct 12 07:25:26 crc kubenswrapper[4930]: I1012 07:25:26.157430 4930 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0171a2b4-24d6-4ebe-89f9-9629157bdeff-utilities\") pod \"certified-operators-tb7fv\" (UID: \"0171a2b4-24d6-4ebe-89f9-9629157bdeff\") " pod="openshift-marketplace/certified-operators-tb7fv"
Oct 12 07:25:26 crc kubenswrapper[4930]: I1012 07:25:26.259871 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0171a2b4-24d6-4ebe-89f9-9629157bdeff-utilities\") pod \"certified-operators-tb7fv\" (UID: \"0171a2b4-24d6-4ebe-89f9-9629157bdeff\") " pod="openshift-marketplace/certified-operators-tb7fv"
Oct 12 07:25:26 crc kubenswrapper[4930]: I1012 07:25:26.260141 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0171a2b4-24d6-4ebe-89f9-9629157bdeff-catalog-content\") pod \"certified-operators-tb7fv\" (UID: \"0171a2b4-24d6-4ebe-89f9-9629157bdeff\") " pod="openshift-marketplace/certified-operators-tb7fv"
Oct 12 07:25:26 crc kubenswrapper[4930]: I1012 07:25:26.260244 4930 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd6dq\" (UniqueName: \"kubernetes.io/projected/0171a2b4-24d6-4ebe-89f9-9629157bdeff-kube-api-access-jd6dq\") pod \"certified-operators-tb7fv\" (UID: \"0171a2b4-24d6-4ebe-89f9-9629157bdeff\") " pod="openshift-marketplace/certified-operators-tb7fv"
Oct 12 07:25:26 crc kubenswrapper[4930]: I1012 07:25:26.260841 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0171a2b4-24d6-4ebe-89f9-9629157bdeff-catalog-content\") pod \"certified-operators-tb7fv\" (UID: \"0171a2b4-24d6-4ebe-89f9-9629157bdeff\") " pod="openshift-marketplace/certified-operators-tb7fv"
Oct 12 07:25:26 crc kubenswrapper[4930]: I1012 07:25:26.260891 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0171a2b4-24d6-4ebe-89f9-9629157bdeff-utilities\") pod \"certified-operators-tb7fv\" (UID: \"0171a2b4-24d6-4ebe-89f9-9629157bdeff\") " pod="openshift-marketplace/certified-operators-tb7fv"
Oct 12 07:25:26 crc kubenswrapper[4930]: I1012 07:25:26.281178 4930 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd6dq\" (UniqueName: \"kubernetes.io/projected/0171a2b4-24d6-4ebe-89f9-9629157bdeff-kube-api-access-jd6dq\") pod \"certified-operators-tb7fv\" (UID: \"0171a2b4-24d6-4ebe-89f9-9629157bdeff\") " pod="openshift-marketplace/certified-operators-tb7fv"
Oct 12 07:25:26 crc kubenswrapper[4930]: I1012 07:25:26.345271 4930 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tb7fv"
Oct 12 07:25:26 crc kubenswrapper[4930]: I1012 07:25:26.729540 4930 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tb7fv"]
Oct 12 07:25:27 crc kubenswrapper[4930]: I1012 07:25:27.139542 4930 generic.go:334] "Generic (PLEG): container finished" podID="0171a2b4-24d6-4ebe-89f9-9629157bdeff" containerID="0685d5af3c1b47038b53c65b66f7f6fad60fffc18203e44097afa0495a585a41" exitCode=0
Oct 12 07:25:27 crc kubenswrapper[4930]: I1012 07:25:27.139808 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tb7fv" event={"ID":"0171a2b4-24d6-4ebe-89f9-9629157bdeff","Type":"ContainerDied","Data":"0685d5af3c1b47038b53c65b66f7f6fad60fffc18203e44097afa0495a585a41"}
Oct 12 07:25:27 crc kubenswrapper[4930]: I1012 07:25:27.139828 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tb7fv" event={"ID":"0171a2b4-24d6-4ebe-89f9-9629157bdeff","Type":"ContainerStarted","Data":"1401c13efc25e826397ae614bbae2b1f7a052defa09c1d685225ee970e0800e5"}
Oct 12 07:25:29 crc kubenswrapper[4930]: I1012 07:25:29.163029 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tb7fv" event={"ID":"0171a2b4-24d6-4ebe-89f9-9629157bdeff","Type":"ContainerStarted","Data":"be72128136de4b764ba8256c267aa450b6b3d9e62cd2fbdb76a09b5725694550"}
Oct 12 07:25:30 crc kubenswrapper[4930]: I1012 07:25:30.176436 4930 generic.go:334] "Generic (PLEG): container finished" podID="0171a2b4-24d6-4ebe-89f9-9629157bdeff" containerID="be72128136de4b764ba8256c267aa450b6b3d9e62cd2fbdb76a09b5725694550" exitCode=0
Oct 12 07:25:30 crc kubenswrapper[4930]: I1012 07:25:30.176786 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tb7fv" event={"ID":"0171a2b4-24d6-4ebe-89f9-9629157bdeff","Type":"ContainerDied","Data":"be72128136de4b764ba8256c267aa450b6b3d9e62cd2fbdb76a09b5725694550"}
Oct 12 07:25:31 crc kubenswrapper[4930]: I1012 07:25:31.189946 4930 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tb7fv" event={"ID":"0171a2b4-24d6-4ebe-89f9-9629157bdeff","Type":"ContainerStarted","Data":"e620608637ccc14a98a0a21e131b99a1b379e82ba0ea44cf0d50eaadece4b15c"}
Oct 12 07:25:31 crc kubenswrapper[4930]: I1012 07:25:31.221114 4930 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tb7fv" podStartSLOduration=2.737950843 podStartE2EDuration="6.221090349s" podCreationTimestamp="2025-10-12 07:25:25 +0000 UTC" firstStartedPulling="2025-10-12 07:25:27.142465627 +0000 UTC m=+6259.684567392" lastFinishedPulling="2025-10-12 07:25:30.625605133 +0000 UTC m=+6263.167706898" observedRunningTime="2025-10-12 07:25:31.21910902 +0000 UTC m=+6263.761210825" watchObservedRunningTime="2025-10-12 07:25:31.221090349 +0000 UTC m=+6263.763192154"